Nov 22 08:22:05 crc systemd[1]: Starting Kubernetes Kubelet...
Nov 22 08:22:05 crc restorecon[4733]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Nov 22 08:22:05 crc restorecon[4733]: 
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 
08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:05 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc 
restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 08:22:06 crc restorecon[4733]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Nov 22 08:22:06 crc restorecon[4733]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Nov 22 08:22:06 crc kubenswrapper[4743]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 08:22:06 crc kubenswrapper[4743]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Nov 22 08:22:06 crc kubenswrapper[4743]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 08:22:06 crc kubenswrapper[4743]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Nov 22 08:22:06 crc kubenswrapper[4743]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Nov 22 08:22:06 crc kubenswrapper[4743]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.853296 4743 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859204 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859278 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859290 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859299 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859309 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859322 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859330 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859337 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859346 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859356 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859365 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859373 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859381 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859388 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859399 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859410 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859420 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859430 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859441 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859452 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859463 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859472 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859482 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859491 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859500 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859525 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859536 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859547 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859557 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859566 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859604 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859615 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859625 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859634 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859648 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859663 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859672 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859682 4743 feature_gate.go:330] unrecognized feature gate: Example Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859690 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859699 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859709 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859719 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859730 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859741 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859749 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859758 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859769 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859778 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859786 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859811 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859822 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859831 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859841 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859849 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859858 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859866 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859876 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859887 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859896 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859906 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859914 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859923 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859933 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859942 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859950 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859958 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859966 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859974 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859982 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859990 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.859997 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860138 4743 flags.go:64] FLAG: --address="0.0.0.0" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860156 4743 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860169 4743 flags.go:64] FLAG: --anonymous-auth="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860181 4743 flags.go:64] FLAG: --application-metrics-count-limit="100" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860192 4743 flags.go:64] FLAG: --authentication-token-webhook="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860201 4743 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860212 4743 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860231 4743 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860241 4743 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860250 4743 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860259 4743 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860268 4743 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860278 4743 flags.go:64] FLAG: --cgroup-driver="cgroupfs" 
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860287 4743 flags.go:64] FLAG: --cgroup-root="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860296 4743 flags.go:64] FLAG: --cgroups-per-qos="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860305 4743 flags.go:64] FLAG: --client-ca-file="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860314 4743 flags.go:64] FLAG: --cloud-config="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860323 4743 flags.go:64] FLAG: --cloud-provider="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860332 4743 flags.go:64] FLAG: --cluster-dns="[]" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860344 4743 flags.go:64] FLAG: --cluster-domain="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860352 4743 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860361 4743 flags.go:64] FLAG: --config-dir="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860371 4743 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860383 4743 flags.go:64] FLAG: --container-log-max-files="5" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860394 4743 flags.go:64] FLAG: --container-log-max-size="10Mi" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860404 4743 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860414 4743 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860426 4743 flags.go:64] FLAG: --containerd-namespace="k8s.io" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860438 4743 flags.go:64] FLAG: --contention-profiling="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860450 4743 flags.go:64] FLAG: --cpu-cfs-quota="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860461 4743 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860473 4743 flags.go:64] FLAG: --cpu-manager-policy="none" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860483 4743 flags.go:64] FLAG: --cpu-manager-policy-options="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860495 4743 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860504 4743 flags.go:64] FLAG: --enable-controller-attach-detach="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860513 4743 flags.go:64] FLAG: --enable-debugging-handlers="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860522 4743 flags.go:64] FLAG: --enable-load-reader="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860531 4743 flags.go:64] FLAG: --enable-server="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860541 4743 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860552 4743 flags.go:64] FLAG: --event-burst="100" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860562 4743 flags.go:64] FLAG: --event-qps="50" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860570 4743 flags.go:64] FLAG: --event-storage-age-limit="default=0" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860610 4743 flags.go:64] FLAG: --event-storage-event-limit="default=0" Nov 22 08:22:06 crc 
kubenswrapper[4743]: I1122 08:22:06.860621 4743 flags.go:64] FLAG: --eviction-hard="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860632 4743 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860641 4743 flags.go:64] FLAG: --eviction-minimum-reclaim="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860649 4743 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860658 4743 flags.go:64] FLAG: --eviction-soft="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860667 4743 flags.go:64] FLAG: --eviction-soft-grace-period="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860676 4743 flags.go:64] FLAG: --exit-on-lock-contention="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860685 4743 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860695 4743 flags.go:64] FLAG: --experimental-mounter-path="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860704 4743 flags.go:64] FLAG: --fail-cgroupv1="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860712 4743 flags.go:64] FLAG: --fail-swap-on="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860722 4743 flags.go:64] FLAG: --feature-gates="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860733 4743 flags.go:64] FLAG: --file-check-frequency="20s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860742 4743 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860751 4743 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860760 4743 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860770 4743 flags.go:64] FLAG: --healthz-port="10248" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860778 4743 flags.go:64] FLAG: --help="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860787 4743 flags.go:64] FLAG: --hostname-override="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860796 4743 flags.go:64] FLAG: --housekeeping-interval="10s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860805 4743 flags.go:64] FLAG: --http-check-frequency="20s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860814 4743 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860824 4743 flags.go:64] FLAG: --image-credential-provider-config="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860832 4743 flags.go:64] FLAG: --image-gc-high-threshold="85" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860844 4743 flags.go:64] FLAG: --image-gc-low-threshold="80" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860859 4743 flags.go:64] FLAG: --image-service-endpoint="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860882 4743 flags.go:64] FLAG: --kernel-memcg-notification="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860895 4743 flags.go:64] FLAG: --kube-api-burst="100" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860907 4743 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860918 4743 flags.go:64] FLAG: --kube-api-qps="50" Nov 22 08:22:06 crc 
kubenswrapper[4743]: I1122 08:22:06.860927 4743 flags.go:64] FLAG: --kube-reserved="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860936 4743 flags.go:64] FLAG: --kube-reserved-cgroup="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860945 4743 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860954 4743 flags.go:64] FLAG: --kubelet-cgroups="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860963 4743 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860972 4743 flags.go:64] FLAG: --lock-file="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860985 4743 flags.go:64] FLAG: --log-cadvisor-usage="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.860994 4743 flags.go:64] FLAG: --log-flush-frequency="5s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861004 4743 flags.go:64] FLAG: --log-json-info-buffer-size="0" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861017 4743 flags.go:64] FLAG: --log-json-split-stream="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861026 4743 flags.go:64] FLAG: --log-text-info-buffer-size="0" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861035 4743 flags.go:64] FLAG: --log-text-split-stream="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861044 4743 flags.go:64] FLAG: --logging-format="text" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861055 4743 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861065 4743 flags.go:64] FLAG: --make-iptables-util-chains="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861073 4743 flags.go:64] FLAG: --manifest-url="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861082 4743 flags.go:64] FLAG: --manifest-url-header="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861094 4743 flags.go:64] FLAG: --max-housekeeping-interval="15s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861104 4743 flags.go:64] FLAG: --max-open-files="1000000" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861124 4743 flags.go:64] FLAG: --max-pods="110" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861137 4743 flags.go:64] FLAG: --maximum-dead-containers="-1" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861147 4743 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861156 4743 flags.go:64] FLAG: --memory-manager-policy="None" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861165 4743 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861175 4743 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861183 4743 flags.go:64] FLAG: --node-ip="192.168.126.11" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861192 4743 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861212 4743 flags.go:64] FLAG: --node-status-max-images="50" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861221 4743 flags.go:64] FLAG: --node-status-update-frequency="10s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861231 4743 
flags.go:64] FLAG: --oom-score-adj="-999" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861240 4743 flags.go:64] FLAG: --pod-cidr="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861274 4743 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861292 4743 flags.go:64] FLAG: --pod-manifest-path="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861302 4743 flags.go:64] FLAG: --pod-max-pids="-1" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861312 4743 flags.go:64] FLAG: --pods-per-core="0" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861321 4743 flags.go:64] FLAG: --port="10250" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861330 4743 flags.go:64] FLAG: --protect-kernel-defaults="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861339 4743 flags.go:64] FLAG: --provider-id="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861348 4743 flags.go:64] FLAG: --qos-reserved="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861357 4743 flags.go:64] FLAG: --read-only-port="10255" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861366 4743 flags.go:64] FLAG: --register-node="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861375 4743 flags.go:64] FLAG: --register-schedulable="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861385 4743 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861400 4743 flags.go:64] FLAG: --registry-burst="10" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861409 4743 flags.go:64] FLAG: --registry-qps="5" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861420 4743 flags.go:64] FLAG: --reserved-cpus="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861433 4743 flags.go:64] FLAG: --reserved-memory="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861461 4743 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861475 4743 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861486 4743 flags.go:64] FLAG: --rotate-certificates="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861497 4743 flags.go:64] FLAG: --rotate-server-certificates="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861508 4743 flags.go:64] FLAG: --runonce="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861519 4743 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861531 4743 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861544 4743 flags.go:64] FLAG: --seccomp-default="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861556 4743 flags.go:64] FLAG: --serialize-image-pulls="true" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861565 4743 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861606 4743 flags.go:64] FLAG: --storage-driver-db="cadvisor" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861615 4743 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 
08:22:06.861625 4743 flags.go:64] FLAG: --storage-driver-password="root" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861634 4743 flags.go:64] FLAG: --storage-driver-secure="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861643 4743 flags.go:64] FLAG: --storage-driver-table="stats" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861652 4743 flags.go:64] FLAG: --storage-driver-user="root" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861661 4743 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861671 4743 flags.go:64] FLAG: --sync-frequency="1m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861680 4743 flags.go:64] FLAG: --system-cgroups="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861690 4743 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861705 4743 flags.go:64] FLAG: --system-reserved-cgroup="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861714 4743 flags.go:64] FLAG: --tls-cert-file="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861723 4743 flags.go:64] FLAG: --tls-cipher-suites="[]" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861737 4743 flags.go:64] FLAG: --tls-min-version="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861746 4743 flags.go:64] FLAG: --tls-private-key-file="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861755 4743 flags.go:64] FLAG: --topology-manager-policy="none" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861764 4743 flags.go:64] FLAG: --topology-manager-policy-options="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861776 4743 flags.go:64] FLAG: --topology-manager-scope="container" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861786 4743 flags.go:64] FLAG: --v="2" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861810 4743 flags.go:64] FLAG: --version="false" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861822 4743 flags.go:64] FLAG: --vmodule="" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861837 4743 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.861854 4743 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862102 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862114 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862123 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862131 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862139 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862148 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862158 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
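[Annotation] The flags.go:64 FLAG lines above record every flag value after parsing, defaults included, but they predate the merge with the --config file, so they are not the effective configuration: note, for example, that FLAG shows --cgroup-driver="cgroupfs" while the kubelet later reports using the systemd cgroup driver received from the CRI runtime. On a live node with API access, one way to inspect the merged result is the kubelet's configz endpoint, e.g. kubectl get --raw "/api/v1/nodes/<node-name>/proxy/configz" (where <node-name> is a placeholder for this node's name).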
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862169 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862178 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862188 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862197 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862205 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862213 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862220 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862227 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862236 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862243 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862251 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862259 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862267 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862274 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862282 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862289 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862297 4743 feature_gate.go:330] unrecognized feature gate: Example Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862307 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862316 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862327 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862334 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862347 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862355 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862363 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862371 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862378 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862385 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862397 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862405 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862413 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862423 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862432 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862454 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862467 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862478 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862489 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862500 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862508 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862517 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862528 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862538 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862548 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862558 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862567 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862607 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862619 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862628 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862636 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862643 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862652 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862660 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862667 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862675 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862689 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862696 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862704 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862712 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862719 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862727 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862734 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862742 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862750 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862757 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.862768 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.864247 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true 
DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.879334 4743 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.879391 4743 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879519 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879533 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879542 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879551 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879559 4743 feature_gate.go:330] unrecognized feature gate: Example Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879567 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879599 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879609 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879618 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879626 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879635 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879642 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879654 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
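[Annotation] The long runs of "unrecognized feature gate" warnings appear to be expected on OpenShift: the cluster-level gate list (GatewayAPI, NewOLM, and so on) is handed to the embedded kubelet wholesale, and the upstream kubelet code only knows the Kubernetes gates, which is why the effective map printed at feature_gate.go:386 is much shorter than the warning list. For an upstream kubelet, the surviving gates would be set in the config file; a minimal sketch using two names from the effective map logged above:

    # Sketch: kubelet-recognized gates taken from the effective map in this log.
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      KMSv1: true      # deprecated upstream; logged above with a removal warning
      NodeSwap: false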
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879667 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879676 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879684 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879694 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879702 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879711 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879719 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879728 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879737 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879745 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879754 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879762 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879770 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879778 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879786 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879793 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879801 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879808 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879816 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879824 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879832 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879843 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879854 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879863 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879871 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879882 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879893 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879902 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879913 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879922 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879931 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879940 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879948 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879959 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879968 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879976 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879986 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.879994 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880003 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880011 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880020 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880028 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880036 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880044 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880053 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880060 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880069 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880076 4743 feature_gate.go:330] 
unrecognized feature gate: VSphereDriverConfiguration Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880085 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880093 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880100 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880109 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880116 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880124 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880132 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880139 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880147 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880159 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.880173 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880392 4743 feature_gate.go:330] unrecognized feature gate: NewOLM Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880406 4743 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880416 4743 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880424 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880433 4743 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880442 4743 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880451 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880459 4743 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880468 4743 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880477 4743 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880485 4743 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Nov 22 08:22:06 crc 
kubenswrapper[4743]: W1122 08:22:06.880494 4743 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880502 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880511 4743 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880519 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880527 4743 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880534 4743 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880542 4743 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880552 4743 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880562 4743 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880570 4743 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880601 4743 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880609 4743 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880617 4743 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880625 4743 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880633 4743 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880640 4743 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880648 4743 feature_gate.go:330] unrecognized feature gate: PlatformOperators Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880658 4743 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880667 4743 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880675 4743 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880683 4743 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880691 4743 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880698 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880707 4743 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880715 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880723 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880730 4743 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880737 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880745 4743 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880752 4743 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880760 4743 feature_gate.go:330] unrecognized feature gate: SignatureStores
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880768 4743 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880775 4743 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880782 4743 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880790 4743 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880798 4743 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880806 4743 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880813 4743 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880820 4743 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880830 4743 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880841 4743 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880851 4743 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880859 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880868 4743 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880877 4743 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880886 4743 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880895 4743 feature_gate.go:330] unrecognized feature gate: Example
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880903 4743 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880911 4743 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880920 4743 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880927 4743 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880938 4743 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880946 4743 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880955 4743 feature_gate.go:330] unrecognized feature gate: PinnedImages
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880962 4743 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880970 4743 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880978 4743 feature_gate.go:330] unrecognized feature gate: OVNObservability
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880986 4743 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.880994 4743 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Nov 22 08:22:06 crc kubenswrapper[4743]: W1122 08:22:06.881003 4743 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.881014 4743 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.882424 4743 server.go:940] "Client rotation is on, will bootstrap in background"
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.889429 4743 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.889609 4743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.891691 4743 server.go:997] "Starting client certificate rotation"
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.891745 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.892093 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-13 06:30:43.521260373 +0000 UTC
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.892248 4743 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1246h8m36.629017534s for next certificate rotation
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.928003 4743 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.933630 4743 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 22 08:22:06 crc kubenswrapper[4743]: I1122 08:22:06.964252 4743 log.go:25] "Validated CRI v1 runtime API"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.006223 4743 log.go:25] "Validated CRI v1 image API"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.009723 4743 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.021301 4743 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-11-22-08-17-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.021368 4743 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.043128 4743 manager.go:217] Machine: {Timestamp:2025-11-22 08:22:07.039674904 +0000 UTC m=+0.746035996 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:b3ab2120-2923-4414-bbef-16ed8728100f BootID:3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b5:4e:5f Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b5:4e:5f Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:5a:7c:62 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4d:d4:af Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:bc:e0:ab Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:5b:76:c1 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:8d:1c:d1 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:26:00:62:c0:24:8e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3e:90:10:73:f1:51 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.043467 4743 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.043724 4743 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.045188 4743 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.045475 4743 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.045518 4743 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.045823 4743 topology_manager.go:138] "Creating topology manager with none policy"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.045840 4743 container_manager_linux.go:303] "Creating device plugin manager"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.046736 4743 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.046780 4743 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.047130 4743 state_mem.go:36] "Initialized new in-memory state store"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.047263 4743 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.053622 4743 kubelet.go:418] "Attempting to sync node with API server"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.053660 4743 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.053687 4743 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.053703 4743 kubelet.go:324] "Adding apiserver pod source"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.053717 4743 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.061116 4743 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.062861 4743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Nov 22 08:22:07 crc kubenswrapper[4743]: W1122 08:22:07.064757 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused
Nov 22 08:22:07 crc kubenswrapper[4743]: W1122 08:22:07.064766 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused
Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.065615 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError"
Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.065614 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.066660 4743 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068242 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068272 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068281 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068290 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068304 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068314 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068323 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068337 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068349 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068360 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068374 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.068383 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.072112 4743 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.073437 4743 server.go:1280] "Started kubelet"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.074539 4743 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.074552 4743 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.075028 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.075612 4743 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Nov 22 08:22:07 crc systemd[1]: Started Kubernetes Kubelet.
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.079538 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.079646 4743 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.086235 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 03:23:44.64494291 +0000 UTC
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.086301 4743 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 595h1m37.55864446s for next certificate rotation
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.086465 4743 volume_manager.go:287] "The desired_state_of_world populator starts"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.086497 4743 volume_manager.go:289] "Starting Kubelet Volume Manager"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.086612 4743 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.086749 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.087522 4743 server.go:460] "Adding debug handlers to kubelet server"
Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.088132 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.245:6443: connect: connection refused" interval="200ms"
Nov 22 08:22:07 crc kubenswrapper[4743]: W1122 08:22:07.088247 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused
Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.088334 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.088485 4743 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.088546 4743 factory.go:55] Registering systemd factory
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.088571 4743 factory.go:221] Registration of the systemd container factory successfully
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.089230 4743 factory.go:153] Registering CRI-O factory
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.089315 4743 factory.go:221] Registration of the crio container factory successfully
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.089397 4743 factory.go:103] Registering Raw factory
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.089473 4743 manager.go:1196] Started watching for new ooms in manager
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.090820 4743 manager.go:319] Starting recovery of all containers
Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.089045 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.245:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a4681bb23b1fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 08:22:07.073374716 +0000 UTC m=+0.779735808,LastTimestamp:2025-11-22 08:22:07.073374716 +0000 UTC m=+0.779735808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097530 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097638 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097657 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097675 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097688 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097703 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097715 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097730 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097747 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097760 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097773 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097787 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097800 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097815 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097827 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097841 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097855 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097871 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097883 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097897 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097910 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097922 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097934 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097947 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097959 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.097995 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098012 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098025 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098038 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098050 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098064 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098077 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098091 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098103 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098116 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098130 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098145 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098160 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098172 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098185 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098197 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098210 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098222 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098236 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098250 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098262 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098274 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098287 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098306 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098321 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098333 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098348 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098366 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098382 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098397 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098411 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098426 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.098442 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.106825 4743 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.106898 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.106921 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.106945 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.106961 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.106980 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.106992 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107005 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107021 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107033 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107049 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107061 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107072 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107088 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107101 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107113 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107130 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107146 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107163 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107177 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107190 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107204 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107220 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107235 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107248 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107262 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107276 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.107289 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.108767 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.108851 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.108902 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.108926 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.108975 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.108995 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109013 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109037 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109057 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109072 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109098 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109117 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109139 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109157 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109174 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109197 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109257 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109293 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109326 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109374 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109409 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109434 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109462 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109508 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109533 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109564 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109609 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109634 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109662 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109683 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109724 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109745 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109775 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109806 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109827 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109853 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109875 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109905 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109928 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109948 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109972 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.109995 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.110016 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.110041 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.110061 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.111803 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.111882 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112014 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112082 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112163 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112246 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112310 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112411 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112532 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112625 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029"
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112699 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112766 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112837 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112895 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.112956 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113019 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113078 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113258 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113356 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113458 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113526 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113678 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113743 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113813 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113880 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.113937 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114003 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114062 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114121 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114180 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114239 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114323 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114414 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114480 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114552 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114636 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114714 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114782 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114859 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114920 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114978 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115049 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.114443 4743 manager.go:324] Recovery completed Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115111 4743 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115244 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115321 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115417 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115489 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115560 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115655 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115717 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115838 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.115927 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116052 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116123 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116187 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116249 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116326 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116430 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116500 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116560 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116662 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116732 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116796 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116868 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116934 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.116994 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.117052 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.117117 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.117427 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.117550 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.117634 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.117696 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.117767 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.117830 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.117897 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.117963 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.118049 4743 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.118103 4743 reconstruct.go:97] "Volume reconstruction finished" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.118156 4743 reconciler.go:26] "Reconciler: start to sync state" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.126245 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.127942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.127982 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.128334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.129374 4743 cpu_manager.go:225] "Starting CPU manager" policy="none" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.129394 4743 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.129416 4743 state_mem.go:36] "Initialized new in-memory state store" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.144716 4743 policy_none.go:49] "None policy: Start" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.145823 4743 memory_manager.go:170] "Starting memorymanager" policy="None" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.145854 4743 state_mem.go:35] "Initializing new in-memory state store" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.147521 4743 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.149700 4743 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.150241 4743 status_manager.go:217] "Starting to sync pod status with apiserver" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.150292 4743 kubelet.go:2335] "Starting kubelet main sync loop" Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.150354 4743 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Nov 22 08:22:07 crc kubenswrapper[4743]: W1122 08:22:07.150910 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.150980 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError" Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.187368 4743 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.229024 4743 manager.go:334] "Starting Device Plugin manager" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.229091 4743 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.229109 4743 server.go:79] "Starting device plugin registration server" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.229649 4743 eviction_manager.go:189] "Eviction manager: starting control loop" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.229667 4743 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.229903 4743 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.229986 4743 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.229996 4743 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.237025 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.251312 4743 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.251429 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.252647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.252693 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.252705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.252833 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.253222 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.253311 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.253647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.253681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.253692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.253797 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.253924 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.253962 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.254744 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.254764 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.254774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.254867 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255190 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255202 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255263 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255295 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255302 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.255998 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.256089 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.256243 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.256299 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.256821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.256850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.256863 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.257074 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.257114 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.257250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.257292 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.257307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.257909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.257946 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.257959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.258147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.258174 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.258184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.288992 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.245:6443: connect: connection refused" interval="400ms" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.319747 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.319814 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.319845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.319869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.319892 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.319914 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.320002 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.320068 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.320106 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.320182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.320221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.320242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.320281 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.320331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.320381 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.329996 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.331421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.331473 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.331493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.331537 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.332089 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.245:6443: connect: connection refused" node="crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421452 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421517 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421548 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421568 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421650 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421669 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421690 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421727 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421744 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421762 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421781 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.421800 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422259 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422313 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422380 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422400 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422423 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422476 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422508 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422541 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422594 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422630 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422661 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422692 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.422723 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.532781 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.534372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.534397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.534426 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.534449 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.534925 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.245:6443: connect: connection refused" node="crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.581723 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.593706 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.619743 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.643135 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: W1122 08:22:07.643766 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-89da42ce549efa6f318a331a72b180cecd72f290d0c86acbf7956603a237b83b WatchSource:0}: Error finding container 89da42ce549efa6f318a331a72b180cecd72f290d0c86acbf7956603a237b83b: Status 404 returned error can't find the container with id 89da42ce549efa6f318a331a72b180cecd72f290d0c86acbf7956603a237b83b Nov 22 08:22:07 crc kubenswrapper[4743]: W1122 08:22:07.644218 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-abb03b7209dfa3bd868322795c9c2d058897fd89b427c3465008d20fb0737e20 WatchSource:0}: Error finding container abb03b7209dfa3bd868322795c9c2d058897fd89b427c3465008d20fb0737e20: Status 404 returned error can't find the container with id abb03b7209dfa3bd868322795c9c2d058897fd89b427c3465008d20fb0737e20 Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.648873 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 08:22:07 crc kubenswrapper[4743]: W1122 08:22:07.649476 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-e497b85d61475a16aa103e68d0c09000996b660a3c0393064c1a30441be71d38 WatchSource:0}: Error finding container e497b85d61475a16aa103e68d0c09000996b660a3c0393064c1a30441be71d38: Status 404 returned error can't find the container with id e497b85d61475a16aa103e68d0c09000996b660a3c0393064c1a30441be71d38 Nov 22 08:22:07 crc kubenswrapper[4743]: W1122 08:22:07.656808 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-c158a8b9922cd27b771d73606697101927908fe374cd1f575cb01e674a1229d8 WatchSource:0}: Error finding container c158a8b9922cd27b771d73606697101927908fe374cd1f575cb01e674a1229d8: Status 404 returned error can't find the container with id c158a8b9922cd27b771d73606697101927908fe374cd1f575cb01e674a1229d8 Nov 22 08:22:07 crc kubenswrapper[4743]: W1122 08:22:07.666251 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-b16879e944b60f342d4be57ddfd459dac752d7b023f7aa6271027be7e51b3c73 WatchSource:0}: Error finding container b16879e944b60f342d4be57ddfd459dac752d7b023f7aa6271027be7e51b3c73: Status 404 returned error can't find the container with id b16879e944b60f342d4be57ddfd459dac752d7b023f7aa6271027be7e51b3c73 Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.690839 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.245:6443: connect: connection refused" interval="800ms" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.935953 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.938087 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.938125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.938137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:07 crc kubenswrapper[4743]: I1122 08:22:07.938164 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 08:22:07 crc kubenswrapper[4743]: E1122 08:22:07.938782 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.245:6443: connect: connection refused" node="crc" Nov 22 08:22:08 crc kubenswrapper[4743]: W1122 08:22:08.028772 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused Nov 22 08:22:08 crc kubenswrapper[4743]: E1122 08:22:08.028874 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError" Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.076507 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused Nov 22 08:22:08 crc kubenswrapper[4743]: W1122 08:22:08.082052 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused Nov 22 08:22:08 crc kubenswrapper[4743]: E1122 08:22:08.082175 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError" Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.154789 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"89da42ce549efa6f318a331a72b180cecd72f290d0c86acbf7956603a237b83b"} Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.155972 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b16879e944b60f342d4be57ddfd459dac752d7b023f7aa6271027be7e51b3c73"} Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.156900 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c158a8b9922cd27b771d73606697101927908fe374cd1f575cb01e674a1229d8"} Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.158225 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e497b85d61475a16aa103e68d0c09000996b660a3c0393064c1a30441be71d38"} Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.159163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"abb03b7209dfa3bd868322795c9c2d058897fd89b427c3465008d20fb0737e20"} Nov 22 08:22:08 crc kubenswrapper[4743]: W1122 08:22:08.349746 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused Nov 22 08:22:08 crc kubenswrapper[4743]: E1122 08:22:08.349838 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError" Nov 22 08:22:08 crc kubenswrapper[4743]: E1122 08:22:08.492102 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.245:6443: connect: connection refused" interval="1.6s" Nov 22 08:22:08 crc kubenswrapper[4743]: W1122 08:22:08.572731 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused Nov 22 08:22:08 crc kubenswrapper[4743]: E1122 08:22:08.572870 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError" Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.739393 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.742305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.742336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.742345 4743 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:08 crc kubenswrapper[4743]: I1122 08:22:08.742364 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 08:22:08 crc kubenswrapper[4743]: E1122 08:22:08.742659 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.245:6443: connect: connection refused" node="crc" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.076693 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.164373 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105" exitCode=0 Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.164432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105"} Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.164496 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.166224 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.166265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.166279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.167388 4743 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad" exitCode=0 Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.167444 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad"} Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.167472 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.168518 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.168642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.168687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.168707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.170363 4743 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.170429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.170437 4743 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16" exitCode=0 Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.170452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.170482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16"} Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.170557 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.172236 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.172274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.172287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.174872 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1"} Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.174907 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6"} Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.179295 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168" exitCode=0 Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.179331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168"} Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.179460 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.180691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.180722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:09 crc kubenswrapper[4743]: I1122 08:22:09.180732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Nov 22 08:22:09 crc kubenswrapper[4743]: E1122 08:22:09.315956 4743 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.245:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187a4681bb23b1fc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-11-22 08:22:07.073374716 +0000 UTC m=+0.779735808,LastTimestamp:2025-11-22 08:22:07.073374716 +0000 UTC m=+0.779735808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.076358 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused
Nov 22 08:22:10 crc kubenswrapper[4743]: E1122 08:22:10.093715 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.245:6443: connect: connection refused" interval="3.2s"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.185340 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc" exitCode=0
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.185400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc"}
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.185509 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.186318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.186348 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.186359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.191055 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7"}
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.191088 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9"}
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.191106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43"}
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.191117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222"}
Nov 22 08:22:10 crc kubenswrapper[4743]: W1122 08:22:10.193692 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused
Nov 22 08:22:10 crc kubenswrapper[4743]: E1122 08:22:10.193750 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.195190 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e4f89923f4bf5b5ce2ce85146d7c472421f1dbc5b8d20103bfd00de9599c2c17"}
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.195225 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.196287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.196328 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.196339 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.200208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793"}
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.200305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351"}
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.200322 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f"}
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.200618 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.204120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.204182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.204329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.213533 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba"}
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.213593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21"}
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.213642 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.216097 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.216136 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.216149 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.343403 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.344766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.344804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.344814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.344842 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 22 08:22:10 crc kubenswrapper[4743]: E1122 08:22:10.345360 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.245:6443: connect: connection refused" node="crc"
Nov 22 08:22:10 crc kubenswrapper[4743]: I1122 08:22:10.859219 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Nov 22 08:22:11 crc kubenswrapper[4743]: W1122 08:22:11.060300 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused
Nov 22 08:22:11 crc kubenswrapper[4743]: E1122 08:22:11.060406 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.075710 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused
Nov 22 08:22:11 crc kubenswrapper[4743]: W1122 08:22:11.177046 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.245:6443: connect: connection refused
Nov 22 08:22:11 crc kubenswrapper[4743]: E1122 08:22:11.177149 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.245:6443: connect: connection refused" logger="UnhandledError"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.221152 4743 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251" exitCode=0
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.221234 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251"}
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.221310 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.223012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.223069 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.223088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.228041 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.228248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ad7d06a741a8b8801f1773c7ae1614edba076eedb2c55a00263f4d6cf6d78379"}
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.228391 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.228527 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.228531 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.228942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.228966 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.228980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.230270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.230330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.230283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.230386 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.230403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.230358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.230880 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.230933 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.230955 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:11 crc kubenswrapper[4743]: I1122 08:22:11.956290 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.235904 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282"}
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.235952 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44"}
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.235964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165"}
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.235976 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91"}
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.238121 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.240023 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ad7d06a741a8b8801f1773c7ae1614edba076eedb2c55a00263f4d6cf6d78379" exitCode=255
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.240127 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.240185 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.240103 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ad7d06a741a8b8801f1773c7ae1614edba076eedb2c55a00263f4d6cf6d78379"}
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.241548 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.241598 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.241614 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.242053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.242099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.242118 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.246846 4743 scope.go:117] "RemoveContainer" containerID="ad7d06a741a8b8801f1773c7ae1614edba076eedb2c55a00263f4d6cf6d78379"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.531327 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.531554 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.532951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.532989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:12 crc kubenswrapper[4743]: I1122 08:22:12.533000 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.005612 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.013917 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.250549 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525"}
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.250897 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.252927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.252976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.252988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.254001 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.256026 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.256041 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec"}
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.256130 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.256239 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.256321 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.257218 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.257253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.257265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.257330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.257371 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.257386 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.432318 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.545828 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.547181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.547227 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.547248 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.547283 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Nov 22 08:22:13 crc kubenswrapper[4743]: I1122 08:22:13.840249 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.258508 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.258617 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.258628 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.258722 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.260384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.260423 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.260435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.260687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.260750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.260774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.260951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.261004 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:14 crc kubenswrapper[4743]: I1122 08:22:14.261022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.166443 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.261187 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.261225 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.261267 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.262684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.262711 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.262721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.262753 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.262689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.262791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.262809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.262807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:15 crc kubenswrapper[4743]: I1122 08:22:15.262865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:16 crc kubenswrapper[4743]: I1122 08:22:16.263993 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:16 crc kubenswrapper[4743]: I1122 08:22:16.264971 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:16 crc kubenswrapper[4743]: I1122 08:22:16.265013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:16 crc kubenswrapper[4743]: I1122 08:22:16.265022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:16 crc kubenswrapper[4743]: I1122 08:22:16.840143 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 22 08:22:16 crc kubenswrapper[4743]: I1122 08:22:16.840240 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Nov 22 08:22:17 crc kubenswrapper[4743]: E1122 08:22:17.237238 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 22 08:22:17 crc kubenswrapper[4743]: I1122 08:22:17.458776 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Nov 22 08:22:17 crc kubenswrapper[4743]: I1122 08:22:17.459117 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:17 crc kubenswrapper[4743]: I1122 08:22:17.461234 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:17 crc kubenswrapper[4743]: I1122 08:22:17.461285 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:17 crc kubenswrapper[4743]: I1122 08:22:17.461297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:19 crc kubenswrapper[4743]: I1122 08:22:19.130450 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Nov 22 08:22:19 crc kubenswrapper[4743]: I1122 08:22:19.130622 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:19 crc kubenswrapper[4743]: I1122 08:22:19.132003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:19 crc kubenswrapper[4743]: I1122 08:22:19.132054 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:19 crc kubenswrapper[4743]: I1122 08:22:19.132066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:21 crc kubenswrapper[4743]: W1122 08:22:21.750560 4743 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Nov 22 08:22:21 crc kubenswrapper[4743]: I1122 08:22:21.750676 4743 trace.go:236] Trace[73094093]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 08:22:11.749) (total time: 10001ms):
Nov 22 08:22:21 crc kubenswrapper[4743]: Trace[73094093]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (08:22:21.750)
Nov 22 08:22:21 crc kubenswrapper[4743]: Trace[73094093]: [10.001445102s] [10.001445102s] END
Nov 22 08:22:21 crc kubenswrapper[4743]: E1122 08:22:21.750698 4743 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Nov 22 08:22:21 crc kubenswrapper[4743]: I1122 08:22:21.957083 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 22 08:22:21 crc kubenswrapper[4743]: I1122 08:22:21.957156 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Nov 22 08:22:22 crc kubenswrapper[4743]: I1122 08:22:22.078021 4743 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Nov 22 08:22:22 crc kubenswrapper[4743]: I1122 08:22:22.636567 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Nov 22 08:22:22 crc kubenswrapper[4743]: I1122 08:22:22.636660 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.161910 4743 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.841397 4743 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.841774 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.965119 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.965514 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.966224 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.966315 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.967844 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.967918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.967931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:26 crc kubenswrapper[4743]: I1122 08:22:26.972866 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Nov 22 08:22:27 crc kubenswrapper[4743]: E1122 08:22:27.238140 4743 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.293609 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.293950 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.294004 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.295147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.295187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.295198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.492840 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.493159 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.496255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.496307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.496321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.510805 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Nov 22 08:22:27 crc kubenswrapper[4743]: E1122 08:22:27.636781 4743 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.640793 4743 trace.go:236] Trace[1874842281]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 08:22:14.412) (total time: 13228ms):
Nov 22 08:22:27 crc kubenswrapper[4743]: Trace[1874842281]: ---"Objects listed" error: 13227ms (08:22:27.640)
Nov 22 08:22:27 crc kubenswrapper[4743]: Trace[1874842281]: [13.228031858s] [13.228031858s] END
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.640837 4743 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.641160 4743 trace.go:236] Trace[666460610]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 08:22:15.992) (total time: 11649ms):
Nov 22 08:22:27 crc kubenswrapper[4743]: Trace[666460610]: ---"Objects listed" error: 11649ms (08:22:27.641)
Nov 22 08:22:27 crc kubenswrapper[4743]: Trace[666460610]: [11.649039669s] [11.649039669s] END
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.641190 4743 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Nov 22 08:22:27 crc kubenswrapper[4743]: E1122 08:22:27.646002 4743 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.647446 4743 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.649500 4743 trace.go:236] Trace[1590862511]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (22-Nov-2025 08:22:16.283) (total time: 11366ms):
Nov 22 08:22:27 crc kubenswrapper[4743]: Trace[1590862511]: ---"Objects listed" error: 11366ms (08:22:27.649)
Nov 22 08:22:27 crc kubenswrapper[4743]: Trace[1590862511]: [11.366247155s] [11.366247155s] END
Nov 22 08:22:27 crc kubenswrapper[4743]: I1122 08:22:27.649525 4743 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.064865 4743 apiserver.go:52] "Watching apiserver"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.068148 4743 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.068445 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.068782 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.069400 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.069454 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.069510 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.069567 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.069635 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.069722 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.070117 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.070164 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.072270 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.072447 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.072557 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.072727 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.072862 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.073182 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.073486 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.073747 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.073907 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.087051 4743 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.100295 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.110007 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.125722 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.136038 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.145761 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.149951 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.149989 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150007 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150027 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150044 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150064 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150084 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150100 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150124 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150139 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150155 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150173 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150194 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150216 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150236 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150253 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150269 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150287 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150402 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150410 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150444 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150524 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150729 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150783 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150826 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.150854 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.151210 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.151500 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.151516 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.151599 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.151659 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.151725 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.151749 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.151758 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.151799 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.151936 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152152 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152161 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152207 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152241 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152254 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152300 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152330 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152366 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152392 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152410 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152424 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152525 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152537 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152564 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152614 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152640 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152663 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152692 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152718 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152939 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152974 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.152981 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153002 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153028 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153053 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153081 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153105 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153114 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153131 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153164 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153189 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153213 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153226 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153237 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153266 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153295 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153367 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153393 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153437 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153465 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153496 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153524 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153555 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153601 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153633 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153659 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153693 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153752 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153781 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153808 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153837 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153864 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153888 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153925 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153951 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153977 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154030 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154055 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153403 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154106 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153435 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154107 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153449 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153556 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153655 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153682 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153687 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153750 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154176 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154240 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154263 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154291 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154317 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154340 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154361 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154386 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154409 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154431 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154454 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154475 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154517 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154539 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154559 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154598 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154620 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154640 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154663 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154688 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154712 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154735 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154755 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154779 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154805 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154827 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154848 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154867 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154889 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154908 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154927 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154948 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154968 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154989 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155011 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155031 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155062 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155082 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155101 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155121 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155146 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155166 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155189 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155215 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155244 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155269 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155293 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155313 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155333 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155355 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155376 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155397 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155417 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155436 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155459 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155501 
4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155522 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155546 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155567 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155614 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155639 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155664 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155687 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155711 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155734 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 08:22:28 crc 
kubenswrapper[4743]: I1122 08:22:28.155757 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155779 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155801 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155821 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155843 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155865 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155886 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155911 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155956 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155982 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156029 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156051 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156072 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156095 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156145 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156168 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156197 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156223 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156247 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156270 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156298 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156324 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156349 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156379 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156403 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156429 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156451 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156476 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156500 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156522 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156546 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156724 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156840 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156880 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156906 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156932 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156955 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156977 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.157000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.157026 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153891 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.157377 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153895 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153828 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153923 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153949 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153963 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154100 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.153442 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154287 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154361 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154435 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154458 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154631 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.154734 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155200 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155597 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155604 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155623 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155675 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155685 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155897 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.155952 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156241 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156252 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156271 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.156704 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.157026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.157910 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.157911 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.158117 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.158139 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.158204 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.158399 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.158778 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.158957 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.159021 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.159087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.159168 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.159211 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.159353 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.159350 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.159353 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.159446 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.159449 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.159727 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160373 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160460 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160494 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160522 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160547 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160613 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160635 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160660 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160673 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160688 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160712 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160782 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160816 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160851 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160878 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160884 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160907 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.160985 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161010 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161056 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161081 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161107 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161231 4743 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161246 4743 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161260 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161275 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161288 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161322 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161336 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161349 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161361 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161375 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161387 4743 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161397 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161406 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161416 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161426 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161435 4743 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161445 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161454 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161452 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161463 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161474 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161509 4743 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161807 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161829 4743 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161847 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161864 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161880 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161896 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161911 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161927 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161941 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161952 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161965 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161980 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") 
on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.161990 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162000 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162011 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162021 4743 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162031 4743 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162041 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162051 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162061 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162071 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162080 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162090 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162100 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162112 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 
08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162122 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162133 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162144 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162154 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162164 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162176 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162187 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162198 4743 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162212 4743 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162225 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162235 4743 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162245 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162255 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162265 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162274 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162283 4743 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162293 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162302 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162311 4743 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162321 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162330 4743 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162346 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162357 4743 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162367 4743 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162380 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162389 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162400 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162409 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162418 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162428 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162437 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162446 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162456 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162467 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162476 4743 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162486 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162497 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162507 4743 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162518 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162709 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162874 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.162930 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.163129 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.163191 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.163360 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.163615 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.163638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.163647 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.163662 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.163766 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.163966 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.163968 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.164244 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.164403 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.164545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.164627 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.164809 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.164884 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.164940 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.165171 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.165190 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.165267 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.165562 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.165621 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.165908 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.166077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.166406 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.166498 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.166845 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.166879 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.166994 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.167029 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.167037 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.167515 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.167528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.168028 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.168376 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.167611 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.168500 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.169278 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.169794 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.169835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.169872 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.170167 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.170196 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.170503 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.170953 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.170991 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.171053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.171512 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.171711 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.171977 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.172023 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.172469 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.169810 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.172768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.172842 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.172956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.173540 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.173818 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.174214 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.174245 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.174392 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.174483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.174621 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.175053 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.175084 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.173759 4743 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.175570 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.175678 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:28.675645365 +0000 UTC m=+22.382006417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.176032 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:22:28.675998073 +0000 UTC m=+22.382359145 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.176327 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.180394 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.180532 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.180617 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.180691 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:28.68066998 +0000 UTC m=+22.387031032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.181157 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.181482 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.182016 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.182465 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.188429 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.188700 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.188776 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.189063 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:28.689031975 +0000 UTC m=+22.395393027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.189052 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.189809 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.189810 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.190471 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.190515 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.190544 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.190562 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.190647 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.190661 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:28.690638447 +0000 UTC m=+22.396999709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.190779 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.191065 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.191166 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.191206 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.191205 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.191474 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.191430 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.191691 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.191981 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.192134 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.192336 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.192619 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.193094 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.193391 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.193488 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.193697 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.193707 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.193912 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.194346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.196514 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.198066 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.200142 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.201615 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.201727 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.203077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.203538 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.203608 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.204390 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.206009 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.207301 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.207385 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.207942 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.208286 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.209732 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.214398 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.234494 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.245858 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.250205 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.265940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266002 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266067 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266080 4743 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266090 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266100 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266110 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266119 4743 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266127 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266136 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266144 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266152 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266162 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266173 4743 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266183 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266193 4743 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266201 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266209 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266217 4743 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266225 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266232 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266241 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266249 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266257 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266264 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266272 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266280 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266290 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266302 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266315 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266326 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266336 4743 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266346 4743 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266355 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266365 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266375 4743 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266384 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266392 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266401 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266411 4743 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266422 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266430 4743 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266440 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266449 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266458 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266466 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266474 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266484 4743 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266492 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266501 4743 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266509 4743 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266517 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266526 4743 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266534 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266543 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266551 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266560 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266568 4743 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266594 4743 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266604 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266612 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266623 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266707 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266770 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266786 4743 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266801 4743 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266817 4743 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266859 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266876 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266891 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266929 4743 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266943 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266958 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266971 4743 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.266985 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267008 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") "
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267023 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267040 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267054 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267091 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267109 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267123 4743 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267139 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267175 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267190 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267204 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267220 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267254 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267269 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267284 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267299 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267332 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267347 4743 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267361 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267375 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267408 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267425 4743 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267439 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267453 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267487 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267503 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267517 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267530 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267543 4743 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267592 4743 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267610 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267627 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267701 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267782 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267800 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267853 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267868 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.267881 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.319464 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.389428 4743 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.394644 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.396397 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Nov 22 08:22:28 crc kubenswrapper[4743]: W1122 08:22:28.423614 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-70b17ba3f066be1b252b94d06df1e5d42246836f574f4bd47d51d79f3ddbe11f WatchSource:0}: Error finding container 70b17ba3f066be1b252b94d06df1e5d42246836f574f4bd47d51d79f3ddbe11f: Status 404 returned error can't find the container with id 70b17ba3f066be1b252b94d06df1e5d42246836f574f4bd47d51d79f3ddbe11f Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.772529 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.772695 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:22:29.77267734 +0000 UTC m=+23.479038392 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.773015 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.773042 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.773074 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773090 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.773100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773146 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:29.773135493 +0000 UTC m=+23.479496545 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773226 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773226 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773261 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773282 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773294 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773244 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773325 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773326 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:29.773304481 +0000 UTC m=+23.479665563 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773365 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:29.773355524 +0000 UTC m=+23.479716656 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:28 crc kubenswrapper[4743]: E1122 08:22:28.773381 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:29.773373485 +0000 UTC m=+23.479734607 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.876730 4743 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Nov 22 08:22:28 crc kubenswrapper[4743]: I1122 08:22:28.876789 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.150770 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.150946 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.154418 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.154976 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.155818 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.156427 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.157039 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.157606 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.158215 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.158776 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.159391 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.159964 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.160455 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.161166 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.162487 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.163203 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.163893 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.164540 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.165241 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.165790 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.166473 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.167193 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.167767 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.168457 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.169003 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.169910 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.170541 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.174117 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.177331 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.177999 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.178851 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.179861 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.180306 4743 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.180405 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.182889 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.183520 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.184044 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.186477 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.187390 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.188033 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.189328 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.190615 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.191166 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.192076 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.193323 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.194484 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.195503 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.196678 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.197472 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.198916 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.199444 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.200481 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.201062 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.201838 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.202959 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.203506 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.301116 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942"} Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.301182 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9"} Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.301197 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"15d0255dc434ae4b8be89acd9565c4d2df79dc3a9fdfdf4e3861777ba5f0167a"} Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.302604 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca"} Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.302644 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"70b17ba3f066be1b252b94d06df1e5d42246836f574f4bd47d51d79f3ddbe11f"} Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.304105 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.304664 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.306681 4743 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec" exitCode=255 Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.306751 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec"} Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.306826 4743 scope.go:117] "RemoveContainer" containerID="ad7d06a741a8b8801f1773c7ae1614edba076eedb2c55a00263f4d6cf6d78379" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.307635 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"e44215c7fb8ab0aab591fd03a792abb0b3f53599e3657c75ada6a5f9dad0c272"} Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.317192 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.321596 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.321911 4743 scope.go:117] "RemoveContainer" containerID="33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec" Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.322136 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.331953 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.350829 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.363289 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.376838 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.396849 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.421733 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.433162 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.446629 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.459278 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.473634 4743 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.489938 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.507154 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.511654 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7v699"] Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.512002 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7v699" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.513744 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.513754 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.513927 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.522658 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d06a741a8b8801f1773c7ae1614edba076eedb2c55a00263f4d6cf6d78379\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"message\\\":\\\"W1122 08:22:10.464787 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 
08:22:10.465144 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763799730 cert, and key in /tmp/serving-cert-126359243/serving-signer.crt, /tmp/serving-cert-126359243/serving-signer.key\\\\nI1122 08:22:10.730707 1 observer_polling.go:159] Starting file observer\\\\nW1122 08:22:10.733916 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 08:22:10.734116 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:10.737709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-126359243/tls.crt::/tmp/serving-cert-126359243/tls.key\\\\\\\"\\\\nF1122 08:22:11.032398 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.547218 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.564016 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d06a741a8b8801f1773c7ae1614edba076eedb2c55a00263f4d6cf6d78379\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"message\\\":\\\"W1122 08:22:10.464787 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 
08:22:10.465144 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763799730 cert, and key in /tmp/serving-cert-126359243/serving-signer.crt, /tmp/serving-cert-126359243/serving-signer.key\\\\nI1122 08:22:10.730707 1 observer_polling.go:159] Starting file observer\\\\nW1122 08:22:10.733916 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 08:22:10.734116 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:10.737709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-126359243/tls.crt::/tmp/serving-cert-126359243/tls.key\\\\\\\"\\\\nF1122 08:22:11.032398 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.590038 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.605796 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.617509 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.629133 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.643513 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.653628 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.667907 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.679787 
4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx9j4\" (UniqueName: \"kubernetes.io/projected/a5a54581-c9c3-4c51-b2ed-a3477e2a3159-kube-api-access-nx9j4\") pod \"node-resolver-7v699\" (UID: \"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\") " pod="openshift-dns/node-resolver-7v699" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.679850 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a5a54581-c9c3-4c51-b2ed-a3477e2a3159-hosts-file\") pod \"node-resolver-7v699\" (UID: \"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\") " pod="openshift-dns/node-resolver-7v699" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.683986 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.780466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.780529 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.780555 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.780599 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.780627 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx9j4\" (UniqueName: \"kubernetes.io/projected/a5a54581-c9c3-4c51-b2ed-a3477e2a3159-kube-api-access-nx9j4\") pod \"node-resolver-7v699\" (UID: \"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\") " pod="openshift-dns/node-resolver-7v699" Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780661 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 08:22:31.780640935 +0000 UTC m=+25.487001997 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.780686 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.780715 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a5a54581-c9c3-4c51-b2ed-a3477e2a3159-hosts-file\") pod \"node-resolver-7v699\" (UID: \"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\") " pod="openshift-dns/node-resolver-7v699" Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780764 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780798 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780809 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780838 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780858 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780869 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780877 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:31.780861327 +0000 UTC m=+25.487222379 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780909 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:31.780898378 +0000 UTC m=+25.487259520 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.780773 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a5a54581-c9c3-4c51-b2ed-a3477e2a3159-hosts-file\") pod \"node-resolver-7v699\" (UID: \"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\") " pod="openshift-dns/node-resolver-7v699" Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780935 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780967 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:31.780958181 +0000 UTC m=+25.487319333 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.780993 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:22:29 crc kubenswrapper[4743]: E1122 08:22:29.781045 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:31.781038156 +0000 UTC m=+25.487399208 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.799379 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx9j4\" (UniqueName: \"kubernetes.io/projected/a5a54581-c9c3-4c51-b2ed-a3477e2a3159-kube-api-access-nx9j4\") pod \"node-resolver-7v699\" (UID: \"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\") " pod="openshift-dns/node-resolver-7v699" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.823452 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7v699" Nov 22 08:22:29 crc kubenswrapper[4743]: W1122 08:22:29.841591 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5a54581_c9c3_4c51_b2ed_a3477e2a3159.slice/crio-b657d5aa1f99728d3d90c7bed13d6013dce281ac8ec504782b8d8fdcc908bf54 WatchSource:0}: Error finding container b657d5aa1f99728d3d90c7bed13d6013dce281ac8ec504782b8d8fdcc908bf54: Status 404 returned error can't find the container with id b657d5aa1f99728d3d90c7bed13d6013dce281ac8ec504782b8d8fdcc908bf54 Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.889603 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-cbpnf"] Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.889930 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.890281 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-mwvcl"] Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.891065 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xk98p"] Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.891302 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.891672 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.894285 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.894670 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p8glw"] Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.895375 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.895647 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.895680 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.899519 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.901181 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.901759 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.901859 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.902056 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.902271 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.902540 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.902618 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.902816 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.902960 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.903024 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.903116 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.903161 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.903384 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.903511 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.903658 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.916615 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.933198 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.953111 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.966392 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bae39197-d188-40a8-880d-0d2e6e528f86-rootfs\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984138 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8fcce96-e512-4437-bf8f-d56269b1ce26-cni-binary-copy\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-config\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984232 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-var-lib-kubelet\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984263 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-node-log\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984291 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-os-release\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-socket-dir-parent\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-etc-openvswitch\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984372 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-netd\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984399 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-hostroot\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984459 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-daemon-config\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984488 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd9v8\" (UniqueName: \"kubernetes.io/projected/a1de4b47-eed0-431f-a7a9-a944ce8791bd-kube-api-access-hd9v8\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-kubelet\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984542 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-slash\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984607 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bae39197-d188-40a8-880d-0d2e6e528f86-proxy-tls\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984641 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4knjx\" (UniqueName: \"kubernetes.io/projected/f8fcce96-e512-4437-bf8f-d56269b1ce26-kube-api-access-4knjx\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-cnibin\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-var-lib-cni-bin\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-systemd-units\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984756 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-systemd\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984785 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-script-lib\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984826 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-system-cni-dir\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984855 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-run-k8s-cni-cncf-io\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984882 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-etc-kubernetes\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984909 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-system-cni-dir\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984943 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-run-multus-certs\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.984996 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-bin\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985032 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-cnibin\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 
08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985094 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a1de4b47-eed0-431f-a7a9-a944ce8791bd-cni-binary-copy\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985121 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-conf-dir\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35d29494-f9cd-46b7-be04-d7a848a72fee-ovn-node-metrics-cert\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-log-socket\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985207 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-cni-dir\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985233 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-var-lib-cni-multus\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985263 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-openvswitch\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-ovn\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985318 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8fcce96-e512-4437-bf8f-d56269b1ce26-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985349 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-run-netns\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkdp6\" (UniqueName: \"kubernetes.io/projected/35d29494-f9cd-46b7-be04-d7a848a72fee-kube-api-access-vkdp6\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985407 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bae39197-d188-40a8-880d-0d2e6e528f86-mcd-auth-proxy-config\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985436 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ddkv\" (UniqueName: \"kubernetes.io/projected/bae39197-d188-40a8-880d-0d2e6e528f86-kube-api-access-9ddkv\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985352 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d06a741a8b8801f1773c7ae1614edba076eedb2c55a00263f4d6cf6d78379\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"message\\\":\\\"W1122 08:22:10.464787 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 
08:22:10.465144 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763799730 cert, and key in /tmp/serving-cert-126359243/serving-signer.crt, /tmp/serving-cert-126359243/serving-signer.key\\\\nI1122 08:22:10.730707 1 observer_polling.go:159] Starting file observer\\\\nW1122 08:22:10.733916 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 08:22:10.734116 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:10.737709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-126359243/tls.crt::/tmp/serving-cert-126359243/tls.key\\\\\\\"\\\\nF1122 08:22:11.032398 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:29Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985463 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-os-release\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-netns\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-var-lib-openvswitch\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985854 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:29 crc kubenswrapper[4743]: I1122 08:22:29.985880 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-env-overrides\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.055563 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.071899 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086256 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-var-lib-cni-bin\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086435 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-systemd-units\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-systemd\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-script-lib\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086496 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-etc-kubernetes\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086521 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-system-cni-dir\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086533 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-systemd\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086544 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-run-k8s-cni-cncf-io\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-var-lib-cni-bin\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086570 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-system-cni-dir\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086610 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-system-cni-dir\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-run-multus-certs\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-run-k8s-cni-cncf-io\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086644 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-bin\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-etc-kubernetes\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086664 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086683 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-bin\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a1de4b47-eed0-431f-a7a9-a944ce8791bd-cni-binary-copy\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086518 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-systemd-units\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086712 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-conf-dir\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086737 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-cnibin\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086758 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-log-socket\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086779 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35d29494-f9cd-46b7-be04-d7a848a72fee-ovn-node-metrics-cert\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-cni-dir\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086821 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-var-lib-cni-multus\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086842 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-openvswitch\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086865 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-ovn\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkdp6\" (UniqueName: \"kubernetes.io/projected/35d29494-f9cd-46b7-be04-d7a848a72fee-kube-api-access-vkdp6\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086912 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8fcce96-e512-4437-bf8f-d56269b1ce26-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086933 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-run-netns\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bae39197-d188-40a8-880d-0d2e6e528f86-mcd-auth-proxy-config\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086971 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ddkv\" (UniqueName: \"kubernetes.io/projected/bae39197-d188-40a8-880d-0d2e6e528f86-kube-api-access-9ddkv\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086994 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-os-release\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-env-overrides\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-netns\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-var-lib-openvswitch\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087093 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bae39197-d188-40a8-880d-0d2e6e528f86-rootfs\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087109 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8fcce96-e512-4437-bf8f-d56269b1ce26-cni-binary-copy\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087126 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-config\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087177 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-var-lib-kubelet\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087187 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-node-log\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087215 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-netd\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087222 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-run-netns\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086690 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-run-multus-certs\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087235 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-conf-dir\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-os-release\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-socket-dir-parent\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-etc-openvswitch\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087307 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-kubelet\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087321 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-slash\") pod \"ovnkube-node-p8glw\" (UID: 
\"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087336 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-hostroot\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087351 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-daemon-config\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd9v8\" (UniqueName: \"kubernetes.io/projected/a1de4b47-eed0-431f-a7a9-a944ce8791bd-kube-api-access-hd9v8\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087365 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a1de4b47-eed0-431f-a7a9-a944ce8791bd-cni-binary-copy\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.086665 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-system-cni-dir\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087387 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bae39197-d188-40a8-880d-0d2e6e528f86-proxy-tls\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087401 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-script-lib\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087428 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4knjx\" (UniqueName: \"kubernetes.io/projected/f8fcce96-e512-4437-bf8f-d56269b1ce26-kube-api-access-4knjx\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087458 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-cnibin\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: 
I1122 08:22:30.087535 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-cnibin\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087856 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-os-release\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087919 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8fcce96-e512-4437-bf8f-d56269b1ce26-cnibin\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-slash\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087954 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-socket-dir-parent\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087958 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-log-socket\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.087996 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-etc-openvswitch\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088019 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-kubelet\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bae39197-d188-40a8-880d-0d2e6e528f86-mcd-auth-proxy-config\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088411 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-env-overrides\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088427 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-var-lib-cni-multus\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088485 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-cni-dir\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088511 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-openvswitch\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088532 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-node-log\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-host-var-lib-kubelet\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088589 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-ovn\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088613 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-netd\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088661 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a1de4b47-eed0-431f-a7a9-a944ce8791bd-multus-daemon-config\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088709 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-hostroot\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 
08:22:30.088742 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-var-lib-openvswitch\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088768 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-netns\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088781 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088813 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-ovn-kubernetes\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088877 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8fcce96-e512-4437-bf8f-d56269b1ce26-cni-binary-copy\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.088890 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/bae39197-d188-40a8-880d-0d2e6e528f86-rootfs\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.089024 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a1de4b47-eed0-431f-a7a9-a944ce8791bd-os-release\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.089089 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-config\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.089681 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8fcce96-e512-4437-bf8f-d56269b1ce26-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.092359 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bae39197-d188-40a8-880d-0d2e6e528f86-proxy-tls\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.092899 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35d29494-f9cd-46b7-be04-d7a848a72fee-ovn-node-metrics-cert\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.103679 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.103945 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ddkv\" (UniqueName: \"kubernetes.io/projected/bae39197-d188-40a8-880d-0d2e6e528f86-kube-api-access-9ddkv\") pod \"machine-config-daemon-xk98p\" (UID: \"bae39197-d188-40a8-880d-0d2e6e528f86\") " pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.106185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkdp6\" (UniqueName: \"kubernetes.io/projected/35d29494-f9cd-46b7-be04-d7a848a72fee-kube-api-access-vkdp6\") pod \"ovnkube-node-p8glw\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.107000 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd9v8\" (UniqueName: \"kubernetes.io/projected/a1de4b47-eed0-431f-a7a9-a944ce8791bd-kube-api-access-hd9v8\") pod \"multus-cbpnf\" (UID: \"a1de4b47-eed0-431f-a7a9-a944ce8791bd\") " pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.116041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4knjx\" (UniqueName: \"kubernetes.io/projected/f8fcce96-e512-4437-bf8f-d56269b1ce26-kube-api-access-4knjx\") pod \"multus-additional-cni-plugins-mwvcl\" (UID: \"f8fcce96-e512-4437-bf8f-d56269b1ce26\") " pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.118923 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.134391 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.149955 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.151077 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:30 crc kubenswrapper[4743]: E1122 08:22:30.151199 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.151077 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:30 crc kubenswrapper[4743]: E1122 08:22:30.151387 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.161867 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.174917 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.188461 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.202887 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ad7d06a741a8b8801f1773c7ae1614edba076eedb2c55a00263f4d6cf6d78379\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"message\\\":\\\"W1122 08:22:10.464787 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1122 08:22:10.465144 1 crypto.go:601] Generating new CA for check-endpoints-signer@1763799730 cert, and key in /tmp/serving-cert-126359243/serving-signer.crt, /tmp/serving-cert-126359243/serving-signer.key\\\\nI1122 08:22:10.730707 1 observer_polling.go:159] Starting file observer\\\\nW1122 08:22:10.733916 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1122 08:22:10.734116 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:10.737709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-126359243/tls.crt::/tmp/serving-cert-126359243/tls.key\\\\\\\"\\\\nF1122 08:22:11.032398 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for 
mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" 
Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.205495 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cbpnf" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.216164 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.218041 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: W1122 08:22:30.221073 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1de4b47_eed0_431f_a7a9_a944ce8791bd.slice/crio-2b1ea3c6977a856f9d7b832de0a635ca12372403ff5a5c516944767a5c9f70a6 WatchSource:0}: Error finding container 2b1ea3c6977a856f9d7b832de0a635ca12372403ff5a5c516944767a5c9f70a6: Status 404 returned error can't find the container with id 2b1ea3c6977a856f9d7b832de0a635ca12372403ff5a5c516944767a5c9f70a6 Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.222330 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" Nov 22 08:22:30 crc kubenswrapper[4743]: W1122 08:22:30.233351 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbae39197_d188_40a8_880d_0d2e6e528f86.slice/crio-a0ac10b67a3b7b3e714d9b08e3cd51183a82f8ccfe99aa2f2086247475ed18e3 WatchSource:0}: Error finding container a0ac10b67a3b7b3e714d9b08e3cd51183a82f8ccfe99aa2f2086247475ed18e3: Status 404 returned error can't find the container with id a0ac10b67a3b7b3e714d9b08e3cd51183a82f8ccfe99aa2f2086247475ed18e3 Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.234411 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.235507 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: W1122 08:22:30.245821 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8fcce96_e512_4437_bf8f_d56269b1ce26.slice/crio-65828a13ce2d94da6cffdbd532111193288f51a58452df82b9b89a5c5af939a4 WatchSource:0}: Error finding container 65828a13ce2d94da6cffdbd532111193288f51a58452df82b9b89a5c5af939a4: Status 404 returned error can't find the container with id 65828a13ce2d94da6cffdbd532111193288f51a58452df82b9b89a5c5af939a4 Nov 22 08:22:30 crc kubenswrapper[4743]: W1122 08:22:30.261445 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d29494_f9cd_46b7_be04_d7a848a72fee.slice/crio-1eafb21df4fb93347557916ab6b8edc7a938e0ad4966be779b032072746d9792 WatchSource:0}: Error finding container 1eafb21df4fb93347557916ab6b8edc7a938e0ad4966be779b032072746d9792: Status 404 returned error can't find the container with id 1eafb21df4fb93347557916ab6b8edc7a938e0ad4966be779b032072746d9792 Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.271776 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.300021 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.311231 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"a0ac10b67a3b7b3e714d9b08e3cd51183a82f8ccfe99aa2f2086247475ed18e3"} Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 
08:22:30.312775 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.319197 4743 scope.go:117] "RemoveContainer" containerID="33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec" Nov 22 08:22:30 crc kubenswrapper[4743]: E1122 08:22:30.319381 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.332851 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cbpnf" event={"ID":"a1de4b47-eed0-431f-a7a9-a944ce8791bd","Type":"ContainerStarted","Data":"2b1ea3c6977a856f9d7b832de0a635ca12372403ff5a5c516944767a5c9f70a6"} Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.334175 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.335700 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7v699" event={"ID":"a5a54581-c9c3-4c51-b2ed-a3477e2a3159","Type":"ContainerStarted","Data":"ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10"} Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.335750 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7v699" event={"ID":"a5a54581-c9c3-4c51-b2ed-a3477e2a3159","Type":"ContainerStarted","Data":"b657d5aa1f99728d3d90c7bed13d6013dce281ac8ec504782b8d8fdcc908bf54"} Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.343231 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"1eafb21df4fb93347557916ab6b8edc7a938e0ad4966be779b032072746d9792"} Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.343852 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" event={"ID":"f8fcce96-e512-4437-bf8f-d56269b1ce26","Type":"ContainerStarted","Data":"65828a13ce2d94da6cffdbd532111193288f51a58452df82b9b89a5c5af939a4"} Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.351712 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.370723 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.401635 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.415281 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.430435 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.447154 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.467832 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.481894 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.505116 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.517850 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.542016 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.560050 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.580556 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.591328 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:30 crc kubenswrapper[4743]: I1122 08:22:30.601661 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:22:30Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.108812 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.151507 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.151701 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.348554 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2"} Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.350408 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4" exitCode=0 Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.350812 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4"} Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.352199 4743 generic.go:334] "Generic (PLEG): container finished" podID="f8fcce96-e512-4437-bf8f-d56269b1ce26" containerID="037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f" exitCode=0 Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.352262 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" event={"ID":"f8fcce96-e512-4437-bf8f-d56269b1ce26","Type":"ContainerDied","Data":"037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f"} Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.357095 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cbpnf" event={"ID":"a1de4b47-eed0-431f-a7a9-a944ce8791bd","Type":"ContainerStarted","Data":"42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b"} Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.359023 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24"} Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.359055 4743 scope.go:117] "RemoveContainer" containerID="33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.359059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" 
event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202"} Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.359171 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.366494 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.382706 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.396254 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.417750 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name
\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not 
yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.438775 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syn
cer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.456967 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.473336 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.494339 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.513102 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.523913 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.537195 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.549148 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.564953 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.587633 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22
T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.601870 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.613593 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.629556 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.646387 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.663335 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.677050 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.692616 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc 
kubenswrapper[4743]: I1122 08:22:31.708050 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.728198 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.743014 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.755166 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.786180 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff
849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:31Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.802642 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.802759 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.802784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.802810 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:31 crc kubenswrapper[4743]: I1122 08:22:31.802826 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.802912 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.802959 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:35.802944543 +0000 UTC m=+29.509305595 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803198 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:22:35.803183675 +0000 UTC m=+29.509544727 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803277 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803303 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803323 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803332 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803195 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803354 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803340 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803399 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:35.803378835 +0000 UTC m=+29.509739937 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803419 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:35.803411556 +0000 UTC m=+29.509772688 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:31 crc kubenswrapper[4743]: E1122 08:22:31.803446 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:35.803427917 +0000 UTC m=+29.509789009 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.151564 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.151621 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:32 crc kubenswrapper[4743]: E1122 08:22:32.152093 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:32 crc kubenswrapper[4743]: E1122 08:22:32.152203 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.364272 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" event={"ID":"f8fcce96-e512-4437-bf8f-d56269b1ce26","Type":"ContainerStarted","Data":"340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83"} Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.372884 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f"} Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.372943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12"} Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.372957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce"} Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.372966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382"} Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.372976 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809"} Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.372995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d"} Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.378889 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.390951 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.408214 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.418615 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 
08:22:32.429360 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.442405 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.456125 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.469702 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.480767 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.515857 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff
849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.552357 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed
9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.565037 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:32 crc kubenswrapper[4743]: I1122 08:22:32.575774 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:32Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.151458 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:33 crc kubenswrapper[4743]: E1122 08:22:33.151642 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.377363 4743 generic.go:334] "Generic (PLEG): container finished" podID="f8fcce96-e512-4437-bf8f-d56269b1ce26" containerID="340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83" exitCode=0 Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.377561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" event={"ID":"f8fcce96-e512-4437-bf8f-d56269b1ce26","Type":"ContainerDied","Data":"340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83"} Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.395335 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.410068 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.423994 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.436162 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.453867 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-
22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.466566 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.480450 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.498160 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727
bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.514101 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.530697 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.551779 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.571361 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.584431 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.844027 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.848245 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.852397 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.858771 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.870560 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.883376 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.894816 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.910819 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-
22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.923819 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.934775 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.960226 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727
bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.975366 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:33 crc kubenswrapper[4743]: I1122 08:22:33.990113 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.003944 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.026537 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.038903 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.047077 4743 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.049287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.049410 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.049481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.049678 4743 kubelet_node_status.go:76] "Attempting to register node" node="crc" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.051527 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.056359 4743 kubelet_node_status.go:115] "Node was previously registered" node="crc" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.056728 4743 kubelet_node_status.go:79] "Successfully registered node" node="crc" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.058079 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc 
kubenswrapper[4743]: I1122 08:22:34.058116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.058125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.058143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.058154 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.066394 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: E1122 08:22:34.075405 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.078710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.078746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.078755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.078769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.078780 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.083075 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: E1122 08:22:34.088709 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.091524 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.091883 4743 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.091912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.091921 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.091935 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.091947 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: E1122 08:22:34.101208 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.104894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.104927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.104936 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.104951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.104961 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.106346 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-
22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: E1122 08:22:34.116989 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.122100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.122564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.122620 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.122640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.122652 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.123105 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.134725 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: E1122 08:22:34.136610 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b
3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: E1122 08:22:34.136744 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.138391 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.138421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.138435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.138453 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.138465 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.151249 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.151305 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:34 crc kubenswrapper[4743]: E1122 08:22:34.151381 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:34 crc kubenswrapper[4743]: E1122 08:22:34.151427 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.153026 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[
{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.165279 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.177848 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.188900 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.206709 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-
22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.217891 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.228957 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.240833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.240912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.240928 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.240955 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.240972 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.343670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.343706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.343716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.343731 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.343741 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.374404 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gmgcj"] Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.374879 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.382282 4743 generic.go:334] "Generic (PLEG): container finished" podID="f8fcce96-e512-4437-bf8f-d56269b1ce26" containerID="f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563" exitCode=0 Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.382338 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" event={"ID":"f8fcce96-e512-4437-bf8f-d56269b1ce26","Type":"ContainerDied","Data":"f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.383023 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.383112 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.383155 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.383225 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.422114 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.446308 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.446354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.446368 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.446387 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.446399 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.455891 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.470569 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.497596 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.511485 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.525943 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.531216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/06d28622-f91f-485b-9396-f489884f2c13-serviceca\") pod \"node-ca-gmgcj\" (UID: \"06d28622-f91f-485b-9396-f489884f2c13\") " pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.531263 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06d28622-f91f-485b-9396-f489884f2c13-host\") pod \"node-ca-gmgcj\" (UID: \"06d28622-f91f-485b-9396-f489884f2c13\") " pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.531287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdbcw\" (UniqueName: \"kubernetes.io/projected/06d28622-f91f-485b-9396-f489884f2c13-kube-api-access-cdbcw\") pod \"node-ca-gmgcj\" (UID: \"06d28622-f91f-485b-9396-f489884f2c13\") " pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.539300 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.548629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.548667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.548677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.548693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.548704 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.553239 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.582254 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.596476 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.611944 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.623094 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.631984 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/06d28622-f91f-485b-9396-f489884f2c13-serviceca\") pod \"node-ca-gmgcj\" (UID: \"06d28622-f91f-485b-9396-f489884f2c13\") " pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.632028 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06d28622-f91f-485b-9396-f489884f2c13-host\") pod \"node-ca-gmgcj\" (UID: \"06d28622-f91f-485b-9396-f489884f2c13\") " pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.632049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdbcw\" (UniqueName: \"kubernetes.io/projected/06d28622-f91f-485b-9396-f489884f2c13-kube-api-access-cdbcw\") pod \"node-ca-gmgcj\" (UID: \"06d28622-f91f-485b-9396-f489884f2c13\") " pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.632111 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/06d28622-f91f-485b-9396-f489884f2c13-host\") pod \"node-ca-gmgcj\" (UID: \"06d28622-f91f-485b-9396-f489884f2c13\") " pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 
crc kubenswrapper[4743]: I1122 08:22:34.634091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/06d28622-f91f-485b-9396-f489884f2c13-serviceca\") pod \"node-ca-gmgcj\" (UID: \"06d28622-f91f-485b-9396-f489884f2c13\") " pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.641069 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.651326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.651361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.651369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.651382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.651391 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.653162 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdbcw\" (UniqueName: \"kubernetes.io/projected/06d28622-f91f-485b-9396-f489884f2c13-kube-api-access-cdbcw\") pod \"node-ca-gmgcj\" (UID: \"06d28622-f91f-485b-9396-f489884f2c13\") " pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.656972 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\
"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.671647 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-
22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.688290 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z 
is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.688640 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gmgcj" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.702756 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting 
failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.714605 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.724933 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.735095 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: W1122 08:22:34.740085 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d28622_f91f_485b_9396_f489884f2c13.slice/crio-9349631a0ab05c27894f191d1c9d5509553a3a853cce7aa3b59750426583e621 WatchSource:0}: Error finding container 9349631a0ab05c27894f191d1c9d5509553a3a853cce7aa3b59750426583e621: Status 404 returned error can't find the container with id 9349631a0ab05c27894f191d1c9d5509553a3a853cce7aa3b59750426583e621 Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.753949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.753985 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.753994 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.754008 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.754017 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.758495 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.771774 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.785043 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.794411 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.807822 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.822233 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.835491 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.851261 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.858491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.858528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.858536 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.858551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.858561 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.864097 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.876944 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.961173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.961221 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.961234 4743 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.961255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:34 crc kubenswrapper[4743]: I1122 08:22:34.961272 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:34Z","lastTransitionTime":"2025-11-22T08:22:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.064178 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.064214 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.064226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.064241 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.064253 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:35Z","lastTransitionTime":"2025-11-22T08:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.151566 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.152088 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.166733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.166787 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.166802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.166826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.166842 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:35Z","lastTransitionTime":"2025-11-22T08:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.270291 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.270336 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.270346 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.270361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.270371 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:35Z","lastTransitionTime":"2025-11-22T08:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.372603 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.372628 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.372635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.372648 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.372657 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:35Z","lastTransitionTime":"2025-11-22T08:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.395706 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.404296 4743 generic.go:334] "Generic (PLEG): container finished" podID="f8fcce96-e512-4437-bf8f-d56269b1ce26" containerID="db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60" exitCode=0 Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.404337 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" event={"ID":"f8fcce96-e512-4437-bf8f-d56269b1ce26","Type":"ContainerDied","Data":"db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.411168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gmgcj" event={"ID":"06d28622-f91f-485b-9396-f489884f2c13","Type":"ContainerStarted","Data":"42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.411251 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gmgcj" event={"ID":"06d28622-f91f-485b-9396-f489884f2c13","Type":"ContainerStarted","Data":"9349631a0ab05c27894f191d1c9d5509553a3a853cce7aa3b59750426583e621"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.421698 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.436711 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.457746 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526
b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.470043 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.478594 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.478628 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.478640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.478657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.478668 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:35Z","lastTransitionTime":"2025-11-22T08:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.481622 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.492758 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.501212 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.512424 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.525243 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.536963 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.550349 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.561782 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.578719 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727
bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.581068 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.581112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.581124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.581141 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.581152 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:35Z","lastTransitionTime":"2025-11-22T08:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.590969 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.606899 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.619794 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.631622 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.643696 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.655356 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.676866 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff
849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.690537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.690597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.690607 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.690625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.690635 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:35Z","lastTransitionTime":"2025-11-22T08:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.697021 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.707423 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.718560 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.749061 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.792682 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.793303 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.793347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.793396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.793447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.793464 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:35Z","lastTransitionTime":"2025-11-22T08:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.834124 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.842853 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.843005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.843066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843135 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:22:43.843092373 +0000 UTC m=+37.549453505 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843150 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.843211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843230 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843260 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843271 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:43.843252028 +0000 UTC m=+37.549613310 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843280 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.843312 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843351 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:43.84332687 +0000 UTC m=+37.549688092 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843441 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843484 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:43.843468805 +0000 UTC m=+37.549829867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843524 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843548 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843567 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:35 crc kubenswrapper[4743]: E1122 08:22:35.843642 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:43.843627599 +0000 UTC m=+37.549988821 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.874772 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.896916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.896956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.896970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.896989 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.897000 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:35Z","lastTransitionTime":"2025-11-22T08:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.913622 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.951498 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.995888 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.999784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.999881 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.999901 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.999930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:35 crc kubenswrapper[4743]: I1122 08:22:35.999949 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:35Z","lastTransitionTime":"2025-11-22T08:22:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.102988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.103032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.103043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.103059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.103071 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:36Z","lastTransitionTime":"2025-11-22T08:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.151362 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:36 crc kubenswrapper[4743]: E1122 08:22:36.151511 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.151774 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:36 crc kubenswrapper[4743]: E1122 08:22:36.152049 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.205713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.205770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.205791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.205818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.205837 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:36Z","lastTransitionTime":"2025-11-22T08:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.308922 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.308983 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.308996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.309016 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.309028 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:36Z","lastTransitionTime":"2025-11-22T08:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.412407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.412443 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.412451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.412464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.412474 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:36Z","lastTransitionTime":"2025-11-22T08:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.418771 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" event={"ID":"f8fcce96-e512-4437-bf8f-d56269b1ce26","Type":"ContainerStarted","Data":"4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3"} Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.446978 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\
\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f5
5e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.468851 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.488177 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.504699 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.515558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.515651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.515666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.515687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.515703 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:36Z","lastTransitionTime":"2025-11-22T08:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.525397 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.542246 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.566200 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.581074 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.599302 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.616115 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.618018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.618063 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.618074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.618091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.618101 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:36Z","lastTransitionTime":"2025-11-22T08:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.633801 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.652421 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.683276 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.693894 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.715521 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff
849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:36Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.721321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.721365 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.721382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.721406 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.721422 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:36Z","lastTransitionTime":"2025-11-22T08:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.825514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.825559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.825588 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.825606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.825617 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:36Z","lastTransitionTime":"2025-11-22T08:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.928726 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.928803 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.928822 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.928853 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:36 crc kubenswrapper[4743]: I1122 08:22:36.928875 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:36Z","lastTransitionTime":"2025-11-22T08:22:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.031690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.031756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.031774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.031804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.031823 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:37Z","lastTransitionTime":"2025-11-22T08:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.134627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.134738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.134759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.134790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.134826 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:37Z","lastTransitionTime":"2025-11-22T08:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.151034 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:37 crc kubenswrapper[4743]: E1122 08:22:37.151253 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.173696 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.188419 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.211276 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.232482 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.238356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.238421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.238445 4743 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.238476 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.238496 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:37Z","lastTransitionTime":"2025-11-22T08:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.261465 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.279245 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.301560 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.320079 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.334349 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.341088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.341115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.341127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.341147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.341160 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:37Z","lastTransitionTime":"2025-11-22T08:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.348255 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.387381 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.422363 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.430999 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.443745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.443772 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.443781 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.443796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.443805 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:37Z","lastTransitionTime":"2025-11-22T08:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.445792 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.462680 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.479341 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.493375 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.510049 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.521310 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.542548 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.546068 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.546101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.546110 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.546123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.546132 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:37Z","lastTransitionTime":"2025-11-22T08:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.557476 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.573761 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.583863 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.596910 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.610511 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.627051 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.639295 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.648885 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.648931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.648943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.648962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.648975 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:37Z","lastTransitionTime":"2025-11-22T08:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.677999 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.712691 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.751551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.751657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.751678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.751704 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.751723 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:37Z","lastTransitionTime":"2025-11-22T08:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.761054 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ff
d54a0677804c7915af3d3d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.801755 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:37Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.855086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.855146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.855167 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.855194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.855211 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:37Z","lastTransitionTime":"2025-11-22T08:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.958924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.958997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.959021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.959052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:37 crc kubenswrapper[4743]: I1122 08:22:37.959074 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:37Z","lastTransitionTime":"2025-11-22T08:22:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.063241 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.063334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.063360 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.063945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.064221 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:38Z","lastTransitionTime":"2025-11-22T08:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.151065 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.151222 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:38 crc kubenswrapper[4743]: E1122 08:22:38.151420 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:38 crc kubenswrapper[4743]: E1122 08:22:38.151784 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.167443 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.167497 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.167514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.167538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.167556 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:38Z","lastTransitionTime":"2025-11-22T08:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.270525 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.270845 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.270868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.270892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.270910 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:38Z","lastTransitionTime":"2025-11-22T08:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.373811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.373877 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.373893 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.373919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.373942 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:38Z","lastTransitionTime":"2025-11-22T08:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.438413 4743 generic.go:334] "Generic (PLEG): container finished" podID="f8fcce96-e512-4437-bf8f-d56269b1ce26" containerID="4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3" exitCode=0 Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.438482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" event={"ID":"f8fcce96-e512-4437-bf8f-d56269b1ce26","Type":"ContainerDied","Data":"4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.439357 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.440330 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.440401 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.463881 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.481166 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.481220 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.481233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.481251 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.481264 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:38Z","lastTransitionTime":"2025-11-22T08:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.486116 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.499041 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.499171 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.510316 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.527743 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.543842 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.561757 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.576168 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.583158 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.583188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.583196 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.583210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.583219 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:38Z","lastTransitionTime":"2025-11-22T08:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.593281 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.612904 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.624710 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.656069 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ff
d54a0677804c7915af3d3d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.679429 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.685501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.685570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.685604 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.685629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.685646 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:38Z","lastTransitionTime":"2025-11-22T08:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.697449 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.716509 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.731868 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.744399 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.774530 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.788949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.788996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.789006 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 
08:22:38.789026 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.789036 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:38Z","lastTransitionTime":"2025-11-22T08:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.792851 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.808483 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.821480 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.831665 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.871357 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\
"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.891783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.891824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.891837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.891857 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.891872 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:38Z","lastTransitionTime":"2025-11-22T08:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.893026 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.917678 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.929339 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.940443 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.953383 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.964630 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.978815 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.991613 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:38Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.994297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.994352 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.994366 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.994388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:38 crc kubenswrapper[4743]: I1122 08:22:38.994401 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:38Z","lastTransitionTime":"2025-11-22T08:22:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.096727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.096809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.096834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.096871 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.096897 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:39Z","lastTransitionTime":"2025-11-22T08:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.151002 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:39 crc kubenswrapper[4743]: E1122 08:22:39.151661 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.201388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.201481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.201505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.201541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.201565 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:39Z","lastTransitionTime":"2025-11-22T08:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.304331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.304369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.304378 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.304396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.304410 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:39Z","lastTransitionTime":"2025-11-22T08:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.406683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.406743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.406756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.406774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.406785 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:39Z","lastTransitionTime":"2025-11-22T08:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.446476 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.509621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.509673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.509684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.509702 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.509714 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:39Z","lastTransitionTime":"2025-11-22T08:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.613304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.613369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.613382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.613412 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.613431 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:39Z","lastTransitionTime":"2025-11-22T08:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.716606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.716666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.716675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.716728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.716746 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:39Z","lastTransitionTime":"2025-11-22T08:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.820035 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.820072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.820083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.820098 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.820109 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:39Z","lastTransitionTime":"2025-11-22T08:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.923667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.923721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.923742 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.923766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:39 crc kubenswrapper[4743]: I1122 08:22:39.923784 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:39Z","lastTransitionTime":"2025-11-22T08:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.027213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.027289 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.027312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.027342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.027386 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:40Z","lastTransitionTime":"2025-11-22T08:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.130154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.130219 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.130230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.130250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.130261 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:40Z","lastTransitionTime":"2025-11-22T08:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.150778 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.150818 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:40 crc kubenswrapper[4743]: E1122 08:22:40.150999 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:40 crc kubenswrapper[4743]: E1122 08:22:40.151132 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.234673 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.234725 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.234745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.234770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.234789 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:40Z","lastTransitionTime":"2025-11-22T08:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.338229 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.338331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.338356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.338392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.338420 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:40Z","lastTransitionTime":"2025-11-22T08:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.440689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.440782 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.440801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.440836 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.440855 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:40Z","lastTransitionTime":"2025-11-22T08:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.456318 4743 generic.go:334] "Generic (PLEG): container finished" podID="f8fcce96-e512-4437-bf8f-d56269b1ce26" containerID="4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038" exitCode=0 Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.456425 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" event={"ID":"f8fcce96-e512-4437-bf8f-d56269b1ce26","Type":"ContainerDied","Data":"4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.456528 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.483565 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.524404 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.543443 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.543503 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.543517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.543541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.543556 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:40Z","lastTransitionTime":"2025-11-22T08:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.546938 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.567268 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.581992 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.595800 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.621508 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\
"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.638645 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.646241 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.646289 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.646301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.646319 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.646332 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:40Z","lastTransitionTime":"2025-11-22T08:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.654854 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.669375 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.680874 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.693271 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.704119 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.719523 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\
":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.733630 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:40Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.748815 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.748870 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.748882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.748901 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.748914 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:40Z","lastTransitionTime":"2025-11-22T08:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.850952 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.851006 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.851022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.851042 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.851057 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:40Z","lastTransitionTime":"2025-11-22T08:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.954721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.954774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.954787 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.954808 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:40 crc kubenswrapper[4743]: I1122 08:22:40.954824 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:40Z","lastTransitionTime":"2025-11-22T08:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.058017 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.058067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.058077 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.058100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.058113 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:41Z","lastTransitionTime":"2025-11-22T08:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.150738 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:41 crc kubenswrapper[4743]: E1122 08:22:41.150934 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.160217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.160254 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.160262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.160277 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.160289 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:41Z","lastTransitionTime":"2025-11-22T08:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.263439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.263483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.263495 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.263511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.263521 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:41Z","lastTransitionTime":"2025-11-22T08:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.367498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.367615 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.367645 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.367678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.367699 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:41Z","lastTransitionTime":"2025-11-22T08:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.465450 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" event={"ID":"f8fcce96-e512-4437-bf8f-d56269b1ce26","Type":"ContainerStarted","Data":"4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.470463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.470512 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.470532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.470558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.470608 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:41Z","lastTransitionTime":"2025-11-22T08:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.479393 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.490844 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.516132 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.536249 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.551918 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.570702 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.572651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.572686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.572696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.572721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.572743 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:41Z","lastTransitionTime":"2025-11-22T08:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.584283 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.607514 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.620010 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.632258 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.644664 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.662219 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.675323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.675392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.675407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.675424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.675435 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:41Z","lastTransitionTime":"2025-11-22T08:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.675668 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.688182 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.702503 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f
2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\
\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:41Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.778260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.778308 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.778323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.778343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.778356 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:41Z","lastTransitionTime":"2025-11-22T08:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.881769 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.881812 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.881823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.881839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.881848 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:41Z","lastTransitionTime":"2025-11-22T08:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.985966 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.986041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.986064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.986100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:41 crc kubenswrapper[4743]: I1122 08:22:41.986122 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:41Z","lastTransitionTime":"2025-11-22T08:22:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.088472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.088623 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.088651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.088685 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.088704 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:42Z","lastTransitionTime":"2025-11-22T08:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.150973 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.151116 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:42 crc kubenswrapper[4743]: E1122 08:22:42.151248 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:42 crc kubenswrapper[4743]: E1122 08:22:42.151479 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.151748 4743 scope.go:117] "RemoveContainer" containerID="33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.191254 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.191297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.191307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.191321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.191332 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:42Z","lastTransitionTime":"2025-11-22T08:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.294493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.294567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.294607 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.294635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.294659 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:42Z","lastTransitionTime":"2025-11-22T08:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.397243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.397283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.397295 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.397313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.397325 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:42Z","lastTransitionTime":"2025-11-22T08:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.500711 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.500750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.500758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.500774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.500785 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:42Z","lastTransitionTime":"2025-11-22T08:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.604006 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.604085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.604108 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.604140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.604163 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:42Z","lastTransitionTime":"2025-11-22T08:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.638900 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx"] Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.639309 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.641882 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.643396 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.663982 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},
{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08
:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.683900 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.698154 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.707511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.707554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.707567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.707602 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.707615 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:42Z","lastTransitionTime":"2025-11-22T08:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.714658 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.718385 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f4c88f46-4abf-4975-b03c-52d9be99a9e6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.718456 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj6r9\" (UniqueName: \"kubernetes.io/projected/f4c88f46-4abf-4975-b03c-52d9be99a9e6-kube-api-access-cj6r9\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" 
(UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.718530 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f4c88f46-4abf-4975-b03c-52d9be99a9e6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.718602 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f4c88f46-4abf-4975-b03c-52d9be99a9e6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.732227 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.746604 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.759463 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.771919 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.784152 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.804871 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.810138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.810180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.810195 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.810216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.810229 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:42Z","lastTransitionTime":"2025-11-22T08:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.819965 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f4c88f46-4abf-4975-b03c-52d9be99a9e6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.820009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f4c88f46-4abf-4975-b03c-52d9be99a9e6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.820060 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj6r9\" (UniqueName: \"kubernetes.io/projected/f4c88f46-4abf-4975-b03c-52d9be99a9e6-kube-api-access-cj6r9\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.820109 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f4c88f46-4abf-4975-b03c-52d9be99a9e6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.820749 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f4c88f46-4abf-4975-b03c-52d9be99a9e6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 
22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.820763 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f4c88f46-4abf-4975-b03c-52d9be99a9e6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.825502 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f4c88f46-4abf-4975-b03c-52d9be99a9e6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.828993 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.841952 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj6r9\" (UniqueName: \"kubernetes.io/projected/f4c88f46-4abf-4975-b03c-52d9be99a9e6-kube-api-access-cj6r9\") pod \"ovnkube-control-plane-749d76644c-rf4vx\" (UID: \"f4c88f46-4abf-4975-b03c-52d9be99a9e6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.843429 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.861275 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.872353 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.892955 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\"
,\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.905866 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:42Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.913025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:42 crc 
kubenswrapper[4743]: I1122 08:22:42.913067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.913080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.913099 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.913111 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:42Z","lastTransitionTime":"2025-11-22T08:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:42 crc kubenswrapper[4743]: I1122 08:22:42.955738 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" Nov 22 08:22:42 crc kubenswrapper[4743]: W1122 08:22:42.968558 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4c88f46_4abf_4975_b03c_52d9be99a9e6.slice/crio-ebfc78c1744bc442a39a16274bc1148258fb3853428dd4285016e7204cdf2299 WatchSource:0}: Error finding container ebfc78c1744bc442a39a16274bc1148258fb3853428dd4285016e7204cdf2299: Status 404 returned error can't find the container with id ebfc78c1744bc442a39a16274bc1148258fb3853428dd4285016e7204cdf2299 Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.015917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.015961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.015977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.015997 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.016011 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:43Z","lastTransitionTime":"2025-11-22T08:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.118876 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.118922 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.118936 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.118957 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.118972 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:43Z","lastTransitionTime":"2025-11-22T08:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.150907 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.151136 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.222153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.222217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.222243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.222274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.222296 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:43Z","lastTransitionTime":"2025-11-22T08:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.325513 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.325571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.325612 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.325635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.325649 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:43Z","lastTransitionTime":"2025-11-22T08:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.427862 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.427910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.427922 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.427940 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.427951 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:43Z","lastTransitionTime":"2025-11-22T08:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.475279 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" event={"ID":"f4c88f46-4abf-4975-b03c-52d9be99a9e6","Type":"ContainerStarted","Data":"ebfc78c1744bc442a39a16274bc1148258fb3853428dd4285016e7204cdf2299"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.531163 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.531205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.531216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.531234 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.531247 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:43Z","lastTransitionTime":"2025-11-22T08:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.633622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.633680 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.633693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.633720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.633735 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:43Z","lastTransitionTime":"2025-11-22T08:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.736407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.736514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.736534 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.736559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.736628 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:43Z","lastTransitionTime":"2025-11-22T08:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.746154 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4vkc4"] Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.746604 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.746670 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.764958 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.776836 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.793515 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ov
nkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.804133 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.815107 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.828055 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.831007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.831091 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr995\" (UniqueName: \"kubernetes.io/projected/8426c723-9bfa-4856-b445-b01251484a35-kube-api-access-gr995\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.839391 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.839468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.839528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.839567 4743 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.839630 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:43Z","lastTransitionTime":"2025-11-22T08:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.839759 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.851944 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.862725 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.881915 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\
"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.895110 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.917100 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.932222 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.932415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.932483 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.932527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.932634 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:22:59.932523035 +0000 UTC m=+53.638884147 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.932701 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.932755 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr995\" (UniqueName: \"kubernetes.io/projected/8426c723-9bfa-4856-b445-b01251484a35-kube-api-access-gr995\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.932780 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:59.932756002 +0000 UTC m=+53.639117084 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.932821 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.932884 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.932902 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.932923 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.932957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.933001 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:59.932981509 +0000 UTC m=+53.639342561 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.932828 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.933042 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs podName:8426c723-9bfa-4856-b445-b01251484a35 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:44.433033531 +0000 UTC m=+38.139394583 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs") pod "network-metrics-daemon-4vkc4" (UID: "8426c723-9bfa-4856-b445-b01251484a35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.933112 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.933121 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.933128 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.933149 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:59.933143854 +0000 UTC m=+53.639504906 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.933187 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:22:43 crc kubenswrapper[4743]: E1122 08:22:43.933283 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:59.933259438 +0000 UTC m=+53.639620650 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.938944 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.941969 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.942002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.942016 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.942044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.942057 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:43Z","lastTransitionTime":"2025-11-22T08:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.958361 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.980557 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:43 crc kubenswrapper[4743]: I1122 08:22:43.995918 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:43Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.020969 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b
5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"s
tartTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:44Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.045279 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.045353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.045372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.045397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.045418 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.148095 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.148130 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.148140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.148154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.148163 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.150709 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:44 crc kubenswrapper[4743]: E1122 08:22:44.150803 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.150924 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:44 crc kubenswrapper[4743]: E1122 08:22:44.151017 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.239404 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr995\" (UniqueName: \"kubernetes.io/projected/8426c723-9bfa-4856-b445-b01251484a35-kube-api-access-gr995\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.250469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.250517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.250526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.250542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.250554 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.352517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.352566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.352617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.352643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.352660 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.390879 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.390923 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.390932 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.390945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.390955 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: E1122 08:22:44.404417 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:44Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.407592 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.407620 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.407629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.407641 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.407651 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: E1122 08:22:44.423832 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:44Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.428452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.428480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.428488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.428502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.428511 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.439569 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:44 crc kubenswrapper[4743]: E1122 08:22:44.439746 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:22:44 crc kubenswrapper[4743]: E1122 08:22:44.439836 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs podName:8426c723-9bfa-4856-b445-b01251484a35 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:45.439811667 +0000 UTC m=+39.146172749 (durationBeforeRetry 1s). 
Nov 22 08:22:44 crc kubenswrapper[4743]: E1122 08:22:44.445334 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status payload identical to the previous attempt; elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:44Z is after 2025-08-24T17:21:41Z"
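
Every node-status patch in this log dies on the same TLS failure: the serving certificate of the node.network-node-identity.openshift.io webhook on 127.0.0.1:9743 expired on 2025-08-24, while the node clock reads 2025-11-22. One way to confirm what such an endpoint is serving is to pull its certificate without verification and compare notAfter against the current time. A diagnostic sketch, assuming the third-party cryptography package (>= 42 for the *_utc attributes) is available; this is not part of kubelet or OpenShift tooling:

import ssl
from datetime import datetime, timezone
from cryptography import x509  # third-party package, assumed installed

HOST, PORT = "127.0.0.1", 9743  # host and port from the webhook Post URL in the error above

# get_server_certificate() does not verify the peer, so it can still
# retrieve a certificate that would fail verification (e.g. an expired one).
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())
now = datetime.now(timezone.utc)
print("notBefore:", cert.not_valid_before_utc)
print("notAfter: ", cert.not_valid_after_utc)
print("EXPIRED" if now > cert.not_valid_after_utc else "still valid")
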
event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.449779 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.449803 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.449813 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: E1122 08:22:44.469932 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:44Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.475066 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.475116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.475134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.475159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.475186 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: E1122 08:22:44.488387 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:44Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:44 crc kubenswrapper[4743]: E1122 08:22:44.488516 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.490038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
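
The sequence above shows the kubelet's bounded in-cycle retry: several "Error updating node status, will retry" records, then "update node status exceeds retry count", after which nothing more happens until the next node-status sync interval. Schematically (the real logic is Go in kubelet_node_status.go; the retry count of 5 is an assumption based on kubelet's nodeStatusUpdateRetry default, not a value read from this log):

NODE_STATUS_UPDATE_RETRY = 5  # assumed kubelet default, not read from this log

def update_node_status(patch_status):
    """Try the status PATCH a bounded number of times, then give up until the next sync."""
    for _ in range(NODE_STATUS_UPDATE_RETRY):
        try:
            patch_status()  # PATCH the node status via the API server
            return          # success; nothing more to do this cycle
        except RuntimeError as err:
            print(f'"Error updating node status, will retry" err="{err}"')
    raise RuntimeError("update node status exceeds retry count")
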
event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.490101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.490120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.490147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.490166 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.593409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.593462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.593481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.593499 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.593508 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.696413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.696460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.696469 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.696488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.696497 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.799369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.799462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.799490 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.799529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.799556 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.903178 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.903235 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.903247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.903265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:44 crc kubenswrapper[4743]: I1122 08:22:44.903279 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:44Z","lastTransitionTime":"2025-11-22T08:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.006226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.006267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.006276 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.006292 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.006302 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:45Z","lastTransitionTime":"2025-11-22T08:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.108187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.108231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.108248 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.108268 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.108294 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:45Z","lastTransitionTime":"2025-11-22T08:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.151107 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.151118 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:45 crc kubenswrapper[4743]: E1122 08:22:45.151280 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:22:45 crc kubenswrapper[4743]: E1122 08:22:45.151382 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.210981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.211022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.211031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.211045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.211054 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:45Z","lastTransitionTime":"2025-11-22T08:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.313371 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.313437 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.313483 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.313542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.313559 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:45Z","lastTransitionTime":"2025-11-22T08:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.415674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.415718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.415728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.415746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.415757 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:45Z","lastTransitionTime":"2025-11-22T08:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.450711 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:45 crc kubenswrapper[4743]: E1122 08:22:45.450857 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:22:45 crc kubenswrapper[4743]: E1122 08:22:45.450931 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs podName:8426c723-9bfa-4856-b445-b01251484a35 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:47.450911074 +0000 UTC m=+41.157272126 (durationBeforeRetry 2s). 
Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.484171 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.485922 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da"}
Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.518089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.518123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.518138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.518157 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:22:45 crc kubenswrapper[4743]: I1122 08:22:45.518172 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:45Z","lastTransitionTime":"2025-11-22T08:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[The status-event sequence repeats at 08:22:45.620, 08:22:45.724, 08:22:45.826, and 08:22:45.929; elided.]
Has your network provider started?"} Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.032176 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.032221 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.032237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.032272 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.032285 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:46Z","lastTransitionTime":"2025-11-22T08:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.134979 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.135019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.135031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.135049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.135061 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:46Z","lastTransitionTime":"2025-11-22T08:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.151114 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:46 crc kubenswrapper[4743]: E1122 08:22:46.151247 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.151285 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:46 crc kubenswrapper[4743]: E1122 08:22:46.151501 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.238093 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.238151 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.238169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.238194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.238211 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:46Z","lastTransitionTime":"2025-11-22T08:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.340984 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.341040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.341053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.341072 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.341086 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:46Z","lastTransitionTime":"2025-11-22T08:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.444800 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.444873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.444895 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.444924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.444944 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:46Z","lastTransitionTime":"2025-11-22T08:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.547465 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.547516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.547524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.547537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.547547 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:46Z","lastTransitionTime":"2025-11-22T08:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.649394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.649421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.649429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.649442 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.649453 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:46Z","lastTransitionTime":"2025-11-22T08:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.753199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.753297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.753322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.753355 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.753380 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:46Z","lastTransitionTime":"2025-11-22T08:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.856335 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.856372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.856382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.856398 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.856409 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:46Z","lastTransitionTime":"2025-11-22T08:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.959471 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.959520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.959529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.959546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:46 crc kubenswrapper[4743]: I1122 08:22:46.959564 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:46Z","lastTransitionTime":"2025-11-22T08:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.063217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.063264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.063276 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.063294 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.063304 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:47Z","lastTransitionTime":"2025-11-22T08:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.151238 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.151308 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:47 crc kubenswrapper[4743]: E1122 08:22:47.151423 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:22:47 crc kubenswrapper[4743]: E1122 08:22:47.151661 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.170182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.170475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.170553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.170653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.170779 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:47Z","lastTransitionTime":"2025-11-22T08:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.172323 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.187464 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.204411 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.216561 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.235160 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\"
,\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.249023 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.261707 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.272954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:47 crc 
kubenswrapper[4743]: I1122 08:22:47.272993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.273002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.273018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.273028 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:47Z","lastTransitionTime":"2025-11-22T08:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.288131 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e43
5bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.303621 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.319898 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.331515 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.347058 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.361530 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.375783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.376227 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.376290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.376368 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.376378 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.376514 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:47Z","lastTransitionTime":"2025-11-22T08:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.393068 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.410930 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.426188 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:47Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.474255 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:47 crc kubenswrapper[4743]: E1122 08:22:47.474484 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:22:47 crc kubenswrapper[4743]: E1122 08:22:47.474609 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs podName:8426c723-9bfa-4856-b445-b01251484a35 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:51.474560372 +0000 UTC m=+45.180921464 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs") pod "network-metrics-daemon-4vkc4" (UID: "8426c723-9bfa-4856-b445-b01251484a35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.479933 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.480089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.480159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.480222 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.480280 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:47Z","lastTransitionTime":"2025-11-22T08:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.493767 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" event={"ID":"f4c88f46-4abf-4975-b03c-52d9be99a9e6","Type":"ContainerStarted","Data":"6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.582815 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.582868 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.582885 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.582910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.582926 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:47Z","lastTransitionTime":"2025-11-22T08:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.685829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.685873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.685884 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.685904 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.685916 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:47Z","lastTransitionTime":"2025-11-22T08:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.789056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.789126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.789147 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.789177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.789195 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:47Z","lastTransitionTime":"2025-11-22T08:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.891719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.891795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.891813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.892243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.892299 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:47Z","lastTransitionTime":"2025-11-22T08:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.996045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.996102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.996118 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.996144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:47 crc kubenswrapper[4743]: I1122 08:22:47.996162 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:47Z","lastTransitionTime":"2025-11-22T08:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.099210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.099283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.099307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.099340 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.099369 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:48Z","lastTransitionTime":"2025-11-22T08:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.150862 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.150898 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:48 crc kubenswrapper[4743]: E1122 08:22:48.150999 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:48 crc kubenswrapper[4743]: E1122 08:22:48.151116 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.204306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.204354 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.204367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.204391 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.204405 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:48Z","lastTransitionTime":"2025-11-22T08:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.280027 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.280265 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 08:22:48 crc kubenswrapper[4743]: E1122 08:22:48.280938 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c is running failed: container process not found" containerID="3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Nov 22 08:22:48 crc kubenswrapper[4743]: E1122 08:22:48.281469 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c is running failed: container process not found" containerID="3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Nov 22 08:22:48 crc kubenswrapper[4743]: E1122 08:22:48.281919 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c is running failed: container process not found" containerID="3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Nov 22 08:22:48 crc kubenswrapper[4743]: E1122 08:22:48.281955 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" 
containerName="ovnkube-controller" Nov 22 08:22:48 crc kubenswrapper[4743]: E1122 08:22:48.282241 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c is running failed: container process not found" containerID="3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Nov 22 08:22:48 crc kubenswrapper[4743]: E1122 08:22:48.282817 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c is running failed: container process not found" containerID="3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Nov 22 08:22:48 crc kubenswrapper[4743]: E1122 08:22:48.283136 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c is running failed: container process not found" containerID="3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c" cmd=["/bin/bash","-c","#!/bin/bash\ntest -f /etc/cni/net.d/10-ovn-kubernetes.conf\n"] Nov 22 08:22:48 crc kubenswrapper[4743]: E1122 08:22:48.283236 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.307635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.307690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.307702 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.307721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.307735 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:48Z","lastTransitionTime":"2025-11-22T08:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.410397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.410462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.410502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.410533 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.410647 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:48Z","lastTransitionTime":"2025-11-22T08:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.499520 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/0.log" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.503320 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c" exitCode=1 Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.503410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.503834 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.504452 4743 scope.go:117] "RemoveContainer" containerID="3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.514873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.514905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.514915 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.514933 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.514944 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:48Z","lastTransitionTime":"2025-11-22T08:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.530341 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.546125 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.560972 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.571744 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.600710 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.617847 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.617885 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.617897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.617916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.617928 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:48Z","lastTransitionTime":"2025-11-22T08:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.625828 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.643965 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.656766 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.668764 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.682388 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.696909 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageI
D\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.710235 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.720293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.720342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.720353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.720372 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.720384 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:48Z","lastTransitionTime":"2025-11-22T08:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.723445 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.733119 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.747569 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.756014 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.766701 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.777096 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.794212 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.804910 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.815543 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.822462 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.822499 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.822509 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.822524 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.822536 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:48Z","lastTransitionTime":"2025-11-22T08:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.827555 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.840466 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.852796 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.864973 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.876855 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.891403 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"
}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.910120 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termina
ted\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.924520 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.924647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.925252 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.925264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.925284 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.925301 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:48Z","lastTransitionTime":"2025-11-22T08:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.952106 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.968613 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:48 crc kubenswrapper[4743]: I1122 08:22:48.983041 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:48Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.002412 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:47Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:43.411319 6023 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 08:22:43.411360 6023 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 08:22:43.411376 6023 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:43.411382 6023 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:43.411397 6023 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:43.411396 6023 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 08:22:43.411415 6023 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:43.411424 6023 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 08:22:43.411431 6023 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 08:22:43.411434 6023 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 08:22:43.411443 6023 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:43.411487 6023 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 08:22:43.411531 6023 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 08:22:43.411532 6023 factory.go:656] Stopping watch factory\\\\nI1122 08:22:43.411550 6023 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.011852 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.027827 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.027867 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.027880 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.027898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.027909 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:49Z","lastTransitionTime":"2025-11-22T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.129931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.129968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.129977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.129994 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.130005 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:49Z","lastTransitionTime":"2025-11-22T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.150558 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.150604 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:49 crc kubenswrapper[4743]: E1122 08:22:49.150727 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:49 crc kubenswrapper[4743]: E1122 08:22:49.150829 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.232678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.232745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.232757 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.232778 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.232790 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:49Z","lastTransitionTime":"2025-11-22T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.335256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.335304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.335317 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.335359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.335375 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:49Z","lastTransitionTime":"2025-11-22T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.439000 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.439085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.439110 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.439140 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.439162 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:49Z","lastTransitionTime":"2025-11-22T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.511406 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/0.log" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.516535 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.517358 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.519776 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" event={"ID":"f4c88f46-4abf-4975-b03c-52d9be99a9e6","Type":"ContainerStarted","Data":"d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.539615 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.542683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.542768 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.542797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.542828 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.542851 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:49Z","lastTransitionTime":"2025-11-22T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.575601 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\
\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.593051 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.608790 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.622552 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.637886 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.645637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.645690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.645707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.645729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.645745 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:49Z","lastTransitionTime":"2025-11-22T08:22:49Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.653983 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.669623 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.686403 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85
146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.699114 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf
92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.713050 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.744191 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:47Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:43.411319 6023 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 08:22:43.411360 6023 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 08:22:43.411376 6023 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:43.411382 6023 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:43.411397 6023 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:43.411396 6023 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 08:22:43.411415 6023 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:43.411424 6023 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 08:22:43.411431 6023 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 08:22:43.411434 6023 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 08:22:43.411443 6023 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:43.411487 6023 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 08:22:43.411531 6023 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 08:22:43.411532 6023 factory.go:656] Stopping watch factory\\\\nI1122 08:22:43.411550 6023 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.747842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.747888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.747901 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.747919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.747931 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:49Z","lastTransitionTime":"2025-11-22T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.760787 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.783546 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.801260 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.818030 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.832037 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.847303 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.850087 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.850146 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.850164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.850188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.850203 4743 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:49Z","lastTransitionTime":"2025-11-22T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.873097 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f
8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.887041 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.899024 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.915748 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.933312 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.945726 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.953152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.953195 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.953205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.953221 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.953230 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:49Z","lastTransitionTime":"2025-11-22T08:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.959381 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.981998 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:49 crc kubenswrapper[4743]: I1122 08:22:49.998610 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:49Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.011363 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:50Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.031708 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ov
nkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:47Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:43.411319 6023 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 08:22:43.411360 6023 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 08:22:43.411376 6023 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:43.411382 6023 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:43.411397 6023 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:43.411396 6023 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 08:22:43.411415 6023 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:43.411424 6023 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 08:22:43.411431 6023 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 08:22:43.411434 6023 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 08:22:43.411443 6023 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:43.411487 6023 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 08:22:43.411531 6023 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 08:22:43.411532 6023 factory.go:656] Stopping watch 
factory\\\\nI1122 08:22:43.411550 6023 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:50Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.047027 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:50Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.055667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.055841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.055902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.055963 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.056039 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:50Z","lastTransitionTime":"2025-11-22T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.066540 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:50Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.084504 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:50Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.100159 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:50Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.110857 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-clus
ter-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:50Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.151245 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:50 crc kubenswrapper[4743]: E1122 08:22:50.151384 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.151643 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:50 crc kubenswrapper[4743]: E1122 08:22:50.151841 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.158132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.158231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.158243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.158259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.158267 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:50Z","lastTransitionTime":"2025-11-22T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.260966 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.261178 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.261965 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.262014 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.262031 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:50Z","lastTransitionTime":"2025-11-22T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.364117 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.364160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.364169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.364184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.364193 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:50Z","lastTransitionTime":"2025-11-22T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.467078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.467127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.467137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.467155 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.467170 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:50Z","lastTransitionTime":"2025-11-22T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.570053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.570092 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.570103 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.570125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.570136 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:50Z","lastTransitionTime":"2025-11-22T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.673313 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.673380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.673392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.673409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.673419 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:50Z","lastTransitionTime":"2025-11-22T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.776641 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.776693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.776710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.776734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.776753 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:50Z","lastTransitionTime":"2025-11-22T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.880460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.880549 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.880567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.880622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.880642 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:50Z","lastTransitionTime":"2025-11-22T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.983258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.983350 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.983364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.983400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:50 crc kubenswrapper[4743]: I1122 08:22:50.983420 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:50Z","lastTransitionTime":"2025-11-22T08:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.086234 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.086300 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.086310 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.086332 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.086343 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:51Z","lastTransitionTime":"2025-11-22T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.150732 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.151001 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:51 crc kubenswrapper[4743]: E1122 08:22:51.151206 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:51 crc kubenswrapper[4743]: E1122 08:22:51.151404 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.189507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.189610 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.189624 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.189650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.189663 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:51Z","lastTransitionTime":"2025-11-22T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.292596 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.292637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.292646 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.292660 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.292670 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:51Z","lastTransitionTime":"2025-11-22T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.396557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.396644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.396661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.396686 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.396705 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:51Z","lastTransitionTime":"2025-11-22T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.499880 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.499942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.499957 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.499988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.500010 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:51Z","lastTransitionTime":"2025-11-22T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.526996 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:51 crc kubenswrapper[4743]: E1122 08:22:51.527281 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:22:51 crc kubenswrapper[4743]: E1122 08:22:51.527363 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs podName:8426c723-9bfa-4856-b445-b01251484a35 nodeName:}" failed. No retries permitted until 2025-11-22 08:22:59.527338837 +0000 UTC m=+53.233699919 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs") pod "network-metrics-daemon-4vkc4" (UID: "8426c723-9bfa-4856-b445-b01251484a35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.602642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.602685 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.602696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.602716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.602725 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:51Z","lastTransitionTime":"2025-11-22T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.705448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.705521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.705546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.705617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.705641 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:51Z","lastTransitionTime":"2025-11-22T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.808447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.808513 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.808531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.808560 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.808605 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:51Z","lastTransitionTime":"2025-11-22T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.911406 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.911464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.911472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.911486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:51 crc kubenswrapper[4743]: I1122 08:22:51.911496 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:51Z","lastTransitionTime":"2025-11-22T08:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.013849 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.013897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.013906 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.013922 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.013935 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:52Z","lastTransitionTime":"2025-11-22T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.116692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.116732 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.116743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.116763 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.116777 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:52Z","lastTransitionTime":"2025-11-22T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.151300 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.151393 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:52 crc kubenswrapper[4743]: E1122 08:22:52.151467 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:52 crc kubenswrapper[4743]: E1122 08:22:52.151536 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.219764 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.219827 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.219847 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.219873 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.219892 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:52Z","lastTransitionTime":"2025-11-22T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.323102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.323156 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.323170 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.323188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.323199 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:52Z","lastTransitionTime":"2025-11-22T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.426561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.426676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.426696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.426725 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.426746 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:52Z","lastTransitionTime":"2025-11-22T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.529390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.529427 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.529435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.529448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.529457 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:52Z","lastTransitionTime":"2025-11-22T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.532550 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/1.log" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.533273 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/0.log" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.536156 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6" exitCode=1 Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.536217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.536264 4743 scope.go:117] "RemoveContainer" containerID="3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.537373 4743 scope.go:117] "RemoveContainer" containerID="c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6" Nov 22 08:22:52 crc kubenswrapper[4743]: E1122 08:22:52.537654 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.582242 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.602210 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.624228 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.633107 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.633153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.633164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.633182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.633191 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:52Z","lastTransitionTime":"2025-11-22T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.642816 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.663282 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.680535 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.708321 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.728104 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.737028 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.737109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.737124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.737144 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.737155 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:52Z","lastTransitionTime":"2025-11-22T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.745405 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.762474 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.777082 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.798803 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.811833 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.824174 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.842553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.842654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.842675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.842707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.842761 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:52Z","lastTransitionTime":"2025-11-22T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.844206 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.875303 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:47Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:43.411319 6023 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 08:22:43.411360 6023 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 08:22:43.411376 6023 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:43.411382 6023 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:43.411397 6023 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:43.411396 6023 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 08:22:43.411415 6023 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:43.411424 6023 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 08:22:43.411431 6023 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 08:22:43.411434 6023 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 08:22:43.411443 6023 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:43.411487 6023 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 08:22:43.411531 6023 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 08:22:43.411532 6023 factory.go:656] Stopping watch factory\\\\nI1122 08:22:43.411550 6023 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:51Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 08:22:50.479112 6281 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479170 6281 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479947 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:50.479985 6281 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:50.479993 6281 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:50.480031 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:50.480040 6281 factory.go:656] Stopping watch factory\\\\nI1122 08:22:50.480075 6281 ovnkube.go:599] Stopped ovnkube\\\\nI1122 08:22:50.480099 6281 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 08:22:50.480116 6281 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:50.480195 6281 handler.go:208] Removed *v1.Node event handler 7\\\\nF1122 08:22:50.480210 6281 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.886241 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:52Z is after 2025-08-24T17:21:41Z" Nov 22 
08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.945530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.945571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.945599 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.945616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:52 crc kubenswrapper[4743]: I1122 08:22:52.945629 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:52Z","lastTransitionTime":"2025-11-22T08:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.048840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.048906 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.048924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.048949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.048970 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:53Z","lastTransitionTime":"2025-11-22T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.150916 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.150973 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:53 crc kubenswrapper[4743]: E1122 08:22:53.151158 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:22:53 crc kubenswrapper[4743]: E1122 08:22:53.151315 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.152455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.152899 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.153058 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.153182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.153285 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:53Z","lastTransitionTime":"2025-11-22T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.257238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.257826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.257872 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.257897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.257909 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:53Z","lastTransitionTime":"2025-11-22T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.360778 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.360846 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.360865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.360891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.360908 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:53Z","lastTransitionTime":"2025-11-22T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.463557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.463643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.463660 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.463714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.463731 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:53Z","lastTransitionTime":"2025-11-22T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.541435 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/1.log" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.566633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.566717 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.566735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.566759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.566777 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:53Z","lastTransitionTime":"2025-11-22T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.670326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.670385 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.670397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.670432 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.670446 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:53Z","lastTransitionTime":"2025-11-22T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.773824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.773887 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.773904 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.773932 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.773950 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:53Z","lastTransitionTime":"2025-11-22T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.877981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.878183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.878209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.878284 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.878308 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:53Z","lastTransitionTime":"2025-11-22T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.981905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.981949 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.981977 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.981996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:53 crc kubenswrapper[4743]: I1122 08:22:53.982005 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:53Z","lastTransitionTime":"2025-11-22T08:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.084156 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.084200 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.084213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.084230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.084243 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.151024 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.151057 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:54 crc kubenswrapper[4743]: E1122 08:22:54.151170 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:54 crc kubenswrapper[4743]: E1122 08:22:54.151255 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.187018 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.187262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.187362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.187430 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.187509 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.289804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.289839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.289850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.289865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.289874 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.392558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.392643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.392657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.392680 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.392695 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.495904 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.495968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.495985 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.496012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.496032 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.579735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.579783 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.579792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.579811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.579824 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: E1122 08:22:54.601050 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:54Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.606098 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.606150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.606164 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.606190 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.606209 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: E1122 08:22:54.620664 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 22 08:22:54 crc kubenswrapper[4743]: E1122 08:22:54.620664 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"...\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:54Z is after 2025-08-24T17:21:41Z"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.626486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.626541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.626553 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.626570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.626603 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:22:54 crc kubenswrapper[4743]: E1122 08:22:54.640030 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"...\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:54Z is after 2025-08-24T17:21:41Z"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.643961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.644000 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.644013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.644034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.644051 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:22:54 crc kubenswrapper[4743]: E1122 08:22:54.660763 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"...\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:54Z is after 2025-08-24T17:21:41Z"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.664711 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.664751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.664766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.664790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.664805 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:22:54 crc kubenswrapper[4743]: E1122 08:22:54.676037 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"...\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:54Z is after 2025-08-24T17:21:41Z"
Nov 22 08:22:54 crc kubenswrapper[4743]: E1122 08:22:54.676269 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
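The sequence above is a bounded retry: five "Error updating node status, will retry" entries within roughly 70 ms, then kubelet_node_status.go:572 gives up with "update node status exceeds retry count" until the next sync tick. A self-contained sketch of that control flow; the constant 5 mirrors the attempt count visible in this log and the messages are copied from it, but the real kubelet logic differs in detail:

```go
// retrysketch.go - illustrative model of the retry pattern in this log,
// not the kubelet's actual implementation.
package main

import (
	"errors"
	"fmt"
)

// Assumption: five attempts, matching the five "will retry" entries above.
const nodeStatusUpdateRetry = 5

// Stand-in for the PATCH the kubelet sends; with an expired webhook
// certificate every attempt fails the same way.
func tryUpdateNodeStatus() error {
	return errors.New("failed to patch status: webhook certificate expired")
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		err := tryUpdateNodeStatus()
		if err == nil {
			return nil
		}
		fmt.Printf("Error updating node status, will retry: %v\n", err)
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		// Corresponds to the final kubelet_node_status.go:572 entry.
		fmt.Println("Unable to update node status:", err)
	}
}
```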
event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.678262 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.678271 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.678286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.678301 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.781488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.781567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.781635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.781669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.781696 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.885021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.885080 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.885098 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.885124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.885142 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.987996 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.988060 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.988077 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.988101 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:54 crc kubenswrapper[4743]: I1122 08:22:54.988119 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:54Z","lastTransitionTime":"2025-11-22T08:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.091498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.091554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.091570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.091625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.091653 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:55Z","lastTransitionTime":"2025-11-22T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.150940 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:55 crc kubenswrapper[4743]: E1122 08:22:55.151232 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.151415 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:55 crc kubenswrapper[4743]: E1122 08:22:55.151650 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.195122 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.195198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.195216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.195244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.195267 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:55Z","lastTransitionTime":"2025-11-22T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.298964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.299022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.299036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.299058 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.299073 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:55Z","lastTransitionTime":"2025-11-22T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.402047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.402112 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.402130 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.402158 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.402218 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:55Z","lastTransitionTime":"2025-11-22T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.505130 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.505199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.505210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.505226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.505238 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:55Z","lastTransitionTime":"2025-11-22T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.608597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.608663 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.608676 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.608701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.608714 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:55Z","lastTransitionTime":"2025-11-22T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.712119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.712180 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.712193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.712212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.712225 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:55Z","lastTransitionTime":"2025-11-22T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.815532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.815644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.815662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.815689 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.815707 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:55Z","lastTransitionTime":"2025-11-22T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.918566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.918662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.918674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.918693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:55 crc kubenswrapper[4743]: I1122 08:22:55.918712 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:55Z","lastTransitionTime":"2025-11-22T08:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.020813 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.020926 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.020951 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.020986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.021011 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:56Z","lastTransitionTime":"2025-11-22T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.123746 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.123807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.123826 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.123851 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.123870 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:56Z","lastTransitionTime":"2025-11-22T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.151095 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.151161 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:56 crc kubenswrapper[4743]: E1122 08:22:56.151266 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:56 crc kubenswrapper[4743]: E1122 08:22:56.151416 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.228438 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.228533 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.228549 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.228599 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.228627 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:56Z","lastTransitionTime":"2025-11-22T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.331882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.331930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.331942 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.331962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.331973 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:56Z","lastTransitionTime":"2025-11-22T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.435531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.435637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.435655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.435682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.435700 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:56Z","lastTransitionTime":"2025-11-22T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.538250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.538300 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.538314 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.538334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.538347 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:56Z","lastTransitionTime":"2025-11-22T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.641019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.641083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.641102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.641126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.641144 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:56Z","lastTransitionTime":"2025-11-22T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.744519 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.744623 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.744665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.744697 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.744719 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:56Z","lastTransitionTime":"2025-11-22T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.848947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.849016 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.849032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.849055 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.849070 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:56Z","lastTransitionTime":"2025-11-22T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.951661 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.951752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.951788 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.951821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:56 crc kubenswrapper[4743]: I1122 08:22:56.951846 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:56Z","lastTransitionTime":"2025-11-22T08:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.053918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.053966 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.053978 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.053995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.054005 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:57Z","lastTransitionTime":"2025-11-22T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.150626 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.150629 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:57 crc kubenswrapper[4743]: E1122 08:22:57.150909 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:57 crc kubenswrapper[4743]: E1122 08:22:57.151112 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.156687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.156721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.156731 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.156748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.156757 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:57Z","lastTransitionTime":"2025-11-22T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.171840 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.185825 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.207247 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:47Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:43.411319 6023 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 08:22:43.411360 6023 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 08:22:43.411376 6023 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:43.411382 6023 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:43.411397 6023 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:43.411396 6023 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 08:22:43.411415 6023 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:43.411424 6023 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 08:22:43.411431 6023 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 08:22:43.411434 6023 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 08:22:43.411443 6023 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:43.411487 6023 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 08:22:43.411531 6023 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 08:22:43.411532 6023 factory.go:656] Stopping watch factory\\\\nI1122 08:22:43.411550 6023 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:51Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 08:22:50.479112 6281 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479170 6281 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479947 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:50.479985 6281 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:50.479993 6281 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:50.480031 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:50.480040 6281 factory.go:656] Stopping watch factory\\\\nI1122 08:22:50.480075 6281 ovnkube.go:599] Stopped ovnkube\\\\nI1122 08:22:50.480099 6281 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 08:22:50.480116 6281 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:50.480195 6281 handler.go:208] Removed *v1.Node event handler 7\\\\nF1122 08:22:50.480210 6281 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.221666 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.241085 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.259186 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.259238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.259249 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.259266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.259279 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:57Z","lastTransitionTime":"2025-11-22T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.264033 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.279315 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 
08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.294567 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.305064 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.325117 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.346410 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.363238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.363319 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.363334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.363355 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.363368 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:57Z","lastTransitionTime":"2025-11-22T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.364830 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.381353 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.393499 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.405888 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.419712 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.434953 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:57Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.465608 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.465651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.465678 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.465692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.465702 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:57Z","lastTransitionTime":"2025-11-22T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.568338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.568436 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.568455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.568510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.568530 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:57Z","lastTransitionTime":"2025-11-22T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.673321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.673397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.673414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.673443 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.673465 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:57Z","lastTransitionTime":"2025-11-22T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.776334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.776420 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.776439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.776482 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.776523 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:57Z","lastTransitionTime":"2025-11-22T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.879547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.879629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.879639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.879654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.879666 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:57Z","lastTransitionTime":"2025-11-22T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.982493 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.982568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.982629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.982662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:57 crc kubenswrapper[4743]: I1122 08:22:57.982684 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:57Z","lastTransitionTime":"2025-11-22T08:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.086305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.086359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.086374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.086400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.086421 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:58Z","lastTransitionTime":"2025-11-22T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.150948 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.150994 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:22:58 crc kubenswrapper[4743]: E1122 08:22:58.151160 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:22:58 crc kubenswrapper[4743]: E1122 08:22:58.151340 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.190794 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.190861 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.190883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.190914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.190936 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:58Z","lastTransitionTime":"2025-11-22T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.294540 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.294607 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.294629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.294651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.294664 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:58Z","lastTransitionTime":"2025-11-22T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.397058 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.397105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.397114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.397130 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.397139 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:58Z","lastTransitionTime":"2025-11-22T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.500178 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.500230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.500245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.500267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.500285 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:58Z","lastTransitionTime":"2025-11-22T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.603285 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.603339 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.603353 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.603376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.603396 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:58Z","lastTransitionTime":"2025-11-22T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.705839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.705878 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.705888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.705905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.705915 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:58Z","lastTransitionTime":"2025-11-22T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.809789 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.809838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.809850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.809866 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.809908 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:58Z","lastTransitionTime":"2025-11-22T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.884874 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.897243 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:58Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.907067 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:58Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.913635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.913733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.913758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.913791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.913821 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:58Z","lastTransitionTime":"2025-11-22T08:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.924332 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:58Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.935361 4743 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67c
d110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:58Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.946727 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:58Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.959446 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:58Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.969057 4743 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:58Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:58 crc kubenswrapper[4743]: I1122 08:22:58.984116 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:58Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.000001 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:58Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.012471 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:59Z is after 2025-08-24T17:21:41Z"
Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.016397 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.016531 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.016816 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.016919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.017003 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:59Z","lastTransitionTime":"2025-11-22T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.024678 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:59Z is after 2025-08-24T17:21:41Z"
Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.054081 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:47Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:43.411319 6023 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 08:22:43.411360 6023 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 08:22:43.411376 6023 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:43.411382 6023 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:43.411397 6023 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:43.411396 6023 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 08:22:43.411415 6023 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:43.411424 6023 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 08:22:43.411431 6023 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 08:22:43.411434 6023 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 08:22:43.411443 6023 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:43.411487 6023 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 08:22:43.411531 6023 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 08:22:43.411532 6023 factory.go:656] Stopping watch factory\\\\nI1122 08:22:43.411550 6023 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:51Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 08:22:50.479112 6281 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479170 6281 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479947 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:50.479985 6281 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:50.479993 6281 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:50.480031 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:50.480040 6281 factory.go:656] Stopping watch factory\\\\nI1122 08:22:50.480075 6281 ovnkube.go:599] Stopped ovnkube\\\\nI1122 08:22:50.480099 6281 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 08:22:50.480116 6281 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:50.480195 6281 handler.go:208] Removed *v1.Node event handler 7\\\\nF1122 08:22:50.480210 6281 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:59Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.067544 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:59Z is after 2025-08-24T17:21:41Z" Nov 22 
08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.085142 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:59Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.097674 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:59Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.107603 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:59Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.116347 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:22:59Z is after 2025-08-24T17:21:41Z" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.119383 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.119432 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.119446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.119465 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.119477 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:59Z","lastTransitionTime":"2025-11-22T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.151267 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.151408 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:59 crc kubenswrapper[4743]: E1122 08:22:59.151429 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:22:59 crc kubenswrapper[4743]: E1122 08:22:59.151709 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.222451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.222523 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.222547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.222621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.222647 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:59Z","lastTransitionTime":"2025-11-22T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.325021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.325063 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.325071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.325088 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.325100 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:59Z","lastTransitionTime":"2025-11-22T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.427999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.428061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.428079 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.428107 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.428125 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:59Z","lastTransitionTime":"2025-11-22T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.531500 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.531606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.531633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.531664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.531683 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:59Z","lastTransitionTime":"2025-11-22T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.623055 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:22:59 crc kubenswrapper[4743]: E1122 08:22:59.623343 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:22:59 crc kubenswrapper[4743]: E1122 08:22:59.623466 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs podName:8426c723-9bfa-4856-b445-b01251484a35 nodeName:}" failed. No retries permitted until 2025-11-22 08:23:15.623428834 +0000 UTC m=+69.329789926 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs") pod "network-metrics-daemon-4vkc4" (UID: "8426c723-9bfa-4856-b445-b01251484a35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.634690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.634742 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.634760 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.634784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.634798 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:59Z","lastTransitionTime":"2025-11-22T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.737725 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.737790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.737807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.737836 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.737853 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:59Z","lastTransitionTime":"2025-11-22T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.841188 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.841247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.841264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.841290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.841309 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:59Z","lastTransitionTime":"2025-11-22T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.944291 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.944361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.944382 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.944411 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:22:59 crc kubenswrapper[4743]: I1122 08:22:59.944434 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:22:59Z","lastTransitionTime":"2025-11-22T08:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.028163 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028344 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:23:32.028306041 +0000 UTC m=+85.734667143 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.028417 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.028506 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.028629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028668 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.028686 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028720 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:23:32.028709113 +0000 UTC m=+85.735070165 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028760 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028800 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028823 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028822 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028833 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028860 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028877 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 08:23:32.028864278 +0000 UTC m=+85.735225340 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028884 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028915 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:23:32.028892129 +0000 UTC m=+85.735253221 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.028953 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 08:23:32.02893125 +0000 UTC m=+85.735292342 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.047272 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.047361 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.047385 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.047416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.047439 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:00Z","lastTransitionTime":"2025-11-22T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.149843 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.149874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.149883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.149898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.149908 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:00Z","lastTransitionTime":"2025-11-22T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.151195 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.151200 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.151410 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:00 crc kubenswrapper[4743]: E1122 08:23:00.151289 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.252841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.252911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.252934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.252964 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.252989 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:00Z","lastTransitionTime":"2025-11-22T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.355823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.355882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.355898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.355926 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.355943 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:00Z","lastTransitionTime":"2025-11-22T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.458943 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.458999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.459007 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.459022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.459035 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:00Z","lastTransitionTime":"2025-11-22T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.562421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.562478 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.562489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.562510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.562526 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:00Z","lastTransitionTime":"2025-11-22T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.666810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.666898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.666925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.666954 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.666975 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:00Z","lastTransitionTime":"2025-11-22T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.771007 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.771100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.771318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.771350 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.771373 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:00Z","lastTransitionTime":"2025-11-22T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.863327 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.874639 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.874703 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.874738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.874754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.874775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.874791 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:00Z","lastTransitionTime":"2025-11-22T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.876974 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z" Nov 22 
08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.896046 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.908156 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.918860 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.927485 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.936415 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.949798 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.960738 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.973201 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z"
Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.976681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.976727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.976740 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.976757 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.976771 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:00Z","lastTransitionTime":"2025-11-22T08:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.985388 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:00 crc kubenswrapper[4743]: I1122 08:23:00.997088 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:00Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.014674 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3878cd7f806a85808f7bd0125ffa8077426d65ffd54a0677804c7915af3d3d8c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:47Z\\\",\\\"message\\\":\\\"sip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:43.411319 6023 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1122 08:22:43.411360 6023 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1122 08:22:43.411376 6023 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:43.411382 6023 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:43.411397 6023 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:43.411396 6023 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1122 08:22:43.411415 6023 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:43.411424 6023 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1122 08:22:43.411431 6023 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1122 08:22:43.411434 6023 handler.go:208] Removed *v1.Node event handler 7\\\\nI1122 08:22:43.411443 6023 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:43.411487 6023 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1122 08:22:43.411531 6023 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1122 08:22:43.411532 6023 factory.go:656] Stopping watch factory\\\\nI1122 08:22:43.411550 6023 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:51Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 08:22:50.479112 6281 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479170 6281 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479947 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:50.479985 6281 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:50.479993 6281 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:50.480031 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:50.480040 6281 factory.go:656] Stopping watch factory\\\\nI1122 08:22:50.480075 6281 ovnkube.go:599] Stopped ovnkube\\\\nI1122 08:22:50.480099 6281 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 08:22:50.480116 6281 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:50.480195 6281 handler.go:208] Removed *v1.Node event handler 7\\\\nF1122 08:22:50.480210 6281 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:01Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.023927 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:01Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.036734 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:01Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.049761 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:01Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.063255 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:01Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.073961 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:23:01Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.078652 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.078713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.078725 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.078742 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.078754 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:01Z","lastTransitionTime":"2025-11-22T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.150762 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.150786 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:01 crc kubenswrapper[4743]: E1122 08:23:01.150933 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:01 crc kubenswrapper[4743]: E1122 08:23:01.151055 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.181634 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.181680 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.181690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.181707 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.181719 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:01Z","lastTransitionTime":"2025-11-22T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.284197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.284245 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.284260 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.284278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.284291 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:01Z","lastTransitionTime":"2025-11-22T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.386496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.386567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.386601 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.386619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.386631 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:01Z","lastTransitionTime":"2025-11-22T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.490417 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.490464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.490472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.490489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.490501 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:01Z","lastTransitionTime":"2025-11-22T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.592296 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.592358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.592371 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.592417 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.592431 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:01Z","lastTransitionTime":"2025-11-22T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.694754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.694811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.694823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.694843 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.694854 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:01Z","lastTransitionTime":"2025-11-22T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.797409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.797459 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.797473 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.797496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.797509 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:01Z","lastTransitionTime":"2025-11-22T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.900367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.900422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.900433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.900453 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:01 crc kubenswrapper[4743]: I1122 08:23:01.900465 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:01Z","lastTransitionTime":"2025-11-22T08:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.003516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.003622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.003641 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.003670 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.003765 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:02Z","lastTransitionTime":"2025-11-22T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.106819 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.106894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.106911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.106938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.106957 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:02Z","lastTransitionTime":"2025-11-22T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.151381 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:02 crc kubenswrapper[4743]: E1122 08:23:02.151685 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.151815 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:02 crc kubenswrapper[4743]: E1122 08:23:02.152167 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.210463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.210501 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.210511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.210527 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.210536 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:02Z","lastTransitionTime":"2025-11-22T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.312472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.312520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.312529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.312545 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.312556 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:02Z","lastTransitionTime":"2025-11-22T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.415563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.415694 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.415778 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.415815 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.415837 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:02Z","lastTransitionTime":"2025-11-22T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.519310 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.519373 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.519389 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.519412 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.519429 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:02Z","lastTransitionTime":"2025-11-22T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.622441 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.622532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.622555 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.622626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.622653 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:02Z","lastTransitionTime":"2025-11-22T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.725712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.725776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.725797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.725823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.725844 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:02Z","lastTransitionTime":"2025-11-22T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.829050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.829092 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.829103 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.829120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.829133 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:02Z","lastTransitionTime":"2025-11-22T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.931852 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.931905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.932065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.932127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:02 crc kubenswrapper[4743]: I1122 08:23:02.932153 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:02Z","lastTransitionTime":"2025-11-22T08:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.035917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.036002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.036019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.036048 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.036069 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:03Z","lastTransitionTime":"2025-11-22T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.138054 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.138084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.138092 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.138106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.138116 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:03Z","lastTransitionTime":"2025-11-22T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.151870 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.151934 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:03 crc kubenswrapper[4743]: E1122 08:23:03.152119 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:03 crc kubenswrapper[4743]: E1122 08:23:03.152274 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.240701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.240757 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.240774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.240796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.240810 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:03Z","lastTransitionTime":"2025-11-22T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.343925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.343985 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.343994 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.344013 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.344024 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:03Z","lastTransitionTime":"2025-11-22T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.446278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.446340 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.446360 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.446392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.446417 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:03Z","lastTransitionTime":"2025-11-22T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.548972 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.549037 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.549061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.549091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.549112 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:03Z","lastTransitionTime":"2025-11-22T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.652119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.652169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.652189 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.652399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.652417 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:03Z","lastTransitionTime":"2025-11-22T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.755776 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.755837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.755855 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.755881 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.755900 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:03Z","lastTransitionTime":"2025-11-22T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.859154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.859225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.859237 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.859256 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.859266 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:03Z","lastTransitionTime":"2025-11-22T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.962312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.962415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.962431 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.962454 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:03 crc kubenswrapper[4743]: I1122 08:23:03.962477 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:03Z","lastTransitionTime":"2025-11-22T08:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.065327 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.065377 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.065390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.065410 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.065423 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.151510 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.151791 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.152141 4743 scope.go:117] "RemoveContainer" containerID="c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6" Nov 22 08:23:04 crc kubenswrapper[4743]: E1122 08:23:04.152307 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:04 crc kubenswrapper[4743]: E1122 08:23:04.152410 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.167561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.167636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.167647 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.167666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.167678 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.167780 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.182427 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.194068 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.216141 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.231967 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.244510 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.255492 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.268383 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.270169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.270305 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.270416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.270543 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.270662 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.281081 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.296101 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.309140 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.322301 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 
2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.336074 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.346447 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.367880 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:51Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 08:22:50.479112 6281 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479170 6281 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479947 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:50.479985 6281 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:50.479993 6281 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:50.480031 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:50.480040 6281 factory.go:656] Stopping watch factory\\\\nI1122 08:22:50.480075 6281 ovnkube.go:599] Stopped ovnkube\\\\nI1122 08:22:50.480099 6281 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 08:22:50.480116 6281 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:50.480195 6281 handler.go:208] Removed *v1.Node event handler 7\\\\nF1122 08:22:50.480210 6281 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.372973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.373028 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.373038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.373054 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.373064 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.379675 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.391884 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.408252 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 
08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.476002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.476047 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.476061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.476079 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.476090 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.578821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.578866 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.578879 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.578899 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.578912 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.583331 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/1.log" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.586332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.586790 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.624070 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-cer
ts\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.638092 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.654922 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.668008 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.681290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.681551 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.681657 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.681766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.681942 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.685917 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.694480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.694520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.694530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.694547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.694558 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.704114 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: E1122 08:23:04.706435 4743 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329b
a568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\
\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.709273 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.709310 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.709322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.709338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.709348 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.716590 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: E1122 08:23:04.719484 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.722537 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.722592 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.722603 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.722621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.722632 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.728525 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: E1122 08:23:04.737204 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b
3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.740991 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.741025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.741034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.741050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.741059 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.745908 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: E1122 08:23:04.752298 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.756762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.756800 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.756809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.756825 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.756806 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.756834 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: E1122 08:23:04.767411 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: E1122 08:23:04.767591 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.769216 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.783918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.783961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.783974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.783993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.784009 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.784413 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\
":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.797034 4743 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.808702 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.821194 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.841680 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece15
35d8aceef776908df1996cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:51Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 08:22:50.479112 6281 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479170 6281 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479947 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:50.479985 6281 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:50.479993 6281 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:50.480031 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:50.480040 6281 factory.go:656] Stopping watch factory\\\\nI1122 08:22:50.480075 6281 ovnkube.go:599] Stopped ovnkube\\\\nI1122 08:22:50.480099 6281 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 08:22:50.480116 6281 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:50.480195 6281 handler.go:208] Removed *v1.Node event handler 7\\\\nF1122 08:22:50.480210 6281 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.850794 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.862147 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:04Z is after 2025-08-24T17:21:41Z" Nov 22 
08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.886855 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.886897 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.886909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.886925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.886937 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.988887 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.988919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.988930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.988944 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:04 crc kubenswrapper[4743]: I1122 08:23:04.988954 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:04Z","lastTransitionTime":"2025-11-22T08:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.092204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.092238 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.092250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.092264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.092275 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:05Z","lastTransitionTime":"2025-11-22T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.150805 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 08:23:05 crc kubenswrapper[4743]: E1122 08:23:05.150974 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.151054 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4"
Nov 22 08:23:05 crc kubenswrapper[4743]: E1122 08:23:05.151221 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.195375 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.195419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.195428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.195446 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.195460 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:05Z","lastTransitionTime":"2025-11-22T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.298272 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.298329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.298345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.298367 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.298383 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:05Z","lastTransitionTime":"2025-11-22T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.401974 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.402031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.402050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.402075 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.402097 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:05Z","lastTransitionTime":"2025-11-22T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.504721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.504788 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.504807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.504833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.504852 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:05Z","lastTransitionTime":"2025-11-22T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.592813 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/2.log"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.594192 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/1.log"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.598650 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5" exitCode=1
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.598729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5"}
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.598784 4743 scope.go:117] "RemoveContainer" containerID="c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.600332 4743 scope.go:117] "RemoveContainer" containerID="5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5"
Nov 22 08:23:05 crc kubenswrapper[4743]: E1122 08:23:05.600785 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.607259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.607315 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.607355 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.607379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.607398 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:05Z","lastTransitionTime":"2025-11-22T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.620267 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.637126 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.660051 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f
2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\
\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.679894 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.696469 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.709342 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.710741 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.710772 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.710781 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.710797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.710807 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:05Z","lastTransitionTime":"2025-11-22T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.720956 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.732055 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.747557 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.760814 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.774810 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.785402 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.803728 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c73e72850316a61dcb8b0d416c8e45e3e6e0861a5dc3e3b74cb5d40a45e5f9d6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:22:51Z\\\",\\\"message\\\":\\\"eflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1122 08:22:50.479112 6281 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479170 6281 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:22:50.479947 6281 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1122 08:22:50.479985 6281 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1122 08:22:50.479993 6281 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1122 08:22:50.480031 6281 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1122 08:22:50.480040 6281 factory.go:656] Stopping watch factory\\\\nI1122 08:22:50.480075 6281 ovnkube.go:599] Stopped ovnkube\\\\nI1122 08:22:50.480099 6281 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1122 08:22:50.480116 6281 handler.go:208] Removed *v1.Node event handler 2\\\\nI1122 08:22:50.480195 6281 handler.go:208] Removed *v1.Node event handler 7\\\\nF1122 08:22:50.480210 6281 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:05Z\\\",\\\"message\\\":\\\"237 6462 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx in node crc\\\\nI1122 08:23:05.035302 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1122 08:23:05.035305 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx after 0 failed attempt(s)\\\\nI1122 08:23:05.035313 6462 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx\\\\nI1122 08:23:05.035315 6462 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1122 08:23:05.035302 6462 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:23:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.813625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.813662 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.813695 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.813712 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.813724 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:05Z","lastTransitionTime":"2025-11-22T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.814177 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.843241 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.856658 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.871059 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.881943 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:05Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.916601 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.916655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.916667 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.916687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:05 crc kubenswrapper[4743]: I1122 08:23:05.916699 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:05Z","lastTransitionTime":"2025-11-22T08:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.019568 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.019643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.019659 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.019680 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.019696 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:06Z","lastTransitionTime":"2025-11-22T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.122980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.123037 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.123048 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.123065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.123078 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:06Z","lastTransitionTime":"2025-11-22T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.151414 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.151477 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:06 crc kubenswrapper[4743]: E1122 08:23:06.151673 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:06 crc kubenswrapper[4743]: E1122 08:23:06.151847 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.226097 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.226161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.226176 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.226199 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.226213 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:06Z","lastTransitionTime":"2025-11-22T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.329095 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.329161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.329173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.329231 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.329246 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:06Z","lastTransitionTime":"2025-11-22T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.432381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.432423 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.432433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.432451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.432461 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:06Z","lastTransitionTime":"2025-11-22T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.534971 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.535043 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.535056 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.535076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.535090 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:06Z","lastTransitionTime":"2025-11-22T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.605536 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/2.log" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.610643 4743 scope.go:117] "RemoveContainer" containerID="5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5" Nov 22 08:23:06 crc kubenswrapper[4743]: E1122 08:23:06.610859 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.626106 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.638363 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.638422 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.638433 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.638453 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.638466 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:06Z","lastTransitionTime":"2025-11-22T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.645373 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.667178 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.679519 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.691733 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.708987 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:05Z\\\",\\\"message\\\":\\\"237 6462 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx in node crc\\\\nI1122 08:23:05.035302 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1122 08:23:05.035305 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx after 0 failed attempt(s)\\\\nI1122 08:23:05.035313 6462 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx\\\\nI1122 08:23:05.035315 6462 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1122 08:23:05.035302 6462 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:23:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.721011 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.740959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.741002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.741012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.741031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.741041 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:06Z","lastTransitionTime":"2025-11-22T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.742138 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.757251 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.770049 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.780305 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.793661 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.803186 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.816724 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f7
7ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.828266 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3c
a001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.842072 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.843765 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.843802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.843814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.843830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.843841 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:06Z","lastTransitionTime":"2025-11-22T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.856352 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.868539 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:06Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.946957 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.947011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.947027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.947046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:06 crc kubenswrapper[4743]: I1122 08:23:06.947058 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:06Z","lastTransitionTime":"2025-11-22T08:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.050474 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.050519 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.050529 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.050547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.050559 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:07Z","lastTransitionTime":"2025-11-22T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.151765 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.151884 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:07 crc kubenswrapper[4743]: E1122 08:23:07.151930 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:07 crc kubenswrapper[4743]: E1122 08:23:07.152112 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.154275 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.154350 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.154362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.154409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.154421 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:07Z","lastTransitionTime":"2025-11-22T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.179613 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.196606 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.214214 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.225817 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\
\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.236477 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca579
3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.252373 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.256040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.256065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.256075 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.256089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.256100 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:07Z","lastTransitionTime":"2025-11-22T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.269623 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.289333 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.301924 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.311775 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.327109 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z"
Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.340663 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.353123 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.359381 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.359412 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.359421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.359435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.359445 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:07Z","lastTransitionTime":"2025-11-22T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.368686 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.377878 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.397784 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:05Z\\\",\\\"message\\\":\\\"237 6462 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx in node crc\\\\nI1122 08:23:05.035302 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1122 08:23:05.035305 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx after 0 failed attempt(s)\\\\nI1122 08:23:05.035313 6462 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx\\\\nI1122 08:23:05.035315 6462 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1122 08:23:05.035302 6462 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:23:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z"
Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.410207 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.422246 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:07Z is after 2025-08-24T17:21:41Z" Nov 22 
08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.461258 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.461300 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.461311 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.461329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.461340 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:07Z","lastTransitionTime":"2025-11-22T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.564078 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.564126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.564168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.564190 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.564207 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:07Z","lastTransitionTime":"2025-11-22T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.667036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.667095 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.667111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.667135 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.667153 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:07Z","lastTransitionTime":"2025-11-22T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.770002 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.770051 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.770063 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.770084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.770096 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:07Z","lastTransitionTime":"2025-11-22T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.872701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.872771 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.872788 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.872820 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.872839 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:07Z","lastTransitionTime":"2025-11-22T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.975460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.975514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.975528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.975549 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:07 crc kubenswrapper[4743]: I1122 08:23:07.975565 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:07Z","lastTransitionTime":"2025-11-22T08:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.078930 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.078988 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.079011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.079034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.079051 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:08Z","lastTransitionTime":"2025-11-22T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.151158 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:08 crc kubenswrapper[4743]: E1122 08:23:08.151419 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.151924 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:08 crc kubenswrapper[4743]: E1122 08:23:08.152133 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.181880 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.181938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.181952 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.181976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.181991 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:08Z","lastTransitionTime":"2025-11-22T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.284299 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.284916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.285003 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.285113 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.285212 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:08Z","lastTransitionTime":"2025-11-22T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.389384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.389441 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.389455 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.389477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.389491 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:08Z","lastTransitionTime":"2025-11-22T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.491823 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.492040 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.492102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.492182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.492240 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:08Z","lastTransitionTime":"2025-11-22T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.594323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.594368 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.594380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.594399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.594411 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:08Z","lastTransitionTime":"2025-11-22T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.696812 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.696892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.696905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.696922 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.696931 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:08Z","lastTransitionTime":"2025-11-22T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.799108 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.799177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.799189 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.799206 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.799216 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:08Z","lastTransitionTime":"2025-11-22T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.902102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.902169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.902187 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.902214 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:08 crc kubenswrapper[4743]: I1122 08:23:08.902239 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:08Z","lastTransitionTime":"2025-11-22T08:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.004938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.004995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.005011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.005031 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.005043 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:09Z","lastTransitionTime":"2025-11-22T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.108914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.108976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.108990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.109015 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.109032 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:09Z","lastTransitionTime":"2025-11-22T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.150923 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.151060 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:09 crc kubenswrapper[4743]: E1122 08:23:09.151198 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:09 crc kubenswrapper[4743]: E1122 08:23:09.151322 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.212048 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.212094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.212108 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.212130 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.212147 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:09Z","lastTransitionTime":"2025-11-22T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.313901 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.313948 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.313960 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.313975 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.313986 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:09Z","lastTransitionTime":"2025-11-22T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.416340 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.416404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.416415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.416434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.416447 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:09Z","lastTransitionTime":"2025-11-22T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.519822 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.519874 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.519888 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.519909 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.519924 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:09Z","lastTransitionTime":"2025-11-22T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.621865 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.621908 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.621916 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.621931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.621940 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:09Z","lastTransitionTime":"2025-11-22T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.724046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.724096 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.724106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.724123 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.724135 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:09Z","lastTransitionTime":"2025-11-22T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.826751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.826786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.826795 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.826810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.826823 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:09Z","lastTransitionTime":"2025-11-22T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.929294 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.929342 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.929351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.929369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:09 crc kubenswrapper[4743]: I1122 08:23:09.929379 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:09Z","lastTransitionTime":"2025-11-22T08:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.031627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.031672 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.031704 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.031721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.031731 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:10Z","lastTransitionTime":"2025-11-22T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.134019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.134061 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.134076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.134100 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.134115 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:10Z","lastTransitionTime":"2025-11-22T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.150524 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.150534 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:10 crc kubenswrapper[4743]: E1122 08:23:10.150680 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:10 crc kubenswrapper[4743]: E1122 08:23:10.150790 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.237244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.237314 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.237333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.237364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.237382 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:10Z","lastTransitionTime":"2025-11-22T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.340109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.340158 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.340177 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.340197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.340209 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:10Z","lastTransitionTime":"2025-11-22T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.443416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.443463 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.443473 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.443489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.443500 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:10Z","lastTransitionTime":"2025-11-22T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.546091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.546127 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.546137 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.546153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.546163 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:10Z","lastTransitionTime":"2025-11-22T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.649283 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.649329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.649339 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.649364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.649383 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:10Z","lastTransitionTime":"2025-11-22T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.752380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.752424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.752435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.752454 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.752469 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:10Z","lastTransitionTime":"2025-11-22T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.856486 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.857015 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.857034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.857062 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.857081 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:10Z","lastTransitionTime":"2025-11-22T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.959682 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.959727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.959735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.959749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:10 crc kubenswrapper[4743]: I1122 08:23:10.959759 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:10Z","lastTransitionTime":"2025-11-22T08:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.062722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.062788 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.062802 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.062821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.062836 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:11Z","lastTransitionTime":"2025-11-22T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.151852 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.151965 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:11 crc kubenswrapper[4743]: E1122 08:23:11.152055 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:11 crc kubenswrapper[4743]: E1122 08:23:11.152110 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.165027 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.165069 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.165079 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.165094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.165105 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:11Z","lastTransitionTime":"2025-11-22T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.267754 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.267811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.267829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.267849 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.267863 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:11Z","lastTransitionTime":"2025-11-22T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.370696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.370734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.370742 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.370759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.370771 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:11Z","lastTransitionTime":"2025-11-22T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.472777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.472824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.472834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.472849 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.472859 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:11Z","lastTransitionTime":"2025-11-22T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.575439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.575522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.575532 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.575547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.575557 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:11Z","lastTransitionTime":"2025-11-22T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.679507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.679554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.679566 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.679599 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.679611 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:11Z","lastTransitionTime":"2025-11-22T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.782535 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.782616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.782629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.782650 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.782662 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:11Z","lastTransitionTime":"2025-11-22T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.884940 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.884982 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.884993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.885009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.885019 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:11Z","lastTransitionTime":"2025-11-22T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.989982 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.990035 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.990044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.990060 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:11 crc kubenswrapper[4743]: I1122 08:23:11.990072 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:11Z","lastTransitionTime":"2025-11-22T08:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.092970 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.093021 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.093029 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.093046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.093058 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:12Z","lastTransitionTime":"2025-11-22T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.150757 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.150822 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 08:23:12 crc kubenswrapper[4743]: E1122 08:23:12.150980 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
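The condition={...} payload that setters.go:603 prints each cycle is a serialized NodeCondition. The sketch below rebuilds that JSON with a locally declared struct that mirrors the k8s.io/api/core/v1.NodeCondition field layout, so it runs without the Kubernetes modules; the struct and the print format are illustrative, not kubelet code.

```go
// nodecondition.go: reproduces the condition={...} payload from setters.go:603.
// The struct mirrors the JSON shape of k8s.io/api/core/v1.NodeCondition but is
// declared locally so this sketch compiles stand-alone.
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Timestamp taken from one of the log entries above.
	now := time.Date(2025, 11, 22, 8, 23, 11, 0, time.UTC).Format(time.RFC3339)
	cond := NodeCondition{
		Type:               "Ready",
		Status:             "False",
		LastHeartbeatTime:  now,
		LastTransitionTime: now,
		Reason:             "KubeletNotReady",
		Message: "container runtime network not ready: NetworkReady=false " +
			"reason:NetworkPluginNotReady message:Network plugin returns error: " +
			"no CNI configuration file in /etc/kubernetes/cni/net.d/. " +
			"Has your network provider started?",
	}
	b, err := json.Marshal(cond)
	if err != nil {
		panic(err)
	}
	fmt.Printf("node=%q condition=%s\n", "crc", b)
}
```

Because the struct fields are declared in the same order as the logged keys, the marshaled output matches the condition objects above byte for byte apart from the timestamps.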
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:12 crc kubenswrapper[4743]: E1122 08:23:12.151130 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.195558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.195624 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.195636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.195655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.195667 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:12Z","lastTransitionTime":"2025-11-22T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.298522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.298644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.298658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.298683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.298696 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:12Z","lastTransitionTime":"2025-11-22T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.401467 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.401510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.401521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.401538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.401549 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:12Z","lastTransitionTime":"2025-11-22T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.504377 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.504423 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.504435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.504482 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.504495 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:12Z","lastTransitionTime":"2025-11-22T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.607183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.607227 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.607241 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.607259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.607271 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:12Z","lastTransitionTime":"2025-11-22T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.710178 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.710233 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.710246 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.710267 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.710279 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:12Z","lastTransitionTime":"2025-11-22T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.814244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.814293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.814301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.814330 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.814341 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:12Z","lastTransitionTime":"2025-11-22T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.916845 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.916883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.916894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.916911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:12 crc kubenswrapper[4743]: I1122 08:23:12.916928 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:12Z","lastTransitionTime":"2025-11-22T08:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.019198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.019240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.019250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.019268 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.019281 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:13Z","lastTransitionTime":"2025-11-22T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.122247 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.122341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.122366 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.122394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.122413 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:13Z","lastTransitionTime":"2025-11-22T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.151094 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.151218 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:13 crc kubenswrapper[4743]: E1122 08:23:13.151643 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
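Every entry above follows the klog header layout: a severity letter fused with the MMDD date, the wall-clock time, the PID, the emitting source file and line, and then the structured message. A rough Go splitter for that layout is below; the regexp and field names are this sketch's assumptions, not klog's own parser.

```go
// klogparse.go: splits one kubenswrapper payload into klog header fields.
// Header layout assumed from the entries above: severity letter + MMDD,
// wall-clock time, PID, source file:line, then the structured message.
package main

import (
	"fmt"
	"regexp"
)

var klogHdr = regexp.MustCompile(
	`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+) ([\w./-]+:\d+)\] (.*)$`)

func main() {
	line := `E1122 08:23:13.151643 4743 pod_workers.go:1301] "Error syncing pod, skipping" pod="openshift-multus/network-metrics-daemon-4vkc4"`
	m := klogHdr.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog-style line")
		return
	}
	fmt.Println("severity: ", m[1]) // I=info, W=warning, E=error, F=fatal
	fmt.Println("date MMDD:", m[2])
	fmt.Println("time:     ", m[3])
	fmt.Println("pid:      ", m[4])
	fmt.Println("location: ", m[5])
	fmt.Println("message:  ", m[6])
}
```

Splitting on this header is how the I (info) heartbeat entries can be separated from the E (error) pod-sync entries when skimming a capture like this one.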
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:13 crc kubenswrapper[4743]: E1122 08:23:13.151771 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.161059 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.227715 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.227757 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.227798 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.227816 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.227828 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:13Z","lastTransitionTime":"2025-11-22T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.330751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.330797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.330808 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.330825 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.330838 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:13Z","lastTransitionTime":"2025-11-22T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.433945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.433995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.434006 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.434025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.434037 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:13Z","lastTransitionTime":"2025-11-22T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.537266 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.537340 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.537365 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.537398 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.537419 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:13Z","lastTransitionTime":"2025-11-22T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.639508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.639543 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.639554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.639567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.639592 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:13Z","lastTransitionTime":"2025-11-22T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.741863 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.741902 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.741910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.741925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.741936 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:13Z","lastTransitionTime":"2025-11-22T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.844255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.844329 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.844341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.844363 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.844391 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:13Z","lastTransitionTime":"2025-11-22T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.947564 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.947654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.947664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.947681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:13 crc kubenswrapper[4743]: I1122 08:23:13.947692 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:13Z","lastTransitionTime":"2025-11-22T08:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.053406 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.053473 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.053485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.053503 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.053529 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.150942 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.151011 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:14 crc kubenswrapper[4743]: E1122 08:23:14.151144 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:14 crc kubenswrapper[4743]: E1122 08:23:14.151234 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
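The single "SyncLoop ADD" source="api" entry above records a pod spec arriving on the kubelet's update channel and being dispatched for syncing. A schematic channel-driven loop in the same spirit is below; PodUpdate and the function names are invented for illustration and are not the kubelet's real types.

```go
// syncloop.go: minimal illustration of a SyncLoop-style dispatcher.
// PodUpdate, the channel plumbing, and the print format imitate the
// "SyncLoop ADD" entry above; none of this is the kubelet's actual code.
package main

import "fmt"

type PodUpdate struct {
	Op     string   // "ADD", "UPDATE", "REMOVE", ...
	Source string   // origin of the update, e.g. "api"
	Pods   []string // namespace/name identifiers
}

func syncLoop(updates <-chan PodUpdate) {
	for u := range updates {
		// Log the operation before handing the pods to per-pod workers.
		fmt.Printf("SyncLoop %s source=%q pods=%q\n", u.Op, u.Source, u.Pods)
	}
}

func main() {
	ch := make(chan PodUpdate, 1)
	ch <- PodUpdate{
		Op:     "ADD",
		Source: "api",
		Pods:   []string{"openshift-machine-config-operator/kube-rbac-proxy-crio-crc"},
	}
	close(ch)
	syncLoop(ch)
}
```

In the capture, the per-pod workers that such a loop feeds are exactly the ones emitting the pod_workers.go:1301 "Error syncing pod, skipping" entries while the network stays down.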
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.156243 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.156328 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.156341 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.156378 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.156397 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.258608 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.258680 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.258696 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.258714 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.258726 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.361710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.361755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.361767 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.361786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.361805 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.464644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.464704 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.464729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.464761 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.464775 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.567971 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.568022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.568032 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.568051 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.568062 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.670913 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.670947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.670956 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.670969 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.670978 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.773297 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.773335 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.773347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.773366 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.773378 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.876597 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.876655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.876668 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.876692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.876706 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.904743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.904797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.904811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.904831 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.904841 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: E1122 08:23:14.917483 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:14Z is after 2025-08-24T17:21:41Z"
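The patch above is rejected before it ever reaches the Node object: the node.network-node-identity.openshift.io webhook's serving certificate expired on 2025-08-24T17:21:41Z, months before the current time in the log. The sketch below reproduces the x509 validity decision quoted in the error; the PEM path is hypothetical, since on the node the certificate would be presented by the listener at https://127.0.0.1:9743.

```go
// certcheck.go: reproduces the "certificate has expired or is not yet valid"
// decision from the failed webhook call above. The PEM path is hypothetical.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	raw, err := os.ReadFile("/tmp/webhook-serving-cert.pem") // hypothetical path
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	block, _ := pem.Decode(raw)
	if block == nil {
		fmt.Fprintln(os.Stderr, "no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	now := time.Now().UTC()
	switch {
	case now.After(cert.NotAfter):
		// Matches the log: "current time ... is after 2025-08-24T17:21:41Z".
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("certificate is not yet valid before %s\n",
			cert.NotBefore.Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}
```

Until the webhook's certificate is rotated, every node status patch will fail the same way, which is why the kubelet immediately retries below.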
event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.921700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.921722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.921732 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: E1122 08:23:14.936528 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:14Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.940827 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.940862 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
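
The x509 failure above is the root cause of the retry loop: the serving certificate behind the "node.network-node-identity.openshift.io" webhook at 127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, roughly three months before the current time in the log. A minimal Go sketch that dials the endpoint and prints the certificate's validity window; this is an illustrative diagnostic assumed to run on the node itself, not part of any OpenShift tooling:

    // certcheck.go: print the validity window of the certificate served at
    // the webhook endpoint named in the log. InsecureSkipVerify is deliberate:
    // the point is to read the expired certificate, not to trust it.
    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        // First peer certificate is the leaf the kubelet failed to verify.
        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Println("subject:  ", cert.Subject)
        fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
        fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
        fmt.Println("expired:  ", time.Now().After(cert.NotAfter))
    }
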
event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.940875 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.940914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.940927 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: E1122 08:23:14.957995 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:14Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.962390 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.962464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
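
Every "Node became not ready" record above carries the same underlying condition: NetworkReady=false because no CNI configuration file exists in /etc/kubernetes/cni/net.d/. A short Go sketch of the same check, assuming access to the node's filesystem; the file extensions mirror common CNI conventions and this is not CRI-O's actual implementation:

    // cnicheck.go: report whether the CNI config directory named in the
    // log contains any network definitions.
    package main

    import (
        "fmt"
        "log"
        "os"
        "path/filepath"
    )

    func main() {
        const dir = "/etc/kubernetes/cni/net.d"
        entries, err := os.ReadDir(dir)
        if err != nil {
            log.Fatalf("cannot read %s: %v", dir, err)
        }
        found := 0
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
                found++
            }
        }
        if found == 0 {
            fmt.Println("no CNI configuration files: the network plugin has not written its config yet")
        }
    }
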
event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.962477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.962503 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.962519 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: E1122 08:23:14.975101 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:14Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.980068 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.980139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
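
The body of each failed PATCH is a Kubernetes strategic merge patch: the "$setElementOrder/conditions" directive pins the order of the conditions list, while "conditions" itself carries the entries being updated, merged by their "type" key. A minimal Go sketch that marshals the same shape, abbreviated to one condition for readability:

    // patchshape.go: skeleton of the node-status patch that appears,
    // fully expanded, in the errors above.
    package main

    import (
        "encoding/json"
        "fmt"
    )

    func main() {
        patch := map[string]any{
            "status": map[string]any{
                // Strategic-merge-patch directive: keep the list in this order.
                "$setElementOrder/conditions": []map[string]string{
                    {"type": "MemoryPressure"},
                    {"type": "DiskPressure"},
                    {"type": "PIDPressure"},
                    {"type": "Ready"},
                },
                // Entries merged by their "type" key.
                "conditions": []map[string]string{
                    {"type": "Ready", "status": "False", "reason": "KubeletNotReady"},
                },
            },
        }
        out, err := json.MarshalIndent(patch, "", "  ")
        if err != nil {
            panic(err)
        }
        fmt.Println(string(out))
    }
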
event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.980150 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.980171 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.980199 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:14 crc kubenswrapper[4743]: E1122 08:23:14.995642 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:14Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:14 crc kubenswrapper[4743]: E1122 08:23:14.995830 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.998069 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
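
Note the shape of the failure: a run of "Error updating node status, will retry" records, then "update node status exceeds retry count". That matches a bounded retry of the status update (the kubelet caps attempts with a small constant, nodeStatusUpdateRetry). A schematic Go loop with the same give-up behavior; this illustrates the pattern and is not the kubelet's actual code:

    // retry.go: schematic bounded retry around a status update.
    package main

    import (
        "errors"
        "fmt"
    )

    const nodeStatusUpdateRetry = 5 // attempts before giving up

    // tryUpdateNodeStatus stands in for the PATCH that the
    // expired-certificate webhook rejects in the log above.
    func tryUpdateNodeStatus() error {
        return errors.New("failed calling webhook: certificate has expired")
    }

    func main() {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            if err := tryUpdateNodeStatus(); err != nil {
                fmt.Println("Error updating node status, will retry:", err)
                continue
            }
            return
        }
        fmt.Println("Unable to update node status: update node status exceeds retry count")
    }
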
event="NodeHasSufficientMemory" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.998119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.998129 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.998154 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:14 crc kubenswrapper[4743]: I1122 08:23:14.998166 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:14Z","lastTransitionTime":"2025-11-22T08:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.152676 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:15 crc kubenswrapper[4743]: E1122 08:23:15.152830 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.153480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.153538 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.153562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.153709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.153727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.153741 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:15Z","lastTransitionTime":"2025-11-22T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:15 crc kubenswrapper[4743]: E1122 08:23:15.153771 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.263024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.263076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.263086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.263104 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.263117 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:15Z","lastTransitionTime":"2025-11-22T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.365841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.365883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.365894 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.365910 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.365920 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:15Z","lastTransitionTime":"2025-11-22T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.469082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.469142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.469157 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.469179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.469195 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:15Z","lastTransitionTime":"2025-11-22T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.572630 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.572693 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.572706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.572727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.572740 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:15Z","lastTransitionTime":"2025-11-22T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.659597 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:15 crc kubenswrapper[4743]: E1122 08:23:15.659799 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:23:15 crc kubenswrapper[4743]: E1122 08:23:15.659921 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs podName:8426c723-9bfa-4856-b445-b01251484a35 nodeName:}" failed. No retries permitted until 2025-11-22 08:23:47.659890616 +0000 UTC m=+101.366251708 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs") pod "network-metrics-daemon-4vkc4" (UID: "8426c723-9bfa-4856-b445-b01251484a35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.676439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.676495 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.676507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.676527 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.676539 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:15Z","lastTransitionTime":"2025-11-22T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.779161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.779194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.779204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.779217 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.779227 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:15Z","lastTransitionTime":"2025-11-22T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.881799 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.881853 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.881863 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.881882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.881893 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:15Z","lastTransitionTime":"2025-11-22T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.986286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.986337 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.986351 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.986370 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:15 crc kubenswrapper[4743]: I1122 08:23:15.986385 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:15Z","lastTransitionTime":"2025-11-22T08:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.089762 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.089811 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.089821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.089838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.089848 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:16Z","lastTransitionTime":"2025-11-22T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.151264 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.151387 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:16 crc kubenswrapper[4743]: E1122 08:23:16.151473 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:16 crc kubenswrapper[4743]: E1122 08:23:16.151895 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.193227 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.193550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.193687 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.193925 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.194038 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:16Z","lastTransitionTime":"2025-11-22T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.296983 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.297039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.297053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.297076 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.297090 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:16Z","lastTransitionTime":"2025-11-22T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.399841 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.399883 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.399892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.399912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.399922 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:16Z","lastTransitionTime":"2025-11-22T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.502761 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.503528 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.503629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.503721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.503789 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:16Z","lastTransitionTime":"2025-11-22T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.606919 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.607374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.607468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.607593 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.607672 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:16Z","lastTransitionTime":"2025-11-22T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.711395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.711451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.711460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.711480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.711491 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:16Z","lastTransitionTime":"2025-11-22T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.814152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.814204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.814213 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.814230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.814245 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:16Z","lastTransitionTime":"2025-11-22T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.917637 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.918012 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.918087 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.918173 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:16 crc kubenswrapper[4743]: I1122 08:23:16.918251 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:16Z","lastTransitionTime":"2025-11-22T08:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.021161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.021212 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.021225 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.021244 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.021258 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:17Z","lastTransitionTime":"2025-11-22T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.123728 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.123793 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.123807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.124034 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.124046 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:17Z","lastTransitionTime":"2025-11-22T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.151213 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.151208 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:17 crc kubenswrapper[4743]: E1122 08:23:17.151773 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:17 crc kubenswrapper[4743]: E1122 08:23:17.151932 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.162475 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.181045 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"C
ompleted\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\
\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.191886 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8
e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.202512 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32a39ce8-25e5-4b37-84d7-21027cdb228b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f89923f4bf5b5ce2ce85146d7c472421f1dbc5b8d20103bfd00de9599c2c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.214861 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.227039 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.227153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.227774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.228035 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.228204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.228354 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:17Z","lastTransitionTime":"2025-11-22T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.238120 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.251971 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.267369 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.281433 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.295622 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.306634 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.327479 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa4
1ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:05Z\\\",\\\"message\\\":\\\"237 6462 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx in node crc\\\\nI1122 08:23:05.035302 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1122 08:23:05.035305 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx after 0 failed attempt(s)\\\\nI1122 08:23:05.035313 6462 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx\\\\nI1122 08:23:05.035315 6462 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1122 08:23:05.035302 6462 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:23:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.332070 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.332124 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.332138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.332160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.332176 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:17Z","lastTransitionTime":"2025-11-22T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.338568 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.350171 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 
08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.368693 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.380678 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.391320 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.401010 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.434081 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.434118 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.434130 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.434165 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.434178 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:17Z","lastTransitionTime":"2025-11-22T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.536545 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.537109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.537120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.537135 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.537144 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:17Z","lastTransitionTime":"2025-11-22T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.639941 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.639980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.639990 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.640005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.640016 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:17Z","lastTransitionTime":"2025-11-22T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.647834 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cbpnf_a1de4b47-eed0-431f-a7a9-a944ce8791bd/kube-multus/0.log" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.647879 4743 generic.go:334] "Generic (PLEG): container finished" podID="a1de4b47-eed0-431f-a7a9-a944ce8791bd" containerID="42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b" exitCode=1 Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.647908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cbpnf" event={"ID":"a1de4b47-eed0-431f-a7a9-a944ce8791bd","Type":"ContainerDied","Data":"42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.648267 4743 scope.go:117] "RemoveContainer" containerID="42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.663451 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.676101 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482
919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.688024 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.703040 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:17Z\\\",\\\"message\\\":\\\"2025-11-22T08:22:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_17bf3948-35f9-4255-8597-4844fba11d77\\\\n2025-11-22T08:22:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_17bf3948-35f9-4255-8597-4844fba11d77 to /host/opt/cni/bin/\\\\n2025-11-22T08:22:32Z [verbose] multus-daemon started\\\\n2025-11-22T08:22:32Z [verbose] Readiness Indicator file check\\\\n2025-11-22T08:23:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.716468 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.735663 4743 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.748489 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.748517 4743 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.748526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.748542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.748553 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:17Z","lastTransitionTime":"2025-11-22T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.751223 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.763046 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32a39ce8-25e5-4b37-84d7-21027cdb228b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f89923f4bf5b5ce2ce85146d7c472421f1dbc5b8d20103bfd00de9599c2c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.778890 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.792232 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.816470 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:05Z\\\",\\\"message\\\":\\\"237 6462 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx in node crc\\\\nI1122 08:23:05.035302 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1122 08:23:05.035305 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx after 0 failed attempt(s)\\\\nI1122 08:23:05.035313 6462 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx\\\\nI1122 08:23:05.035315 6462 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1122 08:23:05.035302 6462 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:23:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.831909 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.849049 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.851653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.851702 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.851720 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.851744 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.851761 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:17Z","lastTransitionTime":"2025-11-22T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.876679 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.889935 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 
08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.904938 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.915973 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.937387 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.952009 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:17Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.953674 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.953710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.953719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.953734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:17 crc kubenswrapper[4743]: I1122 08:23:17.953747 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:17Z","lastTransitionTime":"2025-11-22T08:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.056749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.056796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.056808 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.056829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.056842 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:18Z","lastTransitionTime":"2025-11-22T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.151449 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:18 crc kubenswrapper[4743]: E1122 08:23:18.151595 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.151770 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:18 crc kubenswrapper[4743]: E1122 08:23:18.151814 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.158688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.158724 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.158736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.158775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.158786 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:18Z","lastTransitionTime":"2025-11-22T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.260434 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.260479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.260491 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.260508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.260520 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:18Z","lastTransitionTime":"2025-11-22T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.363567 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.363628 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.363639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.363654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.363664 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:18Z","lastTransitionTime":"2025-11-22T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.466071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.466116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.466125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.466139 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.466149 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:18Z","lastTransitionTime":"2025-11-22T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.569039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.569109 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.569122 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.569138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.569149 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:18Z","lastTransitionTime":"2025-11-22T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.652470 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cbpnf_a1de4b47-eed0-431f-a7a9-a944ce8791bd/kube-multus/0.log" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.652529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cbpnf" event={"ID":"a1de4b47-eed0-431f-a7a9-a944ce8791bd","Type":"ContainerStarted","Data":"d69946320b9db9ab4b35189efa616676972d055242413aa37ff4b1ae5b7af00d"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.666370 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.671328 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.671365 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.671376 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.671395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.671408 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:18Z","lastTransitionTime":"2025-11-22T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.680652 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.692652 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.705718 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69946320b9db9ab4b35189efa616676972d055242413aa37ff4b1ae5b7af00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:17Z\\\",\\\"message\\\":\\\"2025-11-22T08:22:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_17bf3948-35f9-4255-8597-4844fba11d77\\\\n2025-11-22T08:22:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_17bf3948-35f9-4255-8597-4844fba11d77 to /host/opt/cni/bin/\\\\n2025-11-22T08:22:32Z [verbose] multus-daemon started\\\\n2025-11-22T08:22:32Z [verbose] Readiness Indicator file check\\\\n2025-11-22T08:23:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.718012 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.733654 4743 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.743637 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.753092 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32a39ce8-25e5-4b37-84d7-21027cdb228b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f89923f4bf5b5ce2ce85146d7c472421f1dbc5b8d20103bfd00de9599c2c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.765361 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.774120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.774160 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.774170 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.774186 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.774199 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:18Z","lastTransitionTime":"2025-11-22T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.775023 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.795099 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:05Z\\\",\\\"message\\\":\\\"237 6462 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx in node crc\\\\nI1122 08:23:05.035302 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1122 08:23:05.035305 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx after 0 failed attempt(s)\\\\nI1122 08:23:05.035313 6462 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx\\\\nI1122 08:23:05.035315 6462 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1122 08:23:05.035302 6462 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:23:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.806566 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.820889 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.834688 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.846500 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 
08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.860274 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.870310 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.877700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.877766 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.877777 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.877793 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.877803 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:18Z","lastTransitionTime":"2025-11-22T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.890228 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.902325 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:18Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.980450 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.980496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.980506 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.980521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:18 crc kubenswrapper[4743]: I1122 08:23:18.980530 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:18Z","lastTransitionTime":"2025-11-22T08:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.082896 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.082980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.083006 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.083039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.083067 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:19Z","lastTransitionTime":"2025-11-22T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.151055 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.151113 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:19 crc kubenswrapper[4743]: E1122 08:23:19.151288 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:19 crc kubenswrapper[4743]: E1122 08:23:19.151359 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.186773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.186820 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.186830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.186855 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.186868 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:19Z","lastTransitionTime":"2025-11-22T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.289312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.289638 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.289736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.289833 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.289910 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:19Z","lastTransitionTime":"2025-11-22T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.391999 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.392049 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.392058 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.392074 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.392086 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:19Z","lastTransitionTime":"2025-11-22T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.494995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.495037 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.495050 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.495067 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.495080 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:19Z","lastTransitionTime":"2025-11-22T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.597347 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.597388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.597399 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.597416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.597428 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:19Z","lastTransitionTime":"2025-11-22T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.699526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.699563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.699593 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.699610 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.699621 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:19Z","lastTransitionTime":"2025-11-22T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.802294 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.802613 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.802727 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.802810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.802897 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:19Z","lastTransitionTime":"2025-11-22T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.908379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.908420 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.908428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.908445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:19 crc kubenswrapper[4743]: I1122 08:23:19.908455 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:19Z","lastTransitionTime":"2025-11-22T08:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.010958 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.011008 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.011020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.011038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.011048 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:20Z","lastTransitionTime":"2025-11-22T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.113248 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.113285 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.113294 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.113307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.113316 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:20Z","lastTransitionTime":"2025-11-22T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.150780 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.151063 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:20 crc kubenswrapper[4743]: E1122 08:23:20.151160 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.151364 4743 scope.go:117] "RemoveContainer" containerID="5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5" Nov 22 08:23:20 crc kubenswrapper[4743]: E1122 08:23:20.151376 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:20 crc kubenswrapper[4743]: E1122 08:23:20.151486 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.215565 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.215633 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.215648 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.215671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.215685 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:20Z","lastTransitionTime":"2025-11-22T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.317934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.317971 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.317980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.317995 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.318005 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:20Z","lastTransitionTime":"2025-11-22T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.420449 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.420484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.420494 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.420508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.420521 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:20Z","lastTransitionTime":"2025-11-22T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.523289 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.523333 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.523345 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.523364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.523379 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:20Z","lastTransitionTime":"2025-11-22T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.626721 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.626775 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.626796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.626816 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.626830 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:20Z","lastTransitionTime":"2025-11-22T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.728701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.728743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.728756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.728772 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.728787 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:20Z","lastTransitionTime":"2025-11-22T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.830632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.830666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.830675 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.830691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.830703 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:20Z","lastTransitionTime":"2025-11-22T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.933945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.934008 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.934019 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.934045 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:20 crc kubenswrapper[4743]: I1122 08:23:20.934059 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:20Z","lastTransitionTime":"2025-11-22T08:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.035632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.035679 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.035690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.035706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.035718 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:21Z","lastTransitionTime":"2025-11-22T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.138255 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.138290 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.138301 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.138318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.138331 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:21Z","lastTransitionTime":"2025-11-22T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.150843 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:21 crc kubenswrapper[4743]: E1122 08:23:21.150950 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.150852 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:21 crc kubenswrapper[4743]: E1122 08:23:21.151073 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.241115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.241153 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.241162 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.241176 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.241185 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:21Z","lastTransitionTime":"2025-11-22T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.343749 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.343804 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.343814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.343831 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.343842 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:21Z","lastTransitionTime":"2025-11-22T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.446470 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.446559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.446616 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.446642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.446660 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:21Z","lastTransitionTime":"2025-11-22T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.554621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.554701 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.554718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.554738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.554751 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:21Z","lastTransitionTime":"2025-11-22T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.656961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.657008 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.657020 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.657036 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.657047 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:21Z","lastTransitionTime":"2025-11-22T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.760046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.760094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.760106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.760126 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.760141 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:21Z","lastTransitionTime":"2025-11-22T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.862343 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.862481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.862499 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.862516 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.862540 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:21Z","lastTransitionTime":"2025-11-22T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.965104 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.965169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.965181 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.965198 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:21 crc kubenswrapper[4743]: I1122 08:23:21.965209 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:21Z","lastTransitionTime":"2025-11-22T08:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.067655 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.068007 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.068159 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.068312 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.068456 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:22Z","lastTransitionTime":"2025-11-22T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.151311 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:22 crc kubenswrapper[4743]: E1122 08:23:22.151480 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.151718 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:22 crc kubenswrapper[4743]: E1122 08:23:22.151787 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.171542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.171606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.171625 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.171646 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.171661 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:22Z","lastTransitionTime":"2025-11-22T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.273640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.273688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.273700 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.273718 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.273731 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:22Z","lastTransitionTime":"2025-11-22T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.376681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.376729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.376743 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.376761 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.376773 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:22Z","lastTransitionTime":"2025-11-22T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.479122 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.479168 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.479179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.479194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.479204 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:22Z","lastTransitionTime":"2025-11-22T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.582011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.582068 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.582082 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.582104 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.582119 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:22Z","lastTransitionTime":"2025-11-22T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.700630 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.700698 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.700716 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.700742 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.700760 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:22Z","lastTransitionTime":"2025-11-22T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.803839 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.803912 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.803929 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.803961 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.803981 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:22Z","lastTransitionTime":"2025-11-22T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.907326 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.907369 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.907379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.907396 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:22 crc kubenswrapper[4743]: I1122 08:23:22.907406 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:22Z","lastTransitionTime":"2025-11-22T08:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.009952 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.010011 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.010024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.010042 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.010056 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:23Z","lastTransitionTime":"2025-11-22T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.112792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.113052 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.113183 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.113268 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.113348 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:23Z","lastTransitionTime":"2025-11-22T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.150750 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.150829 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:23 crc kubenswrapper[4743]: E1122 08:23:23.150983 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:23 crc kubenswrapper[4743]: E1122 08:23:23.151129 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.217315 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.217393 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.217416 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.217448 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.217467 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:23Z","lastTransitionTime":"2025-11-22T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.321210 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.321276 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.321295 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.321318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.321339 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:23Z","lastTransitionTime":"2025-11-22T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.423781 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.423830 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.423842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.423860 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.423875 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:23Z","lastTransitionTime":"2025-11-22T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.527226 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.527274 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.527287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.527306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.527320 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:23Z","lastTransitionTime":"2025-11-22T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.630468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.630542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.630562 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.630644 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.630669 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:23Z","lastTransitionTime":"2025-11-22T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.733719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.733810 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.733829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.733857 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.733878 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:23Z","lastTransitionTime":"2025-11-22T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.836454 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.836507 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.836526 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.836561 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.836602 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:23Z","lastTransitionTime":"2025-11-22T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.943772 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.943824 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.943837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.943856 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:23 crc kubenswrapper[4743]: I1122 08:23:23.943871 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:23Z","lastTransitionTime":"2025-11-22T08:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.046132 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.046179 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.046189 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.046209 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.046219 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:24Z","lastTransitionTime":"2025-11-22T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.149261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.149307 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.149317 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.149335 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.149346 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:24Z","lastTransitionTime":"2025-11-22T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.151607 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:24 crc kubenswrapper[4743]: E1122 08:23:24.151727 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.151609 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:24 crc kubenswrapper[4743]: E1122 08:23:24.151944 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.252037 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.252105 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.252116 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.252131 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.252141 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:24Z","lastTransitionTime":"2025-11-22T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.354891 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.354959 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.354975 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.354992 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.355027 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:24Z","lastTransitionTime":"2025-11-22T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.458250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.458298 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.458308 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.458324 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.458336 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:24Z","lastTransitionTime":"2025-11-22T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.561584 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.561619 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.561635 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.561654 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.561665 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:24Z","lastTransitionTime":"2025-11-22T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.664358 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.664428 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.664447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.664475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.664495 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:24Z","lastTransitionTime":"2025-11-22T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.767855 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.767905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.767918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.767934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.767945 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:24Z","lastTransitionTime":"2025-11-22T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.871182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.871270 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.871293 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.871328 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.871350 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:24Z","lastTransitionTime":"2025-11-22T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.975338 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.975404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.975421 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.975445 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:24 crc kubenswrapper[4743]: I1122 08:23:24.975465 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:24Z","lastTransitionTime":"2025-11-22T08:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.078648 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.078691 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.078722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.078739 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.078750 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.120424 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.120496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.120517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.120543 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.120563 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: E1122 08:23:25.136839 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:25Z is after 
2025-08-24T17:21:41Z" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.142414 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.142472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.142485 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.142508 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.142528 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.151365 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:25 crc kubenswrapper[4743]: E1122 08:23:25.151513 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.151666 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:25 crc kubenswrapper[4743]: E1122 08:23:25.151866 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:25 crc kubenswrapper[4743]: E1122 08:23:25.163531 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:25Z is after 
2025-08-24T17:21:41Z" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.168356 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.168391 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.168402 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.168418 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.168432 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: E1122 08:23:25.182374 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:25Z is after 
2025-08-24T17:21:41Z" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.185980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.186053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.186094 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.186115 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.186128 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: E1122 08:23:25.234744 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:25Z is after 
2025-08-24T17:21:41Z" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.238875 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.238918 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.238929 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.238947 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.238961 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: E1122 08:23:25.253026 4743 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3d59bcbc-87c5-44a9-8766-f99eaa2bbc9f\\\",\\\"systemUUID\\\":\\\"b3ab2120-2923-4414-bbef-16ed8728100f\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:25Z is after 
2025-08-24T17:21:41Z" Nov 22 08:23:25 crc kubenswrapper[4743]: E1122 08:23:25.253245 4743 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.254875 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.254920 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.254931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.254950 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.254961 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.358122 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.358460 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.358570 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.358709 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.358790 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.464651 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.464735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.464761 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.464792 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.464825 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.567480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.567797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.567962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.568098 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.568264 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.671038 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.671323 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.671403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.671542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.671633 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.774617 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.775259 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.775384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.775596 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.775703 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.877807 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.877898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.877927 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.877998 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.878025 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.980896 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.981223 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.981322 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.981404 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:25 crc kubenswrapper[4743]: I1122 08:23:25.981498 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:25Z","lastTransitionTime":"2025-11-22T08:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.083538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.083604 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.083614 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.083632 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.083647 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:26Z","lastTransitionTime":"2025-11-22T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.150938 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:26 crc kubenswrapper[4743]: E1122 08:23:26.151113 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.150938 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:26 crc kubenswrapper[4743]: E1122 08:23:26.151498 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.186492 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.186522 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.186530 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.186550 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.186569 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:26Z","lastTransitionTime":"2025-11-22T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.288563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.288631 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.288641 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.288658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.288669 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:26Z","lastTransitionTime":"2025-11-22T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.391447 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.391488 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.391498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.391514 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.391525 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:26Z","lastTransitionTime":"2025-11-22T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.494429 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.494468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.494479 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.494495 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.494507 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:26Z","lastTransitionTime":"2025-11-22T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.597314 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.597360 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.597370 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.597388 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.597399 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:26Z","lastTransitionTime":"2025-11-22T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.699976 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.700053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.700065 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.700086 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.700099 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:26Z","lastTransitionTime":"2025-11-22T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.803241 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.803278 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.803286 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.803299 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.803312 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:26Z","lastTransitionTime":"2025-11-22T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.906095 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.906169 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.906197 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.906230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:26 crc kubenswrapper[4743]: I1122 08:23:26.906255 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:26Z","lastTransitionTime":"2025-11-22T08:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.009417 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.009468 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.009484 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.009510 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.009529 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:27Z","lastTransitionTime":"2025-11-22T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.111945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.111993 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.112005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.112023 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.112035 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:27Z","lastTransitionTime":"2025-11-22T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.150652 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.150652 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:27 crc kubenswrapper[4743]: E1122 08:23:27.150931 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:27 crc kubenswrapper[4743]: E1122 08:23:27.151931 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.174438 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece15
35d8aceef776908df1996cb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:05Z\\\",\\\"message\\\":\\\"237 6462 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx in node crc\\\\nI1122 08:23:05.035302 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1122 08:23:05.035305 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx after 0 failed attempt(s)\\\\nI1122 08:23:05.035313 6462 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx\\\\nI1122 08:23:05.035315 6462 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1122 08:23:05.035302 6462 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:23:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z"
Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.186917 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.199310 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.211606 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.214821 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.214850 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.214863 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.214877 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.214888 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:27Z","lastTransitionTime":"2025-11-22T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.225066 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.236561 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.247118 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 
08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.273375 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.288083 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.299896 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.308805 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.318843 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.318931 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.318945 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.318980 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.318994 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:27Z","lastTransitionTime":"2025-11-22T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.321378 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.335165 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69946320b9db9ab4b35189efa616676972d055242413aa37ff4b1ae5b7af00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:17Z\\\",\\\"message\\\":\\\"2025-11-22T08:22:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_17bf3948-35f9-4255-8597-4844fba11d77\\\\n2025-11-22T08:22:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_17bf3948-35f9-4255-8597-4844fba11d77 to /host/opt/cni/bin/\\\\n2025-11-22T08:22:32Z [verbose] multus-daemon started\\\\n2025-11-22T08:22:32Z [verbose] Readiness Indicator file check\\\\n2025-11-22T08:23:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.346344 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.359427 4743 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.372087 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.383849 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32a39ce8-25e5-4b37-84d7-21027cdb228b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f89923f4bf5b5ce2ce85146d7c472421f1dbc5b8d20103bfd00de9599c2c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.394482 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.405452 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:27Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.420911 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.420938 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.420947 4743 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.420975 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.420987 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:27Z","lastTransitionTime":"2025-11-22T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.522968 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.523017 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.523025 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.523060 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.523070 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:27Z","lastTransitionTime":"2025-11-22T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.626840 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.626898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.626917 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.626941 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.626959 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:27Z","lastTransitionTime":"2025-11-22T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.729649 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.729681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.729690 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.729704 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.729717 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:27Z","lastTransitionTime":"2025-11-22T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.832005 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.832059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.832081 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.832103 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.832120 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:27Z","lastTransitionTime":"2025-11-22T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.934161 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.934253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.934287 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.934321 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:27 crc kubenswrapper[4743]: I1122 08:23:27.934344 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:27Z","lastTransitionTime":"2025-11-22T08:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.036494 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.036539 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.036557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.036594 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.036613 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:28Z","lastTransitionTime":"2025-11-22T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.139362 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.139409 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.139419 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.139435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.139445 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:28Z","lastTransitionTime":"2025-11-22T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.150528 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.150725 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:28 crc kubenswrapper[4743]: E1122 08:23:28.150854 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:28 crc kubenswrapper[4743]: E1122 08:23:28.151009 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.243230 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.243292 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.243306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.243331 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.243345 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:28Z","lastTransitionTime":"2025-11-22T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.345934 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.346009 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.346024 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.346046 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.346058 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:28Z","lastTransitionTime":"2025-11-22T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.449294 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.449375 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.449395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.449425 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.449447 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:28Z","lastTransitionTime":"2025-11-22T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.551796 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.551834 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.551842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.551857 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.551870 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:28Z","lastTransitionTime":"2025-11-22T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.654666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.654719 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.654734 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.654752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.654772 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:28Z","lastTransitionTime":"2025-11-22T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.757636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.757683 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.757692 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.757705 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.757716 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:28Z","lastTransitionTime":"2025-11-22T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.860318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.860380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.860392 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.860411 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.860422 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:28Z","lastTransitionTime":"2025-11-22T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.962697 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.962736 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.962752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.962770 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:28 crc kubenswrapper[4743]: I1122 08:23:28.962782 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:28Z","lastTransitionTime":"2025-11-22T08:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.065756 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.065829 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.065842 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.065860 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.065871 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:29Z","lastTransitionTime":"2025-11-22T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.151249 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.151378 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:29 crc kubenswrapper[4743]: E1122 08:23:29.151410 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:29 crc kubenswrapper[4743]: E1122 08:23:29.151542 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.169006 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.169057 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.169071 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.169089 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.169105 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:29Z","lastTransitionTime":"2025-11-22T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.274656 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.274750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.274773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.274803 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.274829 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:29Z","lastTransitionTime":"2025-11-22T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.377042 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.377090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.377102 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.377118 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.377129 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:29Z","lastTransitionTime":"2025-11-22T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.479986 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.480041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.480057 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.480084 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.480100 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:29Z","lastTransitionTime":"2025-11-22T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.583395 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.583481 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.583505 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.583535 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.583557 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:29Z","lastTransitionTime":"2025-11-22T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.686962 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.687039 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.687063 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.687091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.687109 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:29Z","lastTransitionTime":"2025-11-22T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.790517 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.790622 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.790636 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.790658 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.790675 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:29Z","lastTransitionTime":"2025-11-22T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.893250 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.893316 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.893334 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.893359 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.893377 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:29Z","lastTransitionTime":"2025-11-22T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.995523 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.995599 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.995621 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.995642 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:29 crc kubenswrapper[4743]: I1122 08:23:29.995657 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:29Z","lastTransitionTime":"2025-11-22T08:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.097527 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.097563 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.097595 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.097610 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.097622 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:30Z","lastTransitionTime":"2025-11-22T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.150488 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:30 crc kubenswrapper[4743]: E1122 08:23:30.150732 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.150857 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:30 crc kubenswrapper[4743]: E1122 08:23:30.151022 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.199546 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.199626 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.199643 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.199666 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.199686 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:30Z","lastTransitionTime":"2025-11-22T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.305344 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.305407 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.305426 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.305452 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.305476 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:30Z","lastTransitionTime":"2025-11-22T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.408435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.408511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.408869 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.408905 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.408930 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:30Z","lastTransitionTime":"2025-11-22T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.511010 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.511090 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.511114 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.511143 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.511166 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:30Z","lastTransitionTime":"2025-11-22T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.613337 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.613374 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.613393 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.613415 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.613431 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:30Z","lastTransitionTime":"2025-11-22T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.716394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.716442 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.716454 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.716480 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.716499 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:30Z","lastTransitionTime":"2025-11-22T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.819646 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.819748 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.819781 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.819817 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.819840 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:30Z","lastTransitionTime":"2025-11-22T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.922041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.922091 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.922106 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.922125 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:30 crc kubenswrapper[4743]: I1122 08:23:30.922137 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:30Z","lastTransitionTime":"2025-11-22T08:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.024973 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.025044 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.025064 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.025083 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.025094 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:31Z","lastTransitionTime":"2025-11-22T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.128755 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.128809 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.128818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.128843 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.128853 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:31Z","lastTransitionTime":"2025-11-22T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.151044 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.151155 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:31 crc kubenswrapper[4743]: E1122 08:23:31.151274 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:31 crc kubenswrapper[4743]: E1122 08:23:31.151341 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.232457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.232509 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.232521 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.232542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.232559 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:31Z","lastTransitionTime":"2025-11-22T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.335472 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.335542 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.335560 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.335606 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.335620 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:31Z","lastTransitionTime":"2025-11-22T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.439627 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.439706 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.439726 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.439751 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.439772 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:31Z","lastTransitionTime":"2025-11-22T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.544394 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.544475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.544494 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.544520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.544539 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:31Z","lastTransitionTime":"2025-11-22T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.648318 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.648371 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.648380 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.648400 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.648410 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:31Z","lastTransitionTime":"2025-11-22T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.751703 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.751758 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.751774 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.751791 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.751802 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:31Z","lastTransitionTime":"2025-11-22T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.854502 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.854547 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.854559 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.854599 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.854613 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:31Z","lastTransitionTime":"2025-11-22T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.957557 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.957639 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.957653 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.957671 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:31 crc kubenswrapper[4743]: I1122 08:23:31.957683 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:31Z","lastTransitionTime":"2025-11-22T08:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.061189 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.061253 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.061265 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.061288 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.061304 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:32Z","lastTransitionTime":"2025-11-22T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.118554 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.118853 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.118962 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:36.11892048 +0000 UTC m=+149.825281572 (durationBeforeRetry 1m4s). 
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119053 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.119058 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119079 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119099 4743 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.119120 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.119193 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119330 4743 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119468 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119547 4743 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119565 4743 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119354 4743 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119499 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-11-22 08:24:36.119277741 +0000 UTC m=+149.825638833 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119729 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:24:36.119700674 +0000 UTC m=+149.826061776 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119773 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-11-22 08:24:36.119753345 +0000 UTC m=+149.826114607 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.119805 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-11-22 08:24:36.119790866 +0000 UTC m=+149.826151958 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.150806 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.150966 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.151209 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 08:23:32 crc kubenswrapper[4743]: E1122 08:23:32.151000 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.164184 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.164229 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.164240 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.164261 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.164278 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:32Z","lastTransitionTime":"2025-11-22T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.267058 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.267111 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.267122 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.267142 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.267152 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:32Z","lastTransitionTime":"2025-11-22T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.370439 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.370520 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.370538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.370571 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.370648 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:32Z","lastTransitionTime":"2025-11-22T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.473306 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.473398 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.473420 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.473457 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.473484 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:32Z","lastTransitionTime":"2025-11-22T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.576914 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.576994 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.577022 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.577053 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.577074 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:32Z","lastTransitionTime":"2025-11-22T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.682010 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.682119 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.682134 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.682205 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.682226 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:32Z","lastTransitionTime":"2025-11-22T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.785677 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.785745 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.785764 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.785790 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.785810 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:32Z","lastTransitionTime":"2025-11-22T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.889773 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.889846 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.889860 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.889882 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.889902 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:32Z","lastTransitionTime":"2025-11-22T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.992264 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.992350 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.992379 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.992413 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:32 crc kubenswrapper[4743]: I1122 08:23:32.992432 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:32Z","lastTransitionTime":"2025-11-22T08:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.095152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.095204 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.095216 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.095235 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.095249 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:33Z","lastTransitionTime":"2025-11-22T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.151800 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.152057 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:33 crc kubenswrapper[4743]: E1122 08:23:33.152168 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:33 crc kubenswrapper[4743]: E1122 08:23:33.152257 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.152461 4743 scope.go:117] "RemoveContainer" containerID="5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.197778 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.197827 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.197837 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.197856 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.197868 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:33Z","lastTransitionTime":"2025-11-22T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.300688 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.300750 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.300763 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.300787 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.300803 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:33Z","lastTransitionTime":"2025-11-22T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.404554 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.404640 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.404656 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.404681 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.404696 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:33Z","lastTransitionTime":"2025-11-22T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.513475 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.513541 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.513558 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.513601 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.513632 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:33Z","lastTransitionTime":"2025-11-22T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.617511 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.618838 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.618892 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.618924 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.618944 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:33Z","lastTransitionTime":"2025-11-22T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.721680 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.721733 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.721747 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.721767 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.721783 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:33Z","lastTransitionTime":"2025-11-22T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.741322 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/2.log" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.744110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e"} Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.744643 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.773548 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.803353 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.816851 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.824152 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.824194 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.824202 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.824218 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.824231 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:33Z","lastTransitionTime":"2025-11-22T08:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.826799 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z"
Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.848661 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f
503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.859836 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32a39ce8-25e5-4b37-84d7-21027cdb228b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f89923f4bf5b5ce2ce85146d7c472421f1dbc5b8d20103bfd00de9599c2c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf5
f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.873230 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791
fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.886413 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.900131 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.916488 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69946320b9db9ab4b35189efa616676972d055242413aa37ff4b1ae5b7af00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:17Z\\\",\\\"message\\\":\\\"2025-11-22T08:22:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_17bf3948-35f9-4255-8597-4844fba11d77\\\\n2025-11-22T08:22:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_17bf3948-35f9-4255-8597-4844fba11d77 to /host/opt/cni/bin/\\\\n2025-11-22T08:22:32Z [verbose] multus-daemon started\\\\n2025-11-22T08:22:32Z [verbose] Readiness Indicator file check\\\\n2025-11-22T08:23:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.926665 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.926724 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.926735 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.926759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.926772 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:33Z","lastTransitionTime":"2025-11-22T08:23:33Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.930327 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.945388 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.959062 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.979385 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:33 crc kubenswrapper[4743]: I1122 08:23:33.997096 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:33Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.010228 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.029304 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.029383 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.029417 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.029435 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.029449 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:34Z","lastTransitionTime":"2025-11-22T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.033156 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://836ebea3b2bc5ff03ad7ec1cdac334a7793d438f
0c2d442a69ab82d066c6ec9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:05Z\\\",\\\"message\\\":\\\"237 6462 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx in node crc\\\\nI1122 08:23:05.035302 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1122 08:23:05.035305 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx after 0 failed attempt(s)\\\\nI1122 08:23:05.035313 6462 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx\\\\nI1122 08:23:05.035315 6462 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1122 08:23:05.035302 6462 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call 
webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:23:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.048358 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.064843 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.132403 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.132464 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.132477 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.132498 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.132527 4743 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:34Z","lastTransitionTime":"2025-11-22T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.150892 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.150988 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:34 crc kubenswrapper[4743]: E1122 08:23:34.151074 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:34 crc kubenswrapper[4743]: E1122 08:23:34.151307 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.236815 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.236898 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.236913 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.236936 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.236950 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:34Z","lastTransitionTime":"2025-11-22T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.339669 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.339713 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.339722 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.339738 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.339750 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:34Z","lastTransitionTime":"2025-11-22T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.443291 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.443350 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.443364 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.443384 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.443394 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:34Z","lastTransitionTime":"2025-11-22T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.546710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.546784 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.546801 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.546827 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.546850 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:34Z","lastTransitionTime":"2025-11-22T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.650572 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.650664 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.650684 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.650710 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.650726 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:34Z","lastTransitionTime":"2025-11-22T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.751454 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/3.log" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.752815 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/2.log" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.752981 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.753041 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.753059 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.753085 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.753105 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:34Z","lastTransitionTime":"2025-11-22T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.756924 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" exitCode=1 Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.756978 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.757027 4743 scope.go:117] "RemoveContainer" containerID="5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.758295 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:23:34 crc kubenswrapper[4743]: E1122 08:23:34.758612 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.779048 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec1ffd9a-e133-4f73-8c32-6f9c7c7d1ae0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9822748df146d0d379d5bf2b46b09035165df7513de4837c28b0bb436aaf9222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3c5bf3a7fa7da888e80e67e103088354c91c2d6462b847695495dd49dfdc1ad9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cf8374d17e83c6c1eb88bf5c305468d1e1a7bde3ac22da39bf3cdab5d464b43\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e7372dd6890b7a1a6fd53269827b28904e29d2d7cb4ce09df979645bcd7a3da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33cd9a1070fc97240eec4be4c71188a9eec29af8291020c928bcf0f25a1c46ec\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"le observer\\\\nW1122 08:22:27.653813 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1122 08:22:27.654043 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1122 08:22:27.654998 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1848994225/tls.crt::/tmp/serving-cert-1848994225/tls.key\\\\\\\"\\\\nI1122 08:22:28.441534 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1122 08:22:28.449982 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1122 08:22:28.450017 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1122 08:22:28.450057 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1122 08:22:28.450063 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1122 08:22:28.466054 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1122 08:22:28.466183 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1122 08:22:28.466212 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI1122 08:22:28.466223 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1122 08:22:28.466235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1122 08:22:28.466269 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1122 08:22:28.466281 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1122 08:22:28.466286 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1122 08:22:28.470236 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://79ac73c20d50613d50e79a6f8fd98b233c8738c6c0cedb5f77836fdba47a92f7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9e8088a951e616404c806b76ddaf4d989ba9b5a0fdab0079849750e6fcdb105\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.796142 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://71c71c30bb0c9280ae83b8b669e7ec1aba8c09ef0cb736b07944641149d7e3ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.827150 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.868330 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7v699" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a5a54581-c9c3-4c51-b2ed-a3477e2a3159\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad4f57dbb7fbc8ceb4107820b4c365bf8e6ee056a76680538f9a09c88ffc5c10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nx9j4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7v699\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.871451 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.871496 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.871515 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.871538 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.871556 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:34Z","lastTransitionTime":"2025-11-22T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.894719 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d29494-f9cd-46b7-be04-d7a848a72fee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://836ebea3b2bc5ff03ad7ec1cdac334a7793d438f
0c2d442a69ab82d066c6ec9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5cf369ad78e8ed6e47d42ffc5f9f802ded0ece1535d8aceef776908df1996cb5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:05Z\\\",\\\"message\\\":\\\"237 6462 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx in node crc\\\\nI1122 08:23:05.035302 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1122 08:23:05.035305 6462 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx after 0 failed attempt(s)\\\\nI1122 08:23:05.035313 6462 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx\\\\nI1122 08:23:05.035315 6462 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1122 08:23:05.035302 6462 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:23:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:34Z\\\",\\\"message\\\":\\\"ormers/externalversions/factory.go:117\\\\nI1122 08:23:34.131016 6856 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:23:34.131524 6856 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:23:34.131587 6856 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:23:34.131696 6856 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1122 08:23:34.142609 6856 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI1122 08:23:34.142642 6856 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI1122 08:23:34.142705 6856 ovnkube.go:599] Stopped ovnkube\\\\nI1122 
08:23:34.142734 6856 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1122 08:23:34.142813 6856 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:23:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vkdp6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-p8glw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.907525 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8426c723-9bfa-4856-b445-b01251484a35\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gr995\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4vkc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.921130 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f4c88f46-4abf-4975-b03c-52d9be99a9e6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b5a9889a51b4b86c7a34c81d74003e32a40641d3920efa07197b416f3e239c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d82ae3bbefbbd3e3b270c91044f7f293f77fb1fa746c8643091af694764fa49c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cj6r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rf4vx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 
08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.941041 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"59ff571c-7885-4a2f-a969-f7a3f5431d1d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3d88bc3c7d8076f3a4afe2c48166a0b40ef14947bcd289190830e075455fe165\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b5604d67807ebb526c829be2c1f526b50c5575ad8cfaea91d3d6e25ae0b9d44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea972b56b3c97165597dc7eac3005be8ff00bcf42502fde967e3ec2d84677282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3a280747b878184602dac5fd51cf88e23f3b3f503358266848ba60f63b2d525\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a5d9066157fae5ea0316a65b3c9fd4e9b8dd769d3bc9b45f612e3dbc9342ce91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ea07ce9b6f55e48fed9306611273da875ee9e435bf4d47d6f88d1a5a0369168\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eafb0c5b321dfb5595ed70a6e8969e663e837ca57f9a2951b8fecc0ea38cbefc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c72e560d77f00330022c0ca178315f009386bea15091a471ecb766e854974251\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.960007 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.973729 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.973782 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.973797 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.973817 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.973831 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:34Z","lastTransitionTime":"2025-11-22T08:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.978557 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:28Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:34 crc kubenswrapper[4743]: I1122 08:23:34.992972 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gmgcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06d28622-f91f-485b-9396-f489884f2c13\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://42cd8c3a9206dcfdaf2f2ef5b66a3f41a7c4be1e45c8d8d2421473b07a010b3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cdbcw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:34Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gmgcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:34Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.014731 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8fcce96-e512-4437-bf8f-d56269b1ce26\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4685cf8fb887be8317f398877a87c5df940c147f062dcde86a7d48d158120511\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://037756669e64cfa65b4717e585582c39f77ff12e001f6d74e4ea4590ea095b0f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://340f8c3d68b241637fffb0a0615abf713aac03ca7410cc09ee1cd53ed124ca83\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f81f76414db158ad40bb01fcff86c3b8765af756675a6dbebc2e2a64530563\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db082b51abec93291706d70c825fe29179fddabf699560a483918878d5ac3e60\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4216a8c68d11ec64cfadee5aa5fe21ee52f085bd1e89968c25a124da81ec5bf3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4da165bd569df435235892e272e4f342bfca63ea9abda8294331050b12fb2038\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4knjx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-mwvcl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.031670 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"47324a66-c45e-4968-91f0-b3eed00698f3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9035be97b6dbd0b5ae618f206f41db3f9f5e92b2f7c78ea3ecfc55d6e64f996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd73abe92833b75b1379c4b7fca26ae1a1ea481de3c1fb003bed9a1d18a7351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://879024dcf6f0c002bdbf90970153c7c6a871c8a6bdbfe4f3c8e65e662aca5793\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://65a23f38ca41241615e7f0b624b72eeb994c4f427fab04fa901dbc946a947f16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.043626 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32a39ce8-25e5-4b37-84d7-21027cdb228b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4f89923f4bf5b5ce2ce85146d7c472421f1dbc5b8d20103bfd00de9599c2c17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bf5f10baf0243417a61f57407713cd01fcc0749c4191949cf5573d301f3b2ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-11-22T08:22:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.057128 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b2160ee4-015a-483f-9da9-e81ee0d5ef10\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001cbf5deb3a624c4f8795a7f2728fa0eb18c8fdc58c491659a6843cff0e6ed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7a70e101cd7563f1d1475375953e0f70deeda9a6bbd49d8ef03db89c7fd5ebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6533225ac1bf802df22df266ff22096e6da09a5b98073d8c97cff7836f1f8d21\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba6323309c784ad60f532bab6db1a5a0cf233ebf959855f67cd110cc3c90fdba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.075263 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41beddfc998bfb3d14afcdea1d42c7a5de9fe328649202b231d1afbfb8128942\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd90619f4355a4b21385e33f6d959fae199ab792bb84bc59e942162555491eb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.076759 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.076786 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.076799 4743 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.076818 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.076830 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:35Z","lastTransitionTime":"2025-11-22T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.093301 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://590c327275040c366966b6bae7f849b0bf16ff5920aaade23a596eb3fda917c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.108026 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cbpnf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1de4b47-eed0-431f-a7a9-a944ce8791bd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:23:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69946320b9db9ab4b35189efa616676972d055242413aa37ff4b1ae5b7af00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-11-22T08:23:17Z\\\",\\\"message\\\":\\\"2025-11-22T08:22:31+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_17bf3948-35f9-4255-8597-4844fba11d77\\\\n2025-11-22T08:22:31+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_17bf3948-35f9-4255-8597-4844fba11d77 to /host/opt/cni/bin/\\\\n2025-11-22T08:22:32Z [verbose] multus-daemon started\\\\n2025-11-22T08:22:32Z [verbose] Readiness Indicator file check\\\\n2025-11-22T08:23:17Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:23:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hd9v8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cbpnf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.120476 4743 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bae39197-d188-40a8-880d-0d2e6e528f86\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T08:22:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47aa3d16a3b5d9ceb662f2c2901ee2f36a418a2fc2d4d3ad6d60347a929bea24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-11-22T08:22:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9ddkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T08:22:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xk98p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-11-22T08:23:35Z is after 2025-08-24T17:21:41Z" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.151940 4743 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.152042 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:35 crc kubenswrapper[4743]: E1122 08:23:35.152223 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:35 crc kubenswrapper[4743]: E1122 08:23:35.152359 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.179495 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.179555 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.179601 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.179629 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.179652 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:35Z","lastTransitionTime":"2025-11-22T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.282854 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.282940 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.282975 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.283120 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.283172 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:35Z","lastTransitionTime":"2025-11-22T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.385752 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.385789 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.385799 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.385814 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.385823 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:35Z","lastTransitionTime":"2025-11-22T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.414138 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.414182 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.414193 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.414211 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.414223 4743 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-11-22T08:23:35Z","lastTransitionTime":"2025-11-22T08:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.469993 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg"] Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.470453 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.474751 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.475570 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.475831 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.480192 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.541850 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7v699" podStartSLOduration=67.541832395 podStartE2EDuration="1m7.541832395s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.521144551 +0000 UTC m=+89.227505603" watchObservedRunningTime="2025-11-22 08:23:35.541832395 +0000 UTC m=+89.248193447" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.562151 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.562201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.562241 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.562416 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" 
Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.562466 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.585955 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=66.585939286 podStartE2EDuration="1m6.585939286s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.585626307 +0000 UTC m=+89.291987369" watchObservedRunningTime="2025-11-22 08:23:35.585939286 +0000 UTC m=+89.292300358" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.600661 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rf4vx" podStartSLOduration=66.600638796 podStartE2EDuration="1m6.600638796s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.60042818 +0000 UTC m=+89.306789242" watchObservedRunningTime="2025-11-22 08:23:35.600638796 +0000 UTC m=+89.306999848" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.635718 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gmgcj" podStartSLOduration=67.6356953 podStartE2EDuration="1m7.6356953s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.635563486 +0000 UTC m=+89.341924538" watchObservedRunningTime="2025-11-22 08:23:35.6356953 +0000 UTC m=+89.342056352" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.659436 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.659414647 podStartE2EDuration="1m7.659414647s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.659142139 +0000 UTC m=+89.365503191" watchObservedRunningTime="2025-11-22 08:23:35.659414647 +0000 UTC m=+89.365775699" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.663800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.664033 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.664062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.664103 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.664122 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.664127 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.664483 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.665021 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-service-ca\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.670091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.672617 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=22.672602861 podStartE2EDuration="22.672602861s" podCreationTimestamp="2025-11-22 08:23:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.672122966 +0000 UTC m=+89.378484018" 
watchObservedRunningTime="2025-11-22 08:23:35.672602861 +0000 UTC m=+89.378963913" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.684832 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=62.684811995 podStartE2EDuration="1m2.684811995s" podCreationTimestamp="2025-11-22 08:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.684432964 +0000 UTC m=+89.390794016" watchObservedRunningTime="2025-11-22 08:23:35.684811995 +0000 UTC m=+89.391173047" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.690310 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/895abcbc-2d6b-40bd-ba24-a4431ba91aa7-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-pqqjg\" (UID: \"895abcbc-2d6b-40bd-ba24-a4431ba91aa7\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.725890 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cbpnf" podStartSLOduration=67.725865313 podStartE2EDuration="1m7.725865313s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.725537683 +0000 UTC m=+89.431898745" watchObservedRunningTime="2025-11-22 08:23:35.725865313 +0000 UTC m=+89.432226365" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.738362 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podStartSLOduration=67.738338275 podStartE2EDuration="1m7.738338275s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.737795758 +0000 UTC m=+89.444156810" watchObservedRunningTime="2025-11-22 08:23:35.738338275 +0000 UTC m=+89.444699347" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.755622 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mwvcl" podStartSLOduration=67.755601584 podStartE2EDuration="1m7.755601584s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.754775479 +0000 UTC m=+89.461136581" watchObservedRunningTime="2025-11-22 08:23:35.755601584 +0000 UTC m=+89.461962646" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.762329 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/3.log" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.766364 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:23:35 crc kubenswrapper[4743]: E1122 08:23:35.766549 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.773933 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.773911315 podStartE2EDuration="35.773911315s" podCreationTimestamp="2025-11-22 08:23:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:35.772990787 +0000 UTC m=+89.479351869" watchObservedRunningTime="2025-11-22 08:23:35.773911315 +0000 UTC m=+89.480272367" Nov 22 08:23:35 crc kubenswrapper[4743]: I1122 08:23:35.782646 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" Nov 22 08:23:35 crc kubenswrapper[4743]: W1122 08:23:35.793075 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod895abcbc_2d6b_40bd_ba24_a4431ba91aa7.slice/crio-d44ecc90f768b6e7435b6c534e449bfd233f0ba687e6b0febb24cd62ef4ef37d WatchSource:0}: Error finding container d44ecc90f768b6e7435b6c534e449bfd233f0ba687e6b0febb24cd62ef4ef37d: Status 404 returned error can't find the container with id d44ecc90f768b6e7435b6c534e449bfd233f0ba687e6b0febb24cd62ef4ef37d Nov 22 08:23:36 crc kubenswrapper[4743]: I1122 08:23:36.150858 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:36 crc kubenswrapper[4743]: I1122 08:23:36.150971 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:36 crc kubenswrapper[4743]: E1122 08:23:36.151165 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:36 crc kubenswrapper[4743]: E1122 08:23:36.151363 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:36 crc kubenswrapper[4743]: I1122 08:23:36.771440 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" event={"ID":"895abcbc-2d6b-40bd-ba24-a4431ba91aa7","Type":"ContainerStarted","Data":"8c67c5912893b53f15d4152078e8f79e7cb58cac9a61e3cec1ed13c009a9834d"} Nov 22 08:23:36 crc kubenswrapper[4743]: I1122 08:23:36.771503 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" event={"ID":"895abcbc-2d6b-40bd-ba24-a4431ba91aa7","Type":"ContainerStarted","Data":"d44ecc90f768b6e7435b6c534e449bfd233f0ba687e6b0febb24cd62ef4ef37d"} Nov 22 08:23:37 crc kubenswrapper[4743]: I1122 08:23:37.150739 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:37 crc kubenswrapper[4743]: I1122 08:23:37.150785 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:37 crc kubenswrapper[4743]: E1122 08:23:37.152103 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:37 crc kubenswrapper[4743]: E1122 08:23:37.152199 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:38 crc kubenswrapper[4743]: I1122 08:23:38.150721 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:38 crc kubenswrapper[4743]: I1122 08:23:38.150779 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:38 crc kubenswrapper[4743]: E1122 08:23:38.151624 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:38 crc kubenswrapper[4743]: E1122 08:23:38.151912 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:39 crc kubenswrapper[4743]: I1122 08:23:39.151566 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:39 crc kubenswrapper[4743]: I1122 08:23:39.151605 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:39 crc kubenswrapper[4743]: E1122 08:23:39.151726 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:39 crc kubenswrapper[4743]: E1122 08:23:39.151905 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:40 crc kubenswrapper[4743]: I1122 08:23:40.150625 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:40 crc kubenswrapper[4743]: E1122 08:23:40.150837 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:40 crc kubenswrapper[4743]: I1122 08:23:40.151134 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:40 crc kubenswrapper[4743]: E1122 08:23:40.151321 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:41 crc kubenswrapper[4743]: I1122 08:23:41.150992 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:41 crc kubenswrapper[4743]: E1122 08:23:41.151174 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:41 crc kubenswrapper[4743]: I1122 08:23:41.151816 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:41 crc kubenswrapper[4743]: E1122 08:23:41.152101 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:42 crc kubenswrapper[4743]: I1122 08:23:42.151446 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:42 crc kubenswrapper[4743]: I1122 08:23:42.151465 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:42 crc kubenswrapper[4743]: E1122 08:23:42.151687 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:42 crc kubenswrapper[4743]: E1122 08:23:42.151733 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:43 crc kubenswrapper[4743]: I1122 08:23:43.151082 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:43 crc kubenswrapper[4743]: I1122 08:23:43.151092 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:43 crc kubenswrapper[4743]: E1122 08:23:43.151213 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:43 crc kubenswrapper[4743]: E1122 08:23:43.151319 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:44 crc kubenswrapper[4743]: I1122 08:23:44.151507 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:44 crc kubenswrapper[4743]: E1122 08:23:44.151653 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:44 crc kubenswrapper[4743]: I1122 08:23:44.151662 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:44 crc kubenswrapper[4743]: E1122 08:23:44.151855 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:45 crc kubenswrapper[4743]: I1122 08:23:45.151097 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:45 crc kubenswrapper[4743]: I1122 08:23:45.151169 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:45 crc kubenswrapper[4743]: E1122 08:23:45.151319 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:45 crc kubenswrapper[4743]: E1122 08:23:45.151495 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:46 crc kubenswrapper[4743]: I1122 08:23:46.150956 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:46 crc kubenswrapper[4743]: I1122 08:23:46.151064 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:46 crc kubenswrapper[4743]: E1122 08:23:46.151215 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:46 crc kubenswrapper[4743]: E1122 08:23:46.151357 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:47 crc kubenswrapper[4743]: I1122 08:23:47.150798 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:47 crc kubenswrapper[4743]: I1122 08:23:47.150840 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:47 crc kubenswrapper[4743]: E1122 08:23:47.152044 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:47 crc kubenswrapper[4743]: E1122 08:23:47.152137 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:47 crc kubenswrapper[4743]: I1122 08:23:47.739666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:47 crc kubenswrapper[4743]: E1122 08:23:47.739791 4743 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:23:47 crc kubenswrapper[4743]: E1122 08:23:47.739868 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs podName:8426c723-9bfa-4856-b445-b01251484a35 nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.739847406 +0000 UTC m=+165.446208458 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs") pod "network-metrics-daemon-4vkc4" (UID: "8426c723-9bfa-4856-b445-b01251484a35") : object "openshift-multus"/"metrics-daemon-secret" not registered Nov 22 08:23:48 crc kubenswrapper[4743]: I1122 08:23:48.151491 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:48 crc kubenswrapper[4743]: I1122 08:23:48.151603 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:48 crc kubenswrapper[4743]: E1122 08:23:48.151659 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:48 crc kubenswrapper[4743]: E1122 08:23:48.151742 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:48 crc kubenswrapper[4743]: I1122 08:23:48.152462 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:23:48 crc kubenswrapper[4743]: E1122 08:23:48.152677 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" Nov 22 08:23:49 crc kubenswrapper[4743]: I1122 08:23:49.150860 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:49 crc kubenswrapper[4743]: I1122 08:23:49.150942 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:49 crc kubenswrapper[4743]: E1122 08:23:49.151024 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:49 crc kubenswrapper[4743]: E1122 08:23:49.151160 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:50 crc kubenswrapper[4743]: I1122 08:23:50.150924 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:50 crc kubenswrapper[4743]: E1122 08:23:50.151131 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:50 crc kubenswrapper[4743]: I1122 08:23:50.150957 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:50 crc kubenswrapper[4743]: E1122 08:23:50.151380 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:51 crc kubenswrapper[4743]: I1122 08:23:51.151250 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:51 crc kubenswrapper[4743]: I1122 08:23:51.151300 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:51 crc kubenswrapper[4743]: E1122 08:23:51.152913 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:51 crc kubenswrapper[4743]: E1122 08:23:51.153096 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:52 crc kubenswrapper[4743]: I1122 08:23:52.151002 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:52 crc kubenswrapper[4743]: I1122 08:23:52.151107 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:52 crc kubenswrapper[4743]: E1122 08:23:52.151168 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:52 crc kubenswrapper[4743]: E1122 08:23:52.151381 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:53 crc kubenswrapper[4743]: I1122 08:23:53.151517 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:53 crc kubenswrapper[4743]: E1122 08:23:53.151711 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:53 crc kubenswrapper[4743]: I1122 08:23:53.151526 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:53 crc kubenswrapper[4743]: E1122 08:23:53.151816 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:54 crc kubenswrapper[4743]: I1122 08:23:54.150990 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:54 crc kubenswrapper[4743]: I1122 08:23:54.151045 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:54 crc kubenswrapper[4743]: E1122 08:23:54.151166 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:54 crc kubenswrapper[4743]: E1122 08:23:54.151392 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:55 crc kubenswrapper[4743]: I1122 08:23:55.150610 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:55 crc kubenswrapper[4743]: E1122 08:23:55.150767 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:55 crc kubenswrapper[4743]: I1122 08:23:55.151125 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:55 crc kubenswrapper[4743]: E1122 08:23:55.151201 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:56 crc kubenswrapper[4743]: I1122 08:23:56.151436 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:56 crc kubenswrapper[4743]: E1122 08:23:56.152382 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:56 crc kubenswrapper[4743]: I1122 08:23:56.151785 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:56 crc kubenswrapper[4743]: E1122 08:23:56.152702 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:57 crc kubenswrapper[4743]: I1122 08:23:57.151249 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:57 crc kubenswrapper[4743]: I1122 08:23:57.152432 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:57 crc kubenswrapper[4743]: E1122 08:23:57.152645 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:23:57 crc kubenswrapper[4743]: E1122 08:23:57.152795 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:58 crc kubenswrapper[4743]: I1122 08:23:58.151264 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:23:58 crc kubenswrapper[4743]: I1122 08:23:58.151337 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:23:58 crc kubenswrapper[4743]: E1122 08:23:58.151733 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:23:58 crc kubenswrapper[4743]: E1122 08:23:58.152409 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:23:59 crc kubenswrapper[4743]: I1122 08:23:59.150807 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:23:59 crc kubenswrapper[4743]: E1122 08:23:59.150955 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:23:59 crc kubenswrapper[4743]: I1122 08:23:59.151857 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:23:59 crc kubenswrapper[4743]: E1122 08:23:59.152022 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" Nov 22 08:23:59 crc kubenswrapper[4743]: I1122 08:23:59.152185 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:23:59 crc kubenswrapper[4743]: E1122 08:23:59.152360 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:00 crc kubenswrapper[4743]: I1122 08:24:00.150485 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:00 crc kubenswrapper[4743]: I1122 08:24:00.150566 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:00 crc kubenswrapper[4743]: E1122 08:24:00.150658 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:00 crc kubenswrapper[4743]: E1122 08:24:00.150762 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:01 crc kubenswrapper[4743]: I1122 08:24:01.151279 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:01 crc kubenswrapper[4743]: I1122 08:24:01.151303 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:01 crc kubenswrapper[4743]: E1122 08:24:01.151412 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:01 crc kubenswrapper[4743]: E1122 08:24:01.151672 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:02 crc kubenswrapper[4743]: I1122 08:24:02.150757 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:02 crc kubenswrapper[4743]: I1122 08:24:02.150873 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:02 crc kubenswrapper[4743]: E1122 08:24:02.151048 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:02 crc kubenswrapper[4743]: E1122 08:24:02.151154 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:03 crc kubenswrapper[4743]: I1122 08:24:03.151624 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:03 crc kubenswrapper[4743]: I1122 08:24:03.151722 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:03 crc kubenswrapper[4743]: E1122 08:24:03.152153 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:03 crc kubenswrapper[4743]: E1122 08:24:03.152004 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:03 crc kubenswrapper[4743]: I1122 08:24:03.860272 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cbpnf_a1de4b47-eed0-431f-a7a9-a944ce8791bd/kube-multus/1.log" Nov 22 08:24:03 crc kubenswrapper[4743]: I1122 08:24:03.860833 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cbpnf_a1de4b47-eed0-431f-a7a9-a944ce8791bd/kube-multus/0.log" Nov 22 08:24:03 crc kubenswrapper[4743]: I1122 08:24:03.860886 4743 generic.go:334] "Generic (PLEG): container finished" podID="a1de4b47-eed0-431f-a7a9-a944ce8791bd" containerID="d69946320b9db9ab4b35189efa616676972d055242413aa37ff4b1ae5b7af00d" exitCode=1 Nov 22 08:24:03 crc kubenswrapper[4743]: I1122 08:24:03.860919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cbpnf" event={"ID":"a1de4b47-eed0-431f-a7a9-a944ce8791bd","Type":"ContainerDied","Data":"d69946320b9db9ab4b35189efa616676972d055242413aa37ff4b1ae5b7af00d"} Nov 22 08:24:03 crc kubenswrapper[4743]: I1122 08:24:03.860953 4743 scope.go:117] "RemoveContainer" containerID="42b1636b6a4f5b1bd30c6a1f4b21745b25fb419994e41ceef5253ae1937f525b" Nov 22 08:24:03 crc kubenswrapper[4743]: I1122 08:24:03.861676 4743 scope.go:117] "RemoveContainer" containerID="d69946320b9db9ab4b35189efa616676972d055242413aa37ff4b1ae5b7af00d" Nov 22 08:24:03 crc kubenswrapper[4743]: E1122 08:24:03.861875 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-cbpnf_openshift-multus(a1de4b47-eed0-431f-a7a9-a944ce8791bd)\"" pod="openshift-multus/multus-cbpnf" podUID="a1de4b47-eed0-431f-a7a9-a944ce8791bd" Nov 22 08:24:03 crc kubenswrapper[4743]: I1122 08:24:03.878532 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-pqqjg" podStartSLOduration=95.878514205 podStartE2EDuration="1m35.878514205s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:23:36.793218223 +0000 UTC m=+90.499579315" watchObservedRunningTime="2025-11-22 08:24:03.878514205 +0000 UTC m=+117.584875267" Nov 22 08:24:04 crc kubenswrapper[4743]: I1122 08:24:04.151451 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:04 crc kubenswrapper[4743]: I1122 08:24:04.151778 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:04 crc kubenswrapper[4743]: E1122 08:24:04.151999 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:04 crc kubenswrapper[4743]: E1122 08:24:04.152235 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:04 crc kubenswrapper[4743]: I1122 08:24:04.866430 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cbpnf_a1de4b47-eed0-431f-a7a9-a944ce8791bd/kube-multus/1.log" Nov 22 08:24:05 crc kubenswrapper[4743]: I1122 08:24:05.151012 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:05 crc kubenswrapper[4743]: E1122 08:24:05.151234 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:05 crc kubenswrapper[4743]: I1122 08:24:05.151682 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:05 crc kubenswrapper[4743]: E1122 08:24:05.151787 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:06 crc kubenswrapper[4743]: I1122 08:24:06.151094 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:06 crc kubenswrapper[4743]: E1122 08:24:06.151288 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:06 crc kubenswrapper[4743]: I1122 08:24:06.151129 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:06 crc kubenswrapper[4743]: E1122 08:24:06.151987 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:07 crc kubenswrapper[4743]: E1122 08:24:07.112892 4743 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Nov 22 08:24:07 crc kubenswrapper[4743]: I1122 08:24:07.151876 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:07 crc kubenswrapper[4743]: I1122 08:24:07.152032 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:07 crc kubenswrapper[4743]: E1122 08:24:07.152470 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:07 crc kubenswrapper[4743]: E1122 08:24:07.152693 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:07 crc kubenswrapper[4743]: E1122 08:24:07.259280 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 08:24:08 crc kubenswrapper[4743]: I1122 08:24:08.151472 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:08 crc kubenswrapper[4743]: I1122 08:24:08.151498 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:08 crc kubenswrapper[4743]: E1122 08:24:08.151666 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:08 crc kubenswrapper[4743]: E1122 08:24:08.151893 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:09 crc kubenswrapper[4743]: I1122 08:24:09.151819 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:09 crc kubenswrapper[4743]: E1122 08:24:09.152086 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:09 crc kubenswrapper[4743]: I1122 08:24:09.151892 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:09 crc kubenswrapper[4743]: E1122 08:24:09.153602 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:10 crc kubenswrapper[4743]: I1122 08:24:10.151377 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:10 crc kubenswrapper[4743]: I1122 08:24:10.151568 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:10 crc kubenswrapper[4743]: E1122 08:24:10.151703 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:10 crc kubenswrapper[4743]: E1122 08:24:10.151823 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:11 crc kubenswrapper[4743]: I1122 08:24:11.151473 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:11 crc kubenswrapper[4743]: I1122 08:24:11.151686 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:11 crc kubenswrapper[4743]: E1122 08:24:11.151748 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:11 crc kubenswrapper[4743]: E1122 08:24:11.151931 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:12 crc kubenswrapper[4743]: I1122 08:24:12.150889 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:12 crc kubenswrapper[4743]: I1122 08:24:12.150969 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:12 crc kubenswrapper[4743]: E1122 08:24:12.151085 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:12 crc kubenswrapper[4743]: E1122 08:24:12.151188 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:12 crc kubenswrapper[4743]: E1122 08:24:12.261284 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 08:24:13 crc kubenswrapper[4743]: I1122 08:24:13.151420 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:13 crc kubenswrapper[4743]: I1122 08:24:13.151473 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:13 crc kubenswrapper[4743]: E1122 08:24:13.151567 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:13 crc kubenswrapper[4743]: E1122 08:24:13.151814 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:14 crc kubenswrapper[4743]: I1122 08:24:14.151297 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:14 crc kubenswrapper[4743]: E1122 08:24:14.151790 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:14 crc kubenswrapper[4743]: I1122 08:24:14.151314 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:14 crc kubenswrapper[4743]: E1122 08:24:14.152183 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:14 crc kubenswrapper[4743]: I1122 08:24:14.152382 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:24:14 crc kubenswrapper[4743]: E1122 08:24:14.152528 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-p8glw_openshift-ovn-kubernetes(35d29494-f9cd-46b7-be04-d7a848a72fee)\"" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" Nov 22 08:24:15 crc kubenswrapper[4743]: I1122 08:24:15.150876 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:15 crc kubenswrapper[4743]: I1122 08:24:15.150898 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:15 crc kubenswrapper[4743]: E1122 08:24:15.151060 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:15 crc kubenswrapper[4743]: E1122 08:24:15.151041 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:16 crc kubenswrapper[4743]: I1122 08:24:16.150548 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:16 crc kubenswrapper[4743]: I1122 08:24:16.150770 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:16 crc kubenswrapper[4743]: E1122 08:24:16.150919 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:16 crc kubenswrapper[4743]: E1122 08:24:16.150977 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:17 crc kubenswrapper[4743]: I1122 08:24:17.150737 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:17 crc kubenswrapper[4743]: I1122 08:24:17.151034 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:17 crc kubenswrapper[4743]: I1122 08:24:17.152599 4743 scope.go:117] "RemoveContainer" containerID="d69946320b9db9ab4b35189efa616676972d055242413aa37ff4b1ae5b7af00d" Nov 22 08:24:17 crc kubenswrapper[4743]: E1122 08:24:17.152707 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:17 crc kubenswrapper[4743]: E1122 08:24:17.152921 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:17 crc kubenswrapper[4743]: E1122 08:24:17.262005 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Nov 22 08:24:17 crc kubenswrapper[4743]: I1122 08:24:17.922705 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cbpnf_a1de4b47-eed0-431f-a7a9-a944ce8791bd/kube-multus/1.log" Nov 22 08:24:17 crc kubenswrapper[4743]: I1122 08:24:17.922788 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cbpnf" event={"ID":"a1de4b47-eed0-431f-a7a9-a944ce8791bd","Type":"ContainerStarted","Data":"0902f0de82c42e6e1f407e388c2f9fa1998f6da031b9008f5ddf06d7a8fda6ee"} Nov 22 08:24:18 crc kubenswrapper[4743]: I1122 08:24:18.151144 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:18 crc kubenswrapper[4743]: I1122 08:24:18.151151 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:18 crc kubenswrapper[4743]: E1122 08:24:18.151424 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:18 crc kubenswrapper[4743]: E1122 08:24:18.151283 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:19 crc kubenswrapper[4743]: I1122 08:24:19.151007 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:19 crc kubenswrapper[4743]: E1122 08:24:19.151156 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:19 crc kubenswrapper[4743]: I1122 08:24:19.151008 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:19 crc kubenswrapper[4743]: E1122 08:24:19.151408 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:20 crc kubenswrapper[4743]: I1122 08:24:20.151208 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:20 crc kubenswrapper[4743]: I1122 08:24:20.151250 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:20 crc kubenswrapper[4743]: E1122 08:24:20.151337 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:20 crc kubenswrapper[4743]: E1122 08:24:20.151425 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:21 crc kubenswrapper[4743]: I1122 08:24:21.150618 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:21 crc kubenswrapper[4743]: I1122 08:24:21.150638 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:21 crc kubenswrapper[4743]: E1122 08:24:21.150743 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:21 crc kubenswrapper[4743]: E1122 08:24:21.150880 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:22 crc kubenswrapper[4743]: I1122 08:24:22.151367 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:22 crc kubenswrapper[4743]: I1122 08:24:22.151522 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:22 crc kubenswrapper[4743]: E1122 08:24:22.151686 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:22 crc kubenswrapper[4743]: E1122 08:24:22.151872 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:22 crc kubenswrapper[4743]: E1122 08:24:22.262828 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 08:24:23 crc kubenswrapper[4743]: I1122 08:24:23.151065 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:23 crc kubenswrapper[4743]: I1122 08:24:23.151061 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:23 crc kubenswrapper[4743]: E1122 08:24:23.151215 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:23 crc kubenswrapper[4743]: E1122 08:24:23.151420 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:24 crc kubenswrapper[4743]: I1122 08:24:24.150761 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:24 crc kubenswrapper[4743]: I1122 08:24:24.150957 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:24 crc kubenswrapper[4743]: E1122 08:24:24.150974 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:24 crc kubenswrapper[4743]: E1122 08:24:24.151229 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:25 crc kubenswrapper[4743]: I1122 08:24:25.151735 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:25 crc kubenswrapper[4743]: I1122 08:24:25.151820 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:25 crc kubenswrapper[4743]: E1122 08:24:25.151925 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:25 crc kubenswrapper[4743]: E1122 08:24:25.152113 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:26 crc kubenswrapper[4743]: I1122 08:24:26.151153 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:26 crc kubenswrapper[4743]: I1122 08:24:26.151153 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:26 crc kubenswrapper[4743]: E1122 08:24:26.152678 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:26 crc kubenswrapper[4743]: E1122 08:24:26.153603 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:26 crc kubenswrapper[4743]: I1122 08:24:26.155499 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:24:26 crc kubenswrapper[4743]: I1122 08:24:26.955004 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/3.log" Nov 22 08:24:26 crc kubenswrapper[4743]: I1122 08:24:26.958457 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerStarted","Data":"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472"} Nov 22 08:24:26 crc kubenswrapper[4743]: I1122 08:24:26.959667 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:24:26 crc kubenswrapper[4743]: I1122 08:24:26.994517 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podStartSLOduration=118.994497028 podStartE2EDuration="1m58.994497028s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:26.99424199 +0000 UTC m=+140.700603062" watchObservedRunningTime="2025-11-22 08:24:26.994497028 +0000 UTC m=+140.700858100" Nov 22 08:24:27 crc kubenswrapper[4743]: I1122 08:24:27.024552 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4vkc4"] Nov 22 08:24:27 crc kubenswrapper[4743]: I1122 08:24:27.024696 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:27 crc kubenswrapper[4743]: E1122 08:24:27.024805 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:27 crc kubenswrapper[4743]: I1122 08:24:27.151359 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:27 crc kubenswrapper[4743]: E1122 08:24:27.152747 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Nov 22 08:24:27 crc kubenswrapper[4743]: E1122 08:24:27.263396 4743 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Nov 22 08:24:28 crc kubenswrapper[4743]: I1122 08:24:28.150625 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4"
Nov 22 08:24:28 crc kubenswrapper[4743]: I1122 08:24:28.150705 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 08:24:28 crc kubenswrapper[4743]: I1122 08:24:28.150646 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 08:24:28 crc kubenswrapper[4743]: E1122 08:24:28.150806 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35"
Nov 22 08:24:28 crc kubenswrapper[4743]: E1122 08:24:28.150931 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 08:24:28 crc kubenswrapper[4743]: E1122 08:24:28.150992 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 08:24:29 crc kubenswrapper[4743]: I1122 08:24:29.151267 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 08:24:29 crc kubenswrapper[4743]: E1122 08:24:29.151432 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 08:24:30 crc kubenswrapper[4743]: I1122 08:24:30.151185 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Nov 22 08:24:30 crc kubenswrapper[4743]: I1122 08:24:30.151278 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 08:24:30 crc kubenswrapper[4743]: E1122 08:24:30.151781 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Nov 22 08:24:30 crc kubenswrapper[4743]: I1122 08:24:30.151290 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4"
Nov 22 08:24:30 crc kubenswrapper[4743]: E1122 08:24:30.151945 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Nov 22 08:24:30 crc kubenswrapper[4743]: E1122 08:24:30.152162 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35"
Nov 22 08:24:31 crc kubenswrapper[4743]: I1122 08:24:31.150713 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Nov 22 08:24:31 crc kubenswrapper[4743]: E1122 08:24:31.150850 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Nov 22 08:24:31 crc kubenswrapper[4743]: I1122 08:24:31.241313 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 08:24:31 crc kubenswrapper[4743]: I1122 08:24:31.241392 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 08:24:32 crc kubenswrapper[4743]: I1122 08:24:32.151111 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4"
Nov 22 08:24:32 crc kubenswrapper[4743]: I1122 08:24:32.151127 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Nov 22 08:24:32 crc kubenswrapper[4743]: I1122 08:24:32.151137 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:32 crc kubenswrapper[4743]: E1122 08:24:32.151416 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Nov 22 08:24:32 crc kubenswrapper[4743]: E1122 08:24:32.151459 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Nov 22 08:24:32 crc kubenswrapper[4743]: E1122 08:24:32.151255 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4vkc4" podUID="8426c723-9bfa-4856-b445-b01251484a35" Nov 22 08:24:33 crc kubenswrapper[4743]: I1122 08:24:33.151224 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:33 crc kubenswrapper[4743]: I1122 08:24:33.154738 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Nov 22 08:24:33 crc kubenswrapper[4743]: I1122 08:24:33.154817 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Nov 22 08:24:34 crc kubenswrapper[4743]: I1122 08:24:34.151463 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:34 crc kubenswrapper[4743]: I1122 08:24:34.151628 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4" Nov 22 08:24:34 crc kubenswrapper[4743]: I1122 08:24:34.151650 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:34 crc kubenswrapper[4743]: I1122 08:24:34.154178 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Nov 22 08:24:34 crc kubenswrapper[4743]: I1122 08:24:34.154335 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Nov 22 08:24:34 crc kubenswrapper[4743]: I1122 08:24:34.157269 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Nov 22 08:24:34 crc kubenswrapper[4743]: I1122 08:24:34.157332 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.177218 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.177369 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:36 crc kubenswrapper[4743]: E1122 08:24:36.177401 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:26:38.177375142 +0000 UTC m=+271.883736194 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.177446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.177491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.177518 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.178655 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.183495 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.183776 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.185000 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.271823 
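The "No retries permitted until ... (durationBeforeRetry 2m2s)" line above shows kubelet's per-volume retry backoff: each failed UnmountVolume attempt lengthens the wait before the next one, and the 2m2s here is the cap it has reached. A sketch of that policy under stated assumptions; only the 2m2s cap is taken from the log, while the starting delay and doubling factor are illustrative rather than copied from kubelet:

```go
// backoff.go - sketch of the retry behavior visible in the UnmountVolume
// error above: repeated failures roughly double the wait before the next
// attempt, capped at the "durationBeforeRetry 2m2s" the log reports.
package main

import (
	"fmt"
	"time"
)

func main() {
	const maxDelay = 2*time.Minute + 2*time.Second // cap seen in the log (2m2s)
	delay := 500 * time.Millisecond                // assumed starting point
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %d failed; no retries permitted for %v\n", attempt, delay)
		delay *= 2 // back off exponentially...
		if delay > maxDelay {
			delay = maxDelay // ...but never wait longer than the cap
		}
	}
}
```

The underlying failure ("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers") will keep recurring until that CSI driver registers with kubelet, so the operation sits at the capped interval until then.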
4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.279607 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.476475 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Nov 22 08:24:36 crc kubenswrapper[4743]: W1122 08:24:36.601634 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-9168ed4e690b1612c2848c5b8f7d0d60d501c5d23213998f71265c028be96450 WatchSource:0}: Error finding container 9168ed4e690b1612c2848c5b8f7d0d60d501c5d23213998f71265c028be96450: Status 404 returned error can't find the container with id 9168ed4e690b1612c2848c5b8f7d0d60d501c5d23213998f71265c028be96450 Nov 22 08:24:36 crc kubenswrapper[4743]: W1122 08:24:36.658700 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-2ed21511e4a3d43070b49c568e3c89d51812a20a961ac6f6ed0b27e1a3640598 WatchSource:0}: Error finding container 2ed21511e4a3d43070b49c568e3c89d51812a20a961ac6f6ed0b27e1a3640598: Status 404 returned error can't find the container with id 2ed21511e4a3d43070b49c568e3c89d51812a20a961ac6f6ed0b27e1a3640598 Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.846327 4743 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.886477 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m8pkf"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.887232 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.888664 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gwf6h"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.889355 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.889595 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8c5mq"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.890040 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.917533 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.919861 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.920842 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.923151 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-42kzd"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.925944 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.928330 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.938094 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.938302 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.938548 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.938725 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.938878 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.940777 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.940848 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.940968 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.943863 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.944159 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.944293 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.944383 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.944630 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.944853 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.944860 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.944909 4743 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.945038 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.945315 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.945447 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.945564 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.945386 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.945901 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.945999 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.945860 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.945801 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.947089 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.947639 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.945420 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.948320 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.950515 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6p2jm"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.951039 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.954918 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rmjqq"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.955388 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.955675 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-5tsjj"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.956266 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.956664 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.956920 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5tsjj" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.959881 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.960605 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.960705 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.960890 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.960975 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.961010 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.961125 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.961199 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.961273 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.961432 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.961494 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.960621 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.961736 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.961939 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.962182 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.962341 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.962755 4743 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.962999 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.963128 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.963287 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.964314 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.965111 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.969079 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fwvd8"] Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.969986 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.970639 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.978963 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.979829 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.982711 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.983193 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.983257 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.983373 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.983450 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.984633 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Nov 22 08:24:36 crc kubenswrapper[4743]: I1122 08:24:36.985082 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.009811 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Nov 22 08:24:37 crc 
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.013626 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.013658 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.013801 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.013863 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.013934 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.014000 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.014060 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.014313 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.014668 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.014731 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.014910 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.015086 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.015394 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.015485 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.016086 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.016204 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.016403 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.016433 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.019978 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.020105 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.043648 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.043829 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.047317 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hhpxp"]
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.047893 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t"]
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.048099 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z"]
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.048365 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.049633 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hhpxp"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.049830 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmvlb\" (UniqueName: \"kubernetes.io/projected/fa343393-45bb-4bde-a13b-1686db2c1979-kube-api-access-hmvlb\") pod \"openshift-apiserver-operator-796bbdcf4f-cq9j5\" (UID: \"fa343393-45bb-4bde-a13b-1686db2c1979\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050671 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7352b0d1-4af7-49c9-8029-5a97c3cdf450-serving-cert\") pod \"openshift-config-operator-7777fb866f-wtpmb\" (UID: \"7352b0d1-4af7-49c9-8029-5a97c3cdf450\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050693 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73592dc4-d2b3-42f7-9bec-346286516f23-config\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050724 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b72706e5-e53f-4c1f-81fa-6b850a062076-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b72706e5-e53f-4c1f-81fa-6b850a062076-serving-cert\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050756 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050770 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-audit\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050792 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5zbj\" (UniqueName: \"kubernetes.io/projected/9e14fb50-5723-489d-acc2-c5ca42234b73-kube-api-access-j5zbj\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050810 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73592dc4-d2b3-42f7-9bec-346286516f23-serving-cert\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-client-ca\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050849 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-config\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050898 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b72706e5-e53f-4c1f-81fa-6b850a062076-audit-policies\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050922 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050943 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65df5f7-3f27-422f-acde-42d3029cd963-config\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050962 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b65df5f7-3f27-422f-acde-42d3029cd963-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050980 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7352b0d1-4af7-49c9-8029-5a97c3cdf450-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wtpmb\" (UID: \"7352b0d1-4af7-49c9-8029-5a97c3cdf450\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.050998 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-etcd-client\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051021 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051040 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b72706e5-e53f-4c1f-81fa-6b850a062076-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051059 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-serving-cert\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/16da58ac-db10-4cb2-a7e9-330ac883a480-machine-approver-tls\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051109 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b40a4ea-3227-449c-b318-8e47b7eeefc4-config\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051166 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfms2\" (UniqueName: \"kubernetes.io/projected/b72706e5-e53f-4c1f-81fa-6b850a062076-kube-api-access-gfms2\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-config\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-encryption-config\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh7hp\" (UniqueName: \"kubernetes.io/projected/7352b0d1-4af7-49c9-8029-5a97c3cdf450-kube-api-access-bh7hp\") pod \"openshift-config-operator-7777fb866f-wtpmb\" (UID: \"7352b0d1-4af7-49c9-8029-5a97c3cdf450\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051245 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-etcd-serving-ca\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051267 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-dir\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2jlt\" (UniqueName: \"kubernetes.io/projected/506d451b-5cf3-44fe-be73-9d43abbbf9a8-kube-api-access-g2jlt\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051332 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-policies\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051356 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-audit-dir\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051424 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73592dc4-d2b3-42f7-9bec-346286516f23-trusted-ca\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f63f00-6812-4f35-ba1e-d1ea01a27a19-serving-cert\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051467 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/16da58ac-db10-4cb2-a7e9-330ac883a480-auth-proxy-config\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051488 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16da58ac-db10-4cb2-a7e9-330ac883a480-config\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051508 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b65df5f7-3f27-422f-acde-42d3029cd963-service-ca-bundle\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051524 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051543 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051563 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa343393-45bb-4bde-a13b-1686db2c1979-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cq9j5\" (UID: \"fa343393-45bb-4bde-a13b-1686db2c1979\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/506d451b-5cf3-44fe-be73-9d43abbbf9a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051618 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b40a4ea-3227-449c-b318-8e47b7eeefc4-etcd-service-ca\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051633 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa343393-45bb-4bde-a13b-1686db2c1979-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cq9j5\" (UID: \"fa343393-45bb-4bde-a13b-1686db2c1979\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5"
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051647 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8chf\" (UniqueName: \"kubernetes.io/projected/73592dc4-d2b3-42f7-9bec-346286516f23-kube-api-access-s8chf\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq"
Nov 22 08:24:37 crc
kubenswrapper[4743]: I1122 08:24:37.051666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051681 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b65df5f7-3f27-422f-acde-42d3029cd963-serving-cert\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051695 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkxdr\" (UniqueName: \"kubernetes.io/projected/71f63f00-6812-4f35-ba1e-d1ea01a27a19-kube-api-access-rkxdr\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051711 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/352175cc-5065-4a1f-ac24-7ae82d39b87d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-68cdt\" (UID: \"352175cc-5065-4a1f-ac24-7ae82d39b87d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-image-import-ca\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051744 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h2cz\" (UniqueName: \"kubernetes.io/projected/352175cc-5065-4a1f-ac24-7ae82d39b87d-kube-api-access-9h2cz\") pod \"cluster-samples-operator-665b6dd947-68cdt\" (UID: \"352175cc-5065-4a1f-ac24-7ae82d39b87d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47bwd\" (UniqueName: \"kubernetes.io/projected/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-kube-api-access-47bwd\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: 
\"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051792 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b40a4ea-3227-449c-b318-8e47b7eeefc4-serving-cert\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051810 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/506d451b-5cf3-44fe-be73-9d43abbbf9a8-images\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051830 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051851 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1b40a4ea-3227-449c-b318-8e47b7eeefc4-etcd-ca\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051870 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqgdj\" (UniqueName: \"kubernetes.io/projected/1b40a4ea-3227-449c-b318-8e47b7eeefc4-kube-api-access-sqgdj\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051888 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rft9m\" (UniqueName: \"kubernetes.io/projected/16da58ac-db10-4cb2-a7e9-330ac883a480-kube-api-access-rft9m\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051904 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b40a4ea-3227-449c-b318-8e47b7eeefc4-etcd-client\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051917 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b72706e5-e53f-4c1f-81fa-6b850a062076-etcd-client\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" 
Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051931 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqsvn\" (UniqueName: \"kubernetes.io/projected/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-kube-api-access-kqsvn\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.051969 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.052000 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.052020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506d451b-5cf3-44fe-be73-9d43abbbf9a8-config\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.052150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrlqn\" (UniqueName: \"kubernetes.io/projected/01b809b1-7b62-4043-9411-7194d6e96e47-kube-api-access-wrlqn\") pod \"downloads-7954f5f757-5tsjj\" (UID: \"01b809b1-7b62-4043-9411-7194d6e96e47\") " pod="openshift-console/downloads-7954f5f757-5tsjj" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.052170 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2lz4\" (UniqueName: \"kubernetes.io/projected/b65df5f7-3f27-422f-acde-42d3029cd963-kube-api-access-t2lz4\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.052184 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-node-pullsecrets\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " 
pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.052215 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b72706e5-e53f-4c1f-81fa-6b850a062076-encryption-config\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.052240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b72706e5-e53f-4c1f-81fa-6b850a062076-audit-dir\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.053069 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.054379 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.054543 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.054456 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.055058 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.055226 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.058893 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m8pkf"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.059057 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.059180 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.059741 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.060216 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.061022 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.061355 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.063054 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.066413 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5kn9v"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.067043 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-592fz"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.067392 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8c5mq"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.067483 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.068089 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.070331 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6p2jm"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.071275 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.071493 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.071282 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.072156 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.072289 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.072478 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.072687 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.072772 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.073758 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.073173 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Nov 22 08:24:37 crc 
kubenswrapper[4743]: I1122 08:24:37.073261 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.073295 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.073324 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.073357 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.073382 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.074516 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.075917 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.085551 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.090437 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.091036 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.091076 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.091264 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qst5g"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.091531 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.091873 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.097317 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"139be40417a221a5b34eb9fe41a0938e4f8fc2709b8e29d82906bf85ab06b31d"} Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.097406 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.098201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"fcd0bb37874eed3462f344252a5c767170bcaf2e233f30850cf0c3596f63deaa"} Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.098223 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b24cc1ffe8b9988be37cdebb74b156bebfd9fdbc1af82f6bc7fd5439c9b8b3c2"} Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.098241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9168ed4e690b1612c2848c5b8f7d0d60d501c5d23213998f71265c028be96450"} Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.098283 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2ed21511e4a3d43070b49c568e3c89d51812a20a961ac6f6ed0b27e1a3640598"} Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.098964 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.099045 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.099176 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.101227 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.101911 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.102121 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9fhcv"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.103543 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.103605 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.104943 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.105322 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.105493 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.107640 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.108528 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.112631 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j5nkm"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.113604 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.113613 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dm8fj"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.114178 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.115464 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.117839 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.118906 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.120660 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.121814 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.122049 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.122982 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.126491 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.129237 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.129749 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-42kzd"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.130735 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.130856 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.131569 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.133520 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rmjqq"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.135598 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-v99cs"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.136161 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.136303 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-v99cs" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.136800 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.137600 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.138333 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fwvd8"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.139356 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gwf6h"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.140270 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sb8ph"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.141011 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sb8ph" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.141590 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.143939 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.143972 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j5nkm"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.145903 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5tsjj"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.149815 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.150657 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.153540 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7l7p\" (UniqueName: \"kubernetes.io/projected/3a221ced-5d72-41e5-8b49-d93ec52d53f5-kube-api-access-v7l7p\") pod \"catalog-operator-68c6474976-r6rj6\" (UID: \"3a221ced-5d72-41e5-8b49-d93ec52d53f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.153592 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.153622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b40a4ea-3227-449c-b318-8e47b7eeefc4-serving-cert\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.153647 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-client-ca\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.153673 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/506d451b-5cf3-44fe-be73-9d43abbbf9a8-images\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.153770 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xfqkz\" (UniqueName: \"kubernetes.io/projected/28f586ec-7a65-4c1e-9f09-845b812246b0-kube-api-access-xfqkz\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.153834 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.153859 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1b40a4ea-3227-449c-b318-8e47b7eeefc4-etcd-ca\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.153882 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqgdj\" (UniqueName: \"kubernetes.io/projected/1b40a4ea-3227-449c-b318-8e47b7eeefc4-kube-api-access-sqgdj\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.153971 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0adcbd21-d1d6-4beb-8b26-74d16b534b91-config\") pod \"kube-apiserver-operator-766d6c64bb-qrt7s\" (UID: \"0adcbd21-d1d6-4beb-8b26-74d16b534b91\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154001 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rft9m\" (UniqueName: \"kubernetes.io/projected/16da58ac-db10-4cb2-a7e9-330ac883a480-kube-api-access-rft9m\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154022 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b40a4ea-3227-449c-b318-8e47b7eeefc4-etcd-client\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154048 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b72706e5-e53f-4c1f-81fa-6b850a062076-etcd-client\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154127 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-image-registry-operator-tls\") pod 
\"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154195 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqsvn\" (UniqueName: \"kubernetes.io/projected/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-kube-api-access-kqsvn\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154223 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjx28\" (UniqueName: \"kubernetes.io/projected/5564388b-e6dd-409f-a137-b34700967f4a-kube-api-access-fjx28\") pod \"collect-profiles-29396655-zxpkt\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154353 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154815 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506d451b-5cf3-44fe-be73-9d43abbbf9a8-config\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrlqn\" (UniqueName: \"kubernetes.io/projected/01b809b1-7b62-4043-9411-7194d6e96e47-kube-api-access-wrlqn\") pod \"downloads-7954f5f757-5tsjj\" (UID: \"01b809b1-7b62-4043-9411-7194d6e96e47\") " pod="openshift-console/downloads-7954f5f757-5tsjj" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154887 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff02729-0197-4f76-b43e-594c908f8312-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6r56v\" (UID: \"aff02729-0197-4f76-b43e-594c908f8312\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-dm8fj\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154937 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2lz4\" (UniqueName: \"kubernetes.io/projected/b65df5f7-3f27-422f-acde-42d3029cd963-kube-api-access-t2lz4\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154963 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-node-pullsecrets\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154998 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbdf845-aa39-41ed-a45c-75ac4ba8e894-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9fhcv\" (UID: \"2bbdf845-aa39-41ed-a45c-75ac4ba8e894\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnpd2\" (UniqueName: \"kubernetes.io/projected/07f6c2e0-4230-40e0-ad71-2f652546cd38-kube-api-access-cnpd2\") pod \"marketplace-operator-79b997595-dm8fj\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155047 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabe2e04-2ba1-4719-a4e9-8e763c6e3659-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dgk7t\" (UID: \"aabe2e04-2ba1-4719-a4e9-8e763c6e3659\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155069 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aff02729-0197-4f76-b43e-594c908f8312-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6r56v\" (UID: \"aff02729-0197-4f76-b43e-594c908f8312\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155094 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b72706e5-e53f-4c1f-81fa-6b850a062076-encryption-config\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155118 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b72706e5-e53f-4c1f-81fa-6b850a062076-audit-dir\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: 
\"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155142 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmvlb\" (UniqueName: \"kubernetes.io/projected/fa343393-45bb-4bde-a13b-1686db2c1979-kube-api-access-hmvlb\") pod \"openshift-apiserver-operator-796bbdcf4f-cq9j5\" (UID: \"fa343393-45bb-4bde-a13b-1686db2c1979\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155165 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7352b0d1-4af7-49c9-8029-5a97c3cdf450-serving-cert\") pod \"openshift-config-operator-7777fb866f-wtpmb\" (UID: \"7352b0d1-4af7-49c9-8029-5a97c3cdf450\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73592dc4-d2b3-42f7-9bec-346286516f23-config\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155214 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3a221ced-5d72-41e5-8b49-d93ec52d53f5-profile-collector-cert\") pod \"catalog-operator-68c6474976-r6rj6\" (UID: \"3a221ced-5d72-41e5-8b49-d93ec52d53f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155249 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/193a50c2-2643-4267-b722-a92cc83d3d40-serving-cert\") pod \"service-ca-operator-777779d784-qst5g\" (UID: \"193a50c2-2643-4267-b722-a92cc83d3d40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155272 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aff02729-0197-4f76-b43e-594c908f8312-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6r56v\" (UID: \"aff02729-0197-4f76-b43e-594c908f8312\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-oauth-serving-cert\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155321 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b72706e5-e53f-4c1f-81fa-6b850a062076-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b72706e5-e53f-4c1f-81fa-6b850a062076-serving-cert\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-audit\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155405 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dm8fj\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155433 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5zbj\" (UniqueName: \"kubernetes.io/projected/9e14fb50-5723-489d-acc2-c5ca42234b73-kube-api-access-j5zbj\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155454 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73592dc4-d2b3-42f7-9bec-346286516f23-serving-cert\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155509 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-service-ca\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155561 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-client-ca\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155600 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-config\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155649 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-console-config\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155671 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b72706e5-e53f-4c1f-81fa-6b850a062076-audit-policies\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-trusted-ca-bundle\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155740 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28f586ec-7a65-4c1e-9f09-845b812246b0-serving-cert\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nslhc\" (UniqueName: \"kubernetes.io/projected/1eb0462a-95bc-435f-825a-59fc93898e81-kube-api-access-nslhc\") pod \"ingress-canary-sb8ph\" (UID: \"1eb0462a-95bc-435f-825a-59fc93898e81\") " pod="openshift-ingress-canary/ingress-canary-sb8ph" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155796 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0adcbd21-d1d6-4beb-8b26-74d16b534b91-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qrt7s\" (UID: 
\"0adcbd21-d1d6-4beb-8b26-74d16b534b91\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155818 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65df5f7-3f27-422f-acde-42d3029cd963-config\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155869 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b65df5f7-3f27-422f-acde-42d3029cd963-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155896 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7352b0d1-4af7-49c9-8029-5a97c3cdf450-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wtpmb\" (UID: \"7352b0d1-4af7-49c9-8029-5a97c3cdf450\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155916 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-etcd-client\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155963 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b72706e5-e53f-4c1f-81fa-6b850a062076-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.155984 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-serving-cert\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156006 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-serving-cert\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156028 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/16da58ac-db10-4cb2-a7e9-330ac883a480-machine-approver-tls\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156049 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmkzd\" (UniqueName: \"kubernetes.io/projected/aabe2e04-2ba1-4719-a4e9-8e763c6e3659-kube-api-access-vmkzd\") pod \"openshift-controller-manager-operator-756b6f6bc6-dgk7t\" (UID: \"aabe2e04-2ba1-4719-a4e9-8e763c6e3659\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156074 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8974\" (UniqueName: \"kubernetes.io/projected/2bbdf845-aa39-41ed-a45c-75ac4ba8e894-kube-api-access-c8974\") pod \"multus-admission-controller-857f4d67dd-9fhcv\" (UID: \"2bbdf845-aa39-41ed-a45c-75ac4ba8e894\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156123 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b40a4ea-3227-449c-b318-8e47b7eeefc4-config\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfms2\" (UniqueName: \"kubernetes.io/projected/b72706e5-e53f-4c1f-81fa-6b850a062076-kube-api-access-gfms2\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-config\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-encryption-config\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh7hp\" (UniqueName: 
\"kubernetes.io/projected/7352b0d1-4af7-49c9-8029-5a97c3cdf450-kube-api-access-bh7hp\") pod \"openshift-config-operator-7777fb866f-wtpmb\" (UID: \"7352b0d1-4af7-49c9-8029-5a97c3cdf450\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156235 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-etcd-serving-ca\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156258 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aabe2e04-2ba1-4719-a4e9-8e763c6e3659-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dgk7t\" (UID: \"aabe2e04-2ba1-4719-a4e9-8e763c6e3659\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156284 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-dir\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156309 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2jlt\" (UniqueName: \"kubernetes.io/projected/506d451b-5cf3-44fe-be73-9d43abbbf9a8-kube-api-access-g2jlt\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156343 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-592fz"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156354 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1eb0462a-95bc-435f-825a-59fc93898e81-cert\") pod \"ingress-canary-sb8ph\" (UID: \"1eb0462a-95bc-435f-825a-59fc93898e81\") " pod="openshift-ingress-canary/ingress-canary-sb8ph" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-policies\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156483 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156523 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-audit-dir\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzv4j\" (UniqueName: \"kubernetes.io/projected/193a50c2-2643-4267-b722-a92cc83d3d40-kube-api-access-kzv4j\") pod \"service-ca-operator-777779d784-qst5g\" (UID: \"193a50c2-2643-4267-b722-a92cc83d3d40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156591 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.156642 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73592dc4-d2b3-42f7-9bec-346286516f23-trusted-ca\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.154764 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1b40a4ea-3227-449c-b318-8e47b7eeefc4-etcd-ca\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.157951 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-node-pullsecrets\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.158406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-dir\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.158464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.159009 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/506d451b-5cf3-44fe-be73-9d43abbbf9a8-images\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.159034 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.159128 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.159301 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b40a4ea-3227-449c-b318-8e47b7eeefc4-serving-cert\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.159558 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.159823 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-policies\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.159858 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.159992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73592dc4-d2b3-42f7-9bec-346286516f23-config\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160009 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b65df5f7-3f27-422f-acde-42d3029cd963-config\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-oauth-config\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f63f00-6812-4f35-ba1e-d1ea01a27a19-serving-cert\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160130 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/16da58ac-db10-4cb2-a7e9-330ac883a480-auth-proxy-config\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-config\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160168 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/193a50c2-2643-4267-b722-a92cc83d3d40-config\") pod \"service-ca-operator-777779d784-qst5g\" (UID: \"193a50c2-2643-4267-b722-a92cc83d3d40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160196 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5564388b-e6dd-409f-a137-b34700967f4a-secret-volume\") pod \"collect-profiles-29396655-zxpkt\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160227 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16da58ac-db10-4cb2-a7e9-330ac883a480-config\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160252 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b65df5f7-3f27-422f-acde-42d3029cd963-service-ca-bundle\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160278 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160302 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5564388b-e6dd-409f-a137-b34700967f4a-config-volume\") pod \"collect-profiles-29396655-zxpkt\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160318 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-etcd-serving-ca\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160335 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160362 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa343393-45bb-4bde-a13b-1686db2c1979-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cq9j5\" (UID: \"fa343393-45bb-4bde-a13b-1686db2c1979\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/506d451b-5cf3-44fe-be73-9d43abbbf9a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160553 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b40a4ea-3227-449c-b318-8e47b7eeefc4-etcd-service-ca\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160644 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b72706e5-e53f-4c1f-81fa-6b850a062076-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160670 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa343393-45bb-4bde-a13b-1686db2c1979-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cq9j5\" (UID: \"fa343393-45bb-4bde-a13b-1686db2c1979\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160704 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8chf\" (UniqueName: \"kubernetes.io/projected/73592dc4-d2b3-42f7-9bec-346286516f23-kube-api-access-s8chf\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160740 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfxwr\" (UniqueName: \"kubernetes.io/projected/49cb4faf-6b21-4097-8f6c-24a310cff149-kube-api-access-xfxwr\") pod \"migrator-59844c95c7-gnsjl\" (UID: \"49cb4faf-6b21-4097-8f6c-24a310cff149\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160766 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-config\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160793 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160819 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4962z\" (UniqueName: \"kubernetes.io/projected/bead015e-e8e8-44f2-8dae-41047cd66706-kube-api-access-4962z\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160846 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b65df5f7-3f27-422f-acde-42d3029cd963-serving-cert\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160871 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkxdr\" (UniqueName: \"kubernetes.io/projected/71f63f00-6812-4f35-ba1e-d1ea01a27a19-kube-api-access-rkxdr\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160896 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/352175cc-5065-4a1f-ac24-7ae82d39b87d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-68cdt\" (UID: \"352175cc-5065-4a1f-ac24-7ae82d39b87d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0adcbd21-d1d6-4beb-8b26-74d16b534b91-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qrt7s\" (UID: \"0adcbd21-d1d6-4beb-8b26-74d16b534b91\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160962 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3a221ced-5d72-41e5-8b49-d93ec52d53f5-srv-cert\") pod \"catalog-operator-68c6474976-r6rj6\" (UID: \"3a221ced-5d72-41e5-8b49-d93ec52d53f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.160990 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-image-import-ca\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.161054 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b40a4ea-3227-449c-b318-8e47b7eeefc4-config\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.161146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2cz\" (UniqueName: \"kubernetes.io/projected/352175cc-5065-4a1f-ac24-7ae82d39b87d-kube-api-access-9h2cz\") pod \"cluster-samples-operator-665b6dd947-68cdt\" (UID: \"352175cc-5065-4a1f-ac24-7ae82d39b87d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.161182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47bwd\" (UniqueName: \"kubernetes.io/projected/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-kube-api-access-47bwd\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.161313 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.161364 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.161584 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.162128 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16da58ac-db10-4cb2-a7e9-330ac883a480-config\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.162236 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-audit\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.162391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.162395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/16da58ac-db10-4cb2-a7e9-330ac883a480-auth-proxy-config\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.162719 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5kn9v"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.162618 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7352b0d1-4af7-49c9-8029-5a97c3cdf450-available-featuregates\") pod \"openshift-config-operator-7777fb866f-wtpmb\" (UID: \"7352b0d1-4af7-49c9-8029-5a97c3cdf450\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.162804 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b65df5f7-3f27-422f-acde-42d3029cd963-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.163156 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9fhcv"] Nov 22 08:24:37 crc 
kubenswrapper[4743]: I1122 08:24:37.163340 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-config\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.163362 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-audit-dir\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.163381 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b72706e5-e53f-4c1f-81fa-6b850a062076-audit-dir\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.163442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506d451b-5cf3-44fe-be73-9d43abbbf9a8-config\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.163600 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b72706e5-e53f-4c1f-81fa-6b850a062076-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.163918 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7352b0d1-4af7-49c9-8029-5a97c3cdf450-serving-cert\") pod \"openshift-config-operator-7777fb866f-wtpmb\" (UID: \"7352b0d1-4af7-49c9-8029-5a97c3cdf450\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.164179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa343393-45bb-4bde-a13b-1686db2c1979-config\") pod \"openshift-apiserver-operator-796bbdcf4f-cq9j5\" (UID: \"fa343393-45bb-4bde-a13b-1686db2c1979\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.164910 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b65df5f7-3f27-422f-acde-42d3029cd963-service-ca-bundle\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.165095 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/73592dc4-d2b3-42f7-9bec-346286516f23-trusted-ca\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " 
pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.165242 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-image-import-ca\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.165254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1b40a4ea-3227-449c-b318-8e47b7eeefc4-etcd-client\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.165464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.165512 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b72706e5-e53f-4c1f-81fa-6b850a062076-serving-cert\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.165880 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-client-ca\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.165946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b72706e5-e53f-4c1f-81fa-6b850a062076-audit-policies\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.166300 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.166828 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.167311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.167311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-encryption-config\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.167823 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b72706e5-e53f-4c1f-81fa-6b850a062076-etcd-client\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.168188 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/506d451b-5cf3-44fe-be73-9d43abbbf9a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.168354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/352175cc-5065-4a1f-ac24-7ae82d39b87d-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-68cdt\" (UID: \"352175cc-5065-4a1f-ac24-7ae82d39b87d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.168657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-serving-cert\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.168957 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-etcd-client\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.168991 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.169296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fa343393-45bb-4bde-a13b-1686db2c1979-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-cq9j5\" (UID: \"fa343393-45bb-4bde-a13b-1686db2c1979\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" Nov 22 08:24:37 crc kubenswrapper[4743]: 
I1122 08:24:37.169387 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73592dc4-d2b3-42f7-9bec-346286516f23-serving-cert\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.169500 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f63f00-6812-4f35-ba1e-d1ea01a27a19-serving-cert\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.169532 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.170090 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.170522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b72706e5-e53f-4c1f-81fa-6b850a062076-encryption-config\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.171354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.171398 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qst5g"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.172329 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1b40a4ea-3227-449c-b318-8e47b7eeefc4-etcd-service-ca\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.175091 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hhpxp"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.178174 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.178636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b65df5f7-3f27-422f-acde-42d3029cd963-serving-cert\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.182408 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.183944 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.185252 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.186853 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dm8fj"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.188751 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.190127 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.191833 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.193510 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.194549 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.195559 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.196586 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.197730 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.198660 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.199663 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.200761 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-fbhcz"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.202652 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/16da58ac-db10-4cb2-a7e9-330ac883a480-machine-approver-tls\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.203136 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fbhcz" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.203483 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-858dv"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.205867 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sb8ph"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.206007 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.208054 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-858dv"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.209883 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.211076 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qvtpl"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.212145 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qvtpl" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.216387 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qvtpl"] Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.231366 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.250454 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-service-ca\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262373 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-console-config\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262403 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0adcbd21-d1d6-4beb-8b26-74d16b534b91-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qrt7s\" (UID: \"0adcbd21-d1d6-4beb-8b26-74d16b534b91\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262424 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-trusted-ca-bundle\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262445 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28f586ec-7a65-4c1e-9f09-845b812246b0-serving-cert\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262467 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nslhc\" (UniqueName: \"kubernetes.io/projected/1eb0462a-95bc-435f-825a-59fc93898e81-kube-api-access-nslhc\") pod \"ingress-canary-sb8ph\" (UID: \"1eb0462a-95bc-435f-825a-59fc93898e81\") " pod="openshift-ingress-canary/ingress-canary-sb8ph" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262490 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-serving-cert\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmkzd\" (UniqueName: \"kubernetes.io/projected/aabe2e04-2ba1-4719-a4e9-8e763c6e3659-kube-api-access-vmkzd\") pod \"openshift-controller-manager-operator-756b6f6bc6-dgk7t\" (UID: \"aabe2e04-2ba1-4719-a4e9-8e763c6e3659\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8974\" (UniqueName: \"kubernetes.io/projected/2bbdf845-aa39-41ed-a45c-75ac4ba8e894-kube-api-access-c8974\") pod \"multus-admission-controller-857f4d67dd-9fhcv\" (UID: \"2bbdf845-aa39-41ed-a45c-75ac4ba8e894\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262601 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aabe2e04-2ba1-4719-a4e9-8e763c6e3659-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dgk7t\" (UID: \"aabe2e04-2ba1-4719-a4e9-8e763c6e3659\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1eb0462a-95bc-435f-825a-59fc93898e81-cert\") pod \"ingress-canary-sb8ph\" (UID: \"1eb0462a-95bc-435f-825a-59fc93898e81\") " pod="openshift-ingress-canary/ingress-canary-sb8ph" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262655 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzv4j\" (UniqueName: \"kubernetes.io/projected/193a50c2-2643-4267-b722-a92cc83d3d40-kube-api-access-kzv4j\") pod \"service-ca-operator-777779d784-qst5g\" (UID: \"193a50c2-2643-4267-b722-a92cc83d3d40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262678 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-oauth-config\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262700 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/193a50c2-2643-4267-b722-a92cc83d3d40-config\") pod \"service-ca-operator-777779d784-qst5g\" (UID: \"193a50c2-2643-4267-b722-a92cc83d3d40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262720 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5564388b-e6dd-409f-a137-b34700967f4a-secret-volume\") pod \"collect-profiles-29396655-zxpkt\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262750 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5564388b-e6dd-409f-a137-b34700967f4a-config-volume\") pod \"collect-profiles-29396655-zxpkt\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262783 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-config\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.262817 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfxwr\" (UniqueName: \"kubernetes.io/projected/49cb4faf-6b21-4097-8f6c-24a310cff149-kube-api-access-xfxwr\") pod \"migrator-59844c95c7-gnsjl\" (UID: \"49cb4faf-6b21-4097-8f6c-24a310cff149\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264119 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-console-config\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4962z\" (UniqueName: \"kubernetes.io/projected/bead015e-e8e8-44f2-8dae-41047cd66706-kube-api-access-4962z\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264352 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-trusted-ca-bundle\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264457 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3a221ced-5d72-41e5-8b49-d93ec52d53f5-srv-cert\") pod \"catalog-operator-68c6474976-r6rj6\" (UID: \"3a221ced-5d72-41e5-8b49-d93ec52d53f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264487 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0adcbd21-d1d6-4beb-8b26-74d16b534b91-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qrt7s\" (UID: \"0adcbd21-d1d6-4beb-8b26-74d16b534b91\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264616 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7l7p\" (UniqueName: \"kubernetes.io/projected/3a221ced-5d72-41e5-8b49-d93ec52d53f5-kube-api-access-v7l7p\") pod \"catalog-operator-68c6474976-r6rj6\" (UID: \"3a221ced-5d72-41e5-8b49-d93ec52d53f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264658 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-client-ca\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264705 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0adcbd21-d1d6-4beb-8b26-74d16b534b91-config\") pod \"kube-apiserver-operator-766d6c64bb-qrt7s\" (UID: \"0adcbd21-d1d6-4beb-8b26-74d16b534b91\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfqkz\" (UniqueName: \"kubernetes.io/projected/28f586ec-7a65-4c1e-9f09-845b812246b0-kube-api-access-xfqkz\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264813 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjx28\" (UniqueName: \"kubernetes.io/projected/5564388b-e6dd-409f-a137-b34700967f4a-kube-api-access-fjx28\") pod \"collect-profiles-29396655-zxpkt\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dm8fj\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/aff02729-0197-4f76-b43e-594c908f8312-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6r56v\" (UID: \"aff02729-0197-4f76-b43e-594c908f8312\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264925 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-config\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbdf845-aa39-41ed-a45c-75ac4ba8e894-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9fhcv\" (UID: \"2bbdf845-aa39-41ed-a45c-75ac4ba8e894\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.264988 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnpd2\" (UniqueName: \"kubernetes.io/projected/07f6c2e0-4230-40e0-ad71-2f652546cd38-kube-api-access-cnpd2\") pod \"marketplace-operator-79b997595-dm8fj\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.265036 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabe2e04-2ba1-4719-a4e9-8e763c6e3659-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dgk7t\" (UID: \"aabe2e04-2ba1-4719-a4e9-8e763c6e3659\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.265143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-service-ca\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.265637 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-client-ca\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.266425 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aabe2e04-2ba1-4719-a4e9-8e763c6e3659-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-dgk7t\" (UID: \"aabe2e04-2ba1-4719-a4e9-8e763c6e3659\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.266467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-oauth-config\") pod \"console-f9d7485db-hhpxp\" (UID: 
\"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.265077 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aff02729-0197-4f76-b43e-594c908f8312-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6r56v\" (UID: \"aff02729-0197-4f76-b43e-594c908f8312\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.266613 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3a221ced-5d72-41e5-8b49-d93ec52d53f5-profile-collector-cert\") pod \"catalog-operator-68c6474976-r6rj6\" (UID: \"3a221ced-5d72-41e5-8b49-d93ec52d53f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.266641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/193a50c2-2643-4267-b722-a92cc83d3d40-serving-cert\") pod \"service-ca-operator-777779d784-qst5g\" (UID: \"193a50c2-2643-4267-b722-a92cc83d3d40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.266661 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aff02729-0197-4f76-b43e-594c908f8312-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6r56v\" (UID: \"aff02729-0197-4f76-b43e-594c908f8312\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.266678 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-oauth-serving-cert\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.266706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dm8fj\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.267033 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-serving-cert\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.267391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28f586ec-7a65-4c1e-9f09-845b812246b0-serving-cert\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 
08:24:37.267497 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-oauth-serving-cert\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.268974 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aabe2e04-2ba1-4719-a4e9-8e763c6e3659-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-dgk7t\" (UID: \"aabe2e04-2ba1-4719-a4e9-8e763c6e3659\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.269812 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.269997 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0adcbd21-d1d6-4beb-8b26-74d16b534b91-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qrt7s\" (UID: \"0adcbd21-d1d6-4beb-8b26-74d16b534b91\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.290056 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.309929 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.329709 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.351246 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.370249 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.389523 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.409212 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.429236 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.449554 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.469863 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.489945 4743 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.496752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0adcbd21-d1d6-4beb-8b26-74d16b534b91-config\") pod \"kube-apiserver-operator-766d6c64bb-qrt7s\" (UID: \"0adcbd21-d1d6-4beb-8b26-74d16b534b91\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.530048 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.550162 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.570438 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.589419 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.610661 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.630456 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.636798 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5564388b-e6dd-409f-a137-b34700967f4a-secret-volume\") pod \"collect-profiles-29396655-zxpkt\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.639617 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3a221ced-5d72-41e5-8b49-d93ec52d53f5-profile-collector-cert\") pod \"catalog-operator-68c6474976-r6rj6\" (UID: \"3a221ced-5d72-41e5-8b49-d93ec52d53f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.650853 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.655499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5564388b-e6dd-409f-a137-b34700967f4a-config-volume\") pod \"collect-profiles-29396655-zxpkt\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.669852 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.690428 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Nov 22 08:24:37 crc 
kubenswrapper[4743]: I1122 08:24:37.709463 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.730456 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.740505 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/193a50c2-2643-4267-b722-a92cc83d3d40-serving-cert\") pod \"service-ca-operator-777779d784-qst5g\" (UID: \"193a50c2-2643-4267-b722-a92cc83d3d40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.750795 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.770879 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.775047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/193a50c2-2643-4267-b722-a92cc83d3d40-config\") pod \"service-ca-operator-777779d784-qst5g\" (UID: \"193a50c2-2643-4267-b722-a92cc83d3d40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.789715 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.809317 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.829755 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.849517 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.870609 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.889250 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.898152 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2bbdf845-aa39-41ed-a45c-75ac4ba8e894-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-9fhcv\" (UID: \"2bbdf845-aa39-41ed-a45c-75ac4ba8e894\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.910288 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.930673 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Nov 22 08:24:37 crc 
kubenswrapper[4743]: I1122 08:24:37.967299 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.969479 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Nov 22 08:24:37 crc kubenswrapper[4743]: I1122 08:24:37.990668 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.011151 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.016087 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aff02729-0197-4f76-b43e-594c908f8312-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6r56v\" (UID: \"aff02729-0197-4f76-b43e-594c908f8312\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.030249 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.039271 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aff02729-0197-4f76-b43e-594c908f8312-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6r56v\" (UID: \"aff02729-0197-4f76-b43e-594c908f8312\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.051284 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.070315 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.089981 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.104194 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ecdec6ec70cabf5b9a106db2373d0c5d74fabf69503793954d0ca886cd5ad53a"} Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.111791 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.127991 4743 request.go:700] Waited for 1.013509338s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-service-ca/configmaps?fieldSelector=metadata.name%3Dsigning-cabundle&limit=500&resourceVersion=0 Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.130723 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.150948 4743 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.170538 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.190671 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.199889 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dm8fj\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.216642 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.227222 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dm8fj\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.230121 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.250680 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Nov 22 08:24:38 crc kubenswrapper[4743]: E1122 08:24:38.263110 4743 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 22 08:24:38 crc kubenswrapper[4743]: E1122 08:24:38.263244 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eb0462a-95bc-435f-825a-59fc93898e81-cert podName:1eb0462a-95bc-435f-825a-59fc93898e81 nodeName:}" failed. No retries permitted until 2025-11-22 08:24:38.763215314 +0000 UTC m=+152.469576376 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1eb0462a-95bc-435f-825a-59fc93898e81-cert") pod "ingress-canary-sb8ph" (UID: "1eb0462a-95bc-435f-825a-59fc93898e81") : failed to sync secret cache: timed out waiting for the condition Nov 22 08:24:38 crc kubenswrapper[4743]: E1122 08:24:38.265290 4743 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Nov 22 08:24:38 crc kubenswrapper[4743]: E1122 08:24:38.266472 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a221ced-5d72-41e5-8b49-d93ec52d53f5-srv-cert podName:3a221ced-5d72-41e5-8b49-d93ec52d53f5 nodeName:}" failed. No retries permitted until 2025-11-22 08:24:38.765546154 +0000 UTC m=+152.471907296 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/3a221ced-5d72-41e5-8b49-d93ec52d53f5-srv-cert") pod "catalog-operator-68c6474976-r6rj6" (UID: "3a221ced-5d72-41e5-8b49-d93ec52d53f5") : failed to sync secret cache: timed out waiting for the condition Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.269834 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.290771 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.310464 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.329681 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.350067 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.369968 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.389620 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.410086 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.429680 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.449563 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.469718 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.489753 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.509264 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.529178 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.550765 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.570769 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.590736 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Nov 22 
08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.609683 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.629708 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.653476 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.669108 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.689713 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.709100 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.729557 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.749990 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.786385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1eb0462a-95bc-435f-825a-59fc93898e81-cert\") pod \"ingress-canary-sb8ph\" (UID: \"1eb0462a-95bc-435f-825a-59fc93898e81\") " pod="openshift-ingress-canary/ingress-canary-sb8ph" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.786527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3a221ced-5d72-41e5-8b49-d93ec52d53f5-srv-cert\") pod \"catalog-operator-68c6474976-r6rj6\" (UID: \"3a221ced-5d72-41e5-8b49-d93ec52d53f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.792624 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1eb0462a-95bc-435f-825a-59fc93898e81-cert\") pod \"ingress-canary-sb8ph\" (UID: \"1eb0462a-95bc-435f-825a-59fc93898e81\") " pod="openshift-ingress-canary/ingress-canary-sb8ph" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.793627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3a221ced-5d72-41e5-8b49-d93ec52d53f5-srv-cert\") pod \"catalog-operator-68c6474976-r6rj6\" (UID: \"3a221ced-5d72-41e5-8b49-d93ec52d53f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.816196 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rft9m\" (UniqueName: \"kubernetes.io/projected/16da58ac-db10-4cb2-a7e9-330ac883a480-kube-api-access-rft9m\") pod \"machine-approver-56656f9798-gw2gb\" (UID: \"16da58ac-db10-4cb2-a7e9-330ac883a480\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.822814 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sqgdj\" (UniqueName: \"kubernetes.io/projected/1b40a4ea-3227-449c-b318-8e47b7eeefc4-kube-api-access-sqgdj\") pod \"etcd-operator-b45778765-fwvd8\" (UID: \"1b40a4ea-3227-449c-b318-8e47b7eeefc4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.855692 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqsvn\" (UniqueName: \"kubernetes.io/projected/c7c71cb3-54e2-471f-a91f-50c146c4e3c8-kube-api-access-kqsvn\") pod \"apiserver-76f77b778f-gwf6h\" (UID: \"c7c71cb3-54e2-471f-a91f-50c146c4e3c8\") " pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.866475 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrlqn\" (UniqueName: \"kubernetes.io/projected/01b809b1-7b62-4043-9411-7194d6e96e47-kube-api-access-wrlqn\") pod \"downloads-7954f5f757-5tsjj\" (UID: \"01b809b1-7b62-4043-9411-7194d6e96e47\") " pod="openshift-console/downloads-7954f5f757-5tsjj" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.883407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2jlt\" (UniqueName: \"kubernetes.io/projected/506d451b-5cf3-44fe-be73-9d43abbbf9a8-kube-api-access-g2jlt\") pod \"machine-api-operator-5694c8668f-m8pkf\" (UID: \"506d451b-5cf3-44fe-be73-9d43abbbf9a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.903627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh7hp\" (UniqueName: \"kubernetes.io/projected/7352b0d1-4af7-49c9-8029-5a97c3cdf450-kube-api-access-bh7hp\") pod \"openshift-config-operator-7777fb866f-wtpmb\" (UID: \"7352b0d1-4af7-49c9-8029-5a97c3cdf450\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.921328 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmvlb\" (UniqueName: \"kubernetes.io/projected/fa343393-45bb-4bde-a13b-1686db2c1979-kube-api-access-hmvlb\") pod \"openshift-apiserver-operator-796bbdcf4f-cq9j5\" (UID: \"fa343393-45bb-4bde-a13b-1686db2c1979\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.932433 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.943453 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfms2\" (UniqueName: \"kubernetes.io/projected/b72706e5-e53f-4c1f-81fa-6b850a062076-kube-api-access-gfms2\") pod \"apiserver-7bbb656c7d-67bbq\" (UID: \"b72706e5-e53f-4c1f-81fa-6b850a062076\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.971239 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47bwd\" (UniqueName: \"kubernetes.io/projected/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-kube-api-access-47bwd\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.976609 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-5tsjj" Nov 22 08:24:38 crc kubenswrapper[4743]: I1122 08:24:38.987033 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2lz4\" (UniqueName: \"kubernetes.io/projected/b65df5f7-3f27-422f-acde-42d3029cd963-kube-api-access-t2lz4\") pod \"authentication-operator-69f744f599-6p2jm\" (UID: \"b65df5f7-3f27-422f-acde-42d3029cd963\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.003768 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5zbj\" (UniqueName: \"kubernetes.io/projected/9e14fb50-5723-489d-acc2-c5ca42234b73-kube-api-access-j5zbj\") pod \"oauth-openshift-558db77b4-42kzd\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.026954 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.034464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkxdr\" (UniqueName: \"kubernetes.io/projected/71f63f00-6812-4f35-ba1e-d1ea01a27a19-kube-api-access-rkxdr\") pod \"controller-manager-879f6c89f-8c5mq\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.051521 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.052913 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42e31b43-9e9c-4ab0-a6bf-3857b755b87a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zqhp9\" (UID: \"42e31b43-9e9c-4ab0-a6bf-3857b755b87a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.055562 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.067017 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8chf\" (UniqueName: \"kubernetes.io/projected/73592dc4-d2b3-42f7-9bec-346286516f23-kube-api-access-s8chf\") pod \"console-operator-58897d9998-rmjqq\" (UID: \"73592dc4-d2b3-42f7-9bec-346286516f23\") " pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.083889 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.090323 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.090972 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2cz\" (UniqueName: \"kubernetes.io/projected/352175cc-5065-4a1f-ac24-7ae82d39b87d-kube-api-access-9h2cz\") pod \"cluster-samples-operator-665b6dd947-68cdt\" (UID: \"352175cc-5065-4a1f-ac24-7ae82d39b87d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.111239 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.122326 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.129757 4743 request.go:700] Waited for 1.926249842s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0 Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.131847 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.150151 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.170336 4743 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.174681 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb"] Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.193241 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.195488 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.209418 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.210286 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.217187 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-5tsjj"] Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.218509 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.242036 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.242143 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.246520 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.250508 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.261156 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" Nov 22 08:24:39 crc kubenswrapper[4743]: W1122 08:24:39.268375 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01b809b1_7b62_4043_9411_7194d6e96e47.slice/crio-8a28a88c33fb5f534366e3887bc3af5978fcd92b0f6ea009b4585cf4c3488193 WatchSource:0}: Error finding container 8a28a88c33fb5f534366e3887bc3af5978fcd92b0f6ea009b4585cf4c3488193: Status 404 returned error can't find the container with id 8a28a88c33fb5f534366e3887bc3af5978fcd92b0f6ea009b4585cf4c3488193 Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.284442 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.294761 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0adcbd21-d1d6-4beb-8b26-74d16b534b91-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qrt7s\" (UID: \"0adcbd21-d1d6-4beb-8b26-74d16b534b91\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.311269 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmkzd\" (UniqueName: \"kubernetes.io/projected/aabe2e04-2ba1-4719-a4e9-8e763c6e3659-kube-api-access-vmkzd\") pod \"openshift-controller-manager-operator-756b6f6bc6-dgk7t\" (UID: \"aabe2e04-2ba1-4719-a4e9-8e763c6e3659\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.328245 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nslhc\" (UniqueName: \"kubernetes.io/projected/1eb0462a-95bc-435f-825a-59fc93898e81-kube-api-access-nslhc\") pod \"ingress-canary-sb8ph\" (UID: \"1eb0462a-95bc-435f-825a-59fc93898e81\") " pod="openshift-ingress-canary/ingress-canary-sb8ph"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.347927 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzv4j\" (UniqueName: \"kubernetes.io/projected/193a50c2-2643-4267-b722-a92cc83d3d40-kube-api-access-kzv4j\") pod \"service-ca-operator-777779d784-qst5g\" (UID: \"193a50c2-2643-4267-b722-a92cc83d3d40\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.348613 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.365096 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.379121 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8974\" (UniqueName: \"kubernetes.io/projected/2bbdf845-aa39-41ed-a45c-75ac4ba8e894-kube-api-access-c8974\") pod \"multus-admission-controller-857f4d67dd-9fhcv\" (UID: \"2bbdf845-aa39-41ed-a45c-75ac4ba8e894\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.392765 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfxwr\" (UniqueName: \"kubernetes.io/projected/49cb4faf-6b21-4097-8f6c-24a310cff149-kube-api-access-xfxwr\") pod \"migrator-59844c95c7-gnsjl\" (UID: \"49cb4faf-6b21-4097-8f6c-24a310cff149\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.409479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4962z\" (UniqueName: \"kubernetes.io/projected/bead015e-e8e8-44f2-8dae-41047cd66706-kube-api-access-4962z\") pod \"console-f9d7485db-hhpxp\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " pod="openshift-console/console-f9d7485db-hhpxp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.412382 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.427854 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.440104 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.443309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7l7p\" (UniqueName: \"kubernetes.io/projected/3a221ced-5d72-41e5-8b49-d93ec52d53f5-kube-api-access-v7l7p\") pod \"catalog-operator-68c6474976-r6rj6\" (UID: \"3a221ced-5d72-41e5-8b49-d93ec52d53f5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.459037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfqkz\" (UniqueName: \"kubernetes.io/projected/28f586ec-7a65-4c1e-9f09-845b812246b0-kube-api-access-xfqkz\") pod \"route-controller-manager-6576b87f9c-rrn7z\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.469061 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnpd2\" (UniqueName: \"kubernetes.io/projected/07f6c2e0-4230-40e0-ad71-2f652546cd38-kube-api-access-cnpd2\") pod \"marketplace-operator-79b997595-dm8fj\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.484780 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.488749 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjx28\" (UniqueName: \"kubernetes.io/projected/5564388b-e6dd-409f-a137-b34700967f4a-kube-api-access-fjx28\") pod \"collect-profiles-29396655-zxpkt\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.498977 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.523102 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/aff02729-0197-4f76-b43e-594c908f8312-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-6r56v\" (UID: \"aff02729-0197-4f76-b43e-594c908f8312\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.559157 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fwvd8"]
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.579314 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8c5mq"]
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.652040 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sb8ph"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.652190 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.652623 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hhpxp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654185 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f847f14-06dc-46ee-ac13-08e1f8c86ae4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzpfd\" (UID: \"6f847f14-06dc-46ee-ac13-08e1f8c86ae4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654212 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/555c98c2-8078-4171-b6ec-1c2d0df9ae90-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654235 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45f0877c-996f-4f4c-aa19-970fc7cd0459-srv-cert\") pod \"olm-operator-6b444d44fb-hn6c7\" (UID: \"45f0877c-996f-4f4c-aa19-970fc7cd0459\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654251 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzz2p\" (UniqueName: \"kubernetes.io/projected/45f0877c-996f-4f4c-aa19-970fc7cd0459-kube-api-access-pzz2p\") pod \"olm-operator-6b444d44fb-hn6c7\" (UID: \"45f0877c-996f-4f4c-aa19-970fc7cd0459\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654264 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5609b092-3caf-445b-99e0-edaff20d65a2-tmpfs\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654279 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxthw\" (UniqueName: \"kubernetes.io/projected/6f847f14-06dc-46ee-ac13-08e1f8c86ae4-kube-api-access-xxthw\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzpfd\" (UID: \"6f847f14-06dc-46ee-ac13-08e1f8c86ae4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654307 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27bd3602-b31d-4b17-9902-72cac7b5580f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7rcws\" (UID: \"27bd3602-b31d-4b17-9902-72cac7b5580f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654322 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-proxy-tls\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654347 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f24b\" (UniqueName: \"kubernetes.io/projected/2619251a-fbe2-425c-bf25-5c0fafb3965a-kube-api-access-8f24b\") pod \"package-server-manager-789f6589d5-wbn2s\" (UID: \"2619251a-fbe2-425c-bf25-5c0fafb3965a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654364 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-729zq\" (UniqueName: \"kubernetes.io/projected/555c98c2-8078-4171-b6ec-1c2d0df9ae90-kube-api-access-729zq\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654386 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63019b95-c8f5-4782-85ba-def26be394f0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654401 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdrzb\" (UniqueName: \"kubernetes.io/projected/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-kube-api-access-rdrzb\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654415 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/555c98c2-8078-4171-b6ec-1c2d0df9ae90-trusted-ca\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-trusted-ca\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654477 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f847f14-06dc-46ee-ac13-08e1f8c86ae4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzpfd\" (UID: \"6f847f14-06dc-46ee-ac13-08e1f8c86ae4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654522 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ce758eb-1848-4be1-b1d1-65373e1531d9-signing-key\") pod \"service-ca-9c57cc56f-j5nkm\" (UID: \"1ce758eb-1848-4be1-b1d1-65373e1531d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654539 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-images\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2619251a-fbe2-425c-bf25-5c0fafb3965a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-wbn2s\" (UID: \"2619251a-fbe2-425c-bf25-5c0fafb3965a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654591 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wvm5\" (UniqueName: \"kubernetes.io/projected/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-kube-api-access-8wvm5\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654607 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27bd3602-b31d-4b17-9902-72cac7b5580f-proxy-tls\") pod \"machine-config-controller-84d6567774-7rcws\" (UID: \"27bd3602-b31d-4b17-9902-72cac7b5580f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654633 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ce758eb-1848-4be1-b1d1-65373e1531d9-signing-cabundle\") pod \"service-ca-9c57cc56f-j5nkm\" (UID: \"1ce758eb-1848-4be1-b1d1-65373e1531d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.654649 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gvfz\" (UniqueName: \"kubernetes.io/projected/1ce758eb-1848-4be1-b1d1-65373e1531d9-kube-api-access-8gvfz\") pod \"service-ca-9c57cc56f-j5nkm\" (UID: \"1ce758eb-1848-4be1-b1d1-65373e1531d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.655183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-default-certificate\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.655223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-metrics-certs\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.655261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtx8r\" (UniqueName: \"kubernetes.io/projected/5609b092-3caf-445b-99e0-edaff20d65a2-kube-api-access-qtx8r\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.655804 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-42kzd"]
Nov 22 08:24:39 crc kubenswrapper[4743]: E1122 08:24:39.656732 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.156716947 +0000 UTC m=+153.863078089 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.657801 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fht\" (UniqueName: \"kubernetes.io/projected/a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5-kube-api-access-45fht\") pod \"dns-operator-744455d44c-5kn9v\" (UID: \"a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5\") " pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.658015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwh82\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-kube-api-access-kwh82\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.658195 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5609b092-3caf-445b-99e0-edaff20d65a2-webhook-cert\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.658231 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/555c98c2-8078-4171-b6ec-1c2d0df9ae90-metrics-tls\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.658288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8plkl\" (UniqueName: \"kubernetes.io/projected/6ecdcc2c-1d03-46ec-96e6-da1e04437140-kube-api-access-8plkl\") pod \"control-plane-machine-set-operator-78cbb6b69f-pg7xr\" (UID: \"6ecdcc2c-1d03-46ec-96e6-da1e04437140\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.658410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45f0877c-996f-4f4c-aa19-970fc7cd0459-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hn6c7\" (UID: \"45f0877c-996f-4f4c-aa19-970fc7cd0459\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.658444 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ecdcc2c-1d03-46ec-96e6-da1e04437140-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pg7xr\" (UID: \"6ecdcc2c-1d03-46ec-96e6-da1e04437140\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.658751 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f0d578-d25e-4f48-bca6-389c9b4fbd37-config\") pod \"kube-controller-manager-operator-78b949d7b-nc42b\" (UID: \"a8f0d578-d25e-4f48-bca6-389c9b4fbd37\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.658840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-registry-tls\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.659118 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-stats-auth\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.659212 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5-metrics-tls\") pod \"dns-operator-744455d44c-5kn9v\" (UID: \"a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5\") " pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.659614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8f0d578-d25e-4f48-bca6-389c9b4fbd37-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nc42b\" (UID: \"a8f0d578-d25e-4f48-bca6-389c9b4fbd37\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.659720 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5609b092-3caf-445b-99e0-edaff20d65a2-apiservice-cert\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.659811 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63019b95-c8f5-4782-85ba-def26be394f0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.659886 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.660051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-registry-certificates\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.660134 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8f0d578-d25e-4f48-bca6-389c9b4fbd37-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nc42b\" (UID: \"a8f0d578-d25e-4f48-bca6-389c9b4fbd37\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.660211 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndr48\" (UniqueName: \"kubernetes.io/projected/27bd3602-b31d-4b17-9902-72cac7b5580f-kube-api-access-ndr48\") pod \"machine-config-controller-84d6567774-7rcws\" (UID: \"27bd3602-b31d-4b17-9902-72cac7b5580f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.660290 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-service-ca-bundle\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.660371 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-bound-sa-token\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.678266 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-m8pkf"]
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.692759 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gwf6h"]
Nov 22 08:24:39 crc kubenswrapper[4743]: W1122 08:24:39.702446 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71f63f00_6812_4f35_ba1e_d1ea01a27a19.slice/crio-6eeea9a14b0c1807fc82fb11a9836cd1ccc104d7522d9467f790d236e700029c WatchSource:0}: Error finding container 6eeea9a14b0c1807fc82fb11a9836cd1ccc104d7522d9467f790d236e700029c: Status 404 returned error can't find the container with id 6eeea9a14b0c1807fc82fb11a9836cd1ccc104d7522d9467f790d236e700029c
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.719652 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761168 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761361 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2619251a-fbe2-425c-bf25-5c0fafb3965a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-wbn2s\" (UID: \"2619251a-fbe2-425c-bf25-5c0fafb3965a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wvm5\" (UniqueName: \"kubernetes.io/projected/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-kube-api-access-8wvm5\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761423 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27bd3602-b31d-4b17-9902-72cac7b5580f-proxy-tls\") pod \"machine-config-controller-84d6567774-7rcws\" (UID: \"27bd3602-b31d-4b17-9902-72cac7b5580f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ce758eb-1848-4be1-b1d1-65373e1531d9-signing-cabundle\") pod \"service-ca-9c57cc56f-j5nkm\" (UID: \"1ce758eb-1848-4be1-b1d1-65373e1531d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761464 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gvfz\" (UniqueName: \"kubernetes.io/projected/1ce758eb-1848-4be1-b1d1-65373e1531d9-kube-api-access-8gvfz\") pod \"service-ca-9c57cc56f-j5nkm\" (UID: \"1ce758eb-1848-4be1-b1d1-65373e1531d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761510 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-csi-data-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-default-certificate\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761540 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-metrics-certs\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-plugins-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtx8r\" (UniqueName: \"kubernetes.io/projected/5609b092-3caf-445b-99e0-edaff20d65a2-kube-api-access-qtx8r\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761633 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef4673e8-a499-4df4-8ee9-987b41ee501f-metrics-tls\") pod \"dns-default-qvtpl\" (UID: \"ef4673e8-a499-4df4-8ee9-987b41ee501f\") " pod="openshift-dns/dns-default-qvtpl"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761652 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7491ded8-c8fd-4838-b3b7-7a8ce3946e07-node-bootstrap-token\") pod \"machine-config-server-fbhcz\" (UID: \"7491ded8-c8fd-4838-b3b7-7a8ce3946e07\") " pod="openshift-machine-config-operator/machine-config-server-fbhcz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761707 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef4673e8-a499-4df4-8ee9-987b41ee501f-config-volume\") pod \"dns-default-qvtpl\" (UID: \"ef4673e8-a499-4df4-8ee9-987b41ee501f\") " pod="openshift-dns/dns-default-qvtpl"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45fht\" (UniqueName: \"kubernetes.io/projected/a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5-kube-api-access-45fht\") pod \"dns-operator-744455d44c-5kn9v\" (UID: \"a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5\") " pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761767 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwh82\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-kube-api-access-kwh82\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761795 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5609b092-3caf-445b-99e0-edaff20d65a2-webhook-cert\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761815 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/555c98c2-8078-4171-b6ec-1c2d0df9ae90-metrics-tls\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761831 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8plkl\" (UniqueName: \"kubernetes.io/projected/6ecdcc2c-1d03-46ec-96e6-da1e04437140-kube-api-access-8plkl\") pod \"control-plane-machine-set-operator-78cbb6b69f-pg7xr\" (UID: \"6ecdcc2c-1d03-46ec-96e6-da1e04437140\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761846 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v5jx\" (UniqueName: \"kubernetes.io/projected/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-kube-api-access-9v5jx\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761863 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-mountpoint-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761897 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45f0877c-996f-4f4c-aa19-970fc7cd0459-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hn6c7\" (UID: \"45f0877c-996f-4f4c-aa19-970fc7cd0459\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761914 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ecdcc2c-1d03-46ec-96e6-da1e04437140-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pg7xr\" (UID: \"6ecdcc2c-1d03-46ec-96e6-da1e04437140\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761944 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f0d578-d25e-4f48-bca6-389c9b4fbd37-config\") pod \"kube-controller-manager-operator-78b949d7b-nc42b\" (UID: \"a8f0d578-d25e-4f48-bca6-389c9b4fbd37\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.761968 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-registry-tls\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762010 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-stats-auth\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5-metrics-tls\") pod \"dns-operator-744455d44c-5kn9v\" (UID: \"a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5\") " pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762077 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8f0d578-d25e-4f48-bca6-389c9b4fbd37-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nc42b\" (UID: \"a8f0d578-d25e-4f48-bca6-389c9b4fbd37\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762113 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5609b092-3caf-445b-99e0-edaff20d65a2-apiservice-cert\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762132 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7491ded8-c8fd-4838-b3b7-7a8ce3946e07-certs\") pod \"machine-config-server-fbhcz\" (UID: \"7491ded8-c8fd-4838-b3b7-7a8ce3946e07\") " pod="openshift-machine-config-operator/machine-config-server-fbhcz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63019b95-c8f5-4782-85ba-def26be394f0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762210 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-registry-certificates\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762270 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8f0d578-d25e-4f48-bca6-389c9b4fbd37-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nc42b\" (UID: \"a8f0d578-d25e-4f48-bca6-389c9b4fbd37\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762285 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-registration-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762311 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-service-ca-bundle\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762328 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndr48\" (UniqueName: \"kubernetes.io/projected/27bd3602-b31d-4b17-9902-72cac7b5580f-kube-api-access-ndr48\") pod \"machine-config-controller-84d6567774-7rcws\" (UID: \"27bd3602-b31d-4b17-9902-72cac7b5580f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762355 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-bound-sa-token\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-socket-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f847f14-06dc-46ee-ac13-08e1f8c86ae4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzpfd\" (UID: \"6f847f14-06dc-46ee-ac13-08e1f8c86ae4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/555c98c2-8078-4171-b6ec-1c2d0df9ae90-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.762488 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45f0877c-996f-4f4c-aa19-970fc7cd0459-srv-cert\") pod \"olm-operator-6b444d44fb-hn6c7\" (UID: \"45f0877c-996f-4f4c-aa19-970fc7cd0459\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.764323 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-service-ca-bundle\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.764867 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f847f14-06dc-46ee-ac13-08e1f8c86ae4-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzpfd\" (UID: \"6f847f14-06dc-46ee-ac13-08e1f8c86ae4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.768451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.769382 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-registry-certificates\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.771282 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzz2p\" (UniqueName: \"kubernetes.io/projected/45f0877c-996f-4f4c-aa19-970fc7cd0459-kube-api-access-pzz2p\") pod \"olm-operator-6b444d44fb-hn6c7\" (UID: \"45f0877c-996f-4f4c-aa19-970fc7cd0459\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.771313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5609b092-3caf-445b-99e0-edaff20d65a2-tmpfs\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.771334 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxthw\" (UniqueName: \"kubernetes.io/projected/6f847f14-06dc-46ee-ac13-08e1f8c86ae4-kube-api-access-xxthw\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzpfd\" (UID: \"6f847f14-06dc-46ee-ac13-08e1f8c86ae4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.771339 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1ce758eb-1848-4be1-b1d1-65373e1531d9-signing-cabundle\") pod \"service-ca-9c57cc56f-j5nkm\" (UID: \"1ce758eb-1848-4be1-b1d1-65373e1531d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.771390 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27bd3602-b31d-4b17-9902-72cac7b5580f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7rcws\" (UID: \"27bd3602-b31d-4b17-9902-72cac7b5580f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.771421 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-proxy-tls\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: E1122 08:24:39.771461 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.271445221 +0000 UTC m=+153.977806273 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.774101 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5609b092-3caf-445b-99e0-edaff20d65a2-tmpfs\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.774251 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.775084 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f24b\" (UniqueName: \"kubernetes.io/projected/2619251a-fbe2-425c-bf25-5c0fafb3965a-kube-api-access-8f24b\") pod \"package-server-manager-789f6589d5-wbn2s\" (UID: \"2619251a-fbe2-425c-bf25-5c0fafb3965a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.775119 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-729zq\" (UniqueName: \"kubernetes.io/projected/555c98c2-8078-4171-b6ec-1c2d0df9ae90-kube-api-access-729zq\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.775359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8f0d578-d25e-4f48-bca6-389c9b4fbd37-config\") pod \"kube-controller-manager-operator-78b949d7b-nc42b\" (UID: \"a8f0d578-d25e-4f48-bca6-389c9b4fbd37\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.777522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/27bd3602-b31d-4b17-9902-72cac7b5580f-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7rcws\" (UID: \"27bd3602-b31d-4b17-9902-72cac7b5580f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.778723 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63019b95-c8f5-4782-85ba-def26be394f0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.779408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63019b95-c8f5-4782-85ba-def26be394f0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.779759 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdrzb\" (UniqueName: \"kubernetes.io/projected/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-kube-api-access-rdrzb\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.779793 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/555c98c2-8078-4171-b6ec-1c2d0df9ae90-trusted-ca\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.779834 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.779864 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnwq4\" (UniqueName: \"kubernetes.io/projected/ef4673e8-a499-4df4-8ee9-987b41ee501f-kube-api-access-fnwq4\") pod \"dns-default-qvtpl\" (UID: \"ef4673e8-a499-4df4-8ee9-987b41ee501f\") " pod="openshift-dns/dns-default-qvtpl"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.779884 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhdln\" (UniqueName: \"kubernetes.io/projected/7491ded8-c8fd-4838-b3b7-7a8ce3946e07-kube-api-access-jhdln\") pod \"machine-config-server-fbhcz\" (UID: \"7491ded8-c8fd-4838-b3b7-7a8ce3946e07\") " pod="openshift-machine-config-operator/machine-config-server-fbhcz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.780166 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63019b95-c8f5-4782-85ba-def26be394f0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.781177 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-trusted-ca\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: E1122 08:24:39.781395 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.281374539 +0000 UTC m=+153.987735701 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.781437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f847f14-06dc-46ee-ac13-08e1f8c86ae4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzpfd\" (UID: \"6f847f14-06dc-46ee-ac13-08e1f8c86ae4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.781946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/555c98c2-8078-4171-b6ec-1c2d0df9ae90-trusted-ca\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.782049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ce758eb-1848-4be1-b1d1-65373e1531d9-signing-key\") pod \"service-ca-9c57cc56f-j5nkm\" (UID: \"1ce758eb-1848-4be1-b1d1-65373e1531d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.782088 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-images\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.782525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-trusted-ca\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.782632 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-proxy-tls\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.782666 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-images\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.790187 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-stats-auth\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.794173 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8f0d578-d25e-4f48-bca6-389c9b4fbd37-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nc42b\" (UID: \"a8f0d578-d25e-4f48-bca6-389c9b4fbd37\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.794940 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5609b092-3caf-445b-99e0-edaff20d65a2-apiservice-cert\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.799562 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-registry-tls\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.800178 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5609b092-3caf-445b-99e0-edaff20d65a2-webhook-cert\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.800215 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5-metrics-tls\") pod \"dns-operator-744455d44c-5kn9v\" (UID: \"a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5\") " pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.800626 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2619251a-fbe2-425c-bf25-5c0fafb3965a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-wbn2s\" (UID: \"2619251a-fbe2-425c-bf25-5c0fafb3965a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.800853 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-default-certificate\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.801061 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/27bd3602-b31d-4b17-9902-72cac7b5580f-proxy-tls\") pod \"machine-config-controller-84d6567774-7rcws\" (UID: \"27bd3602-b31d-4b17-9902-72cac7b5580f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"
Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.801436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-metrics-certs\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.801505 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6ecdcc2c-1d03-46ec-96e6-da1e04437140-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-pg7xr\" (UID: \"6ecdcc2c-1d03-46ec-96e6-da1e04437140\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.801848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/45f0877c-996f-4f4c-aa19-970fc7cd0459-profile-collector-cert\") pod \"olm-operator-6b444d44fb-hn6c7\" (UID: \"45f0877c-996f-4f4c-aa19-970fc7cd0459\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.803449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/555c98c2-8078-4171-b6ec-1c2d0df9ae90-metrics-tls\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.807832 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/45f0877c-996f-4f4c-aa19-970fc7cd0459-srv-cert\") pod \"olm-operator-6b444d44fb-hn6c7\" (UID: \"45f0877c-996f-4f4c-aa19-970fc7cd0459\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.808099 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f847f14-06dc-46ee-ac13-08e1f8c86ae4-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzpfd\" (UID: \"6f847f14-06dc-46ee-ac13-08e1f8c86ae4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.814554 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fht\" (UniqueName: \"kubernetes.io/projected/a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5-kube-api-access-45fht\") pod \"dns-operator-744455d44c-5kn9v\" (UID: \"a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5\") " pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.832701 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwh82\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-kube-api-access-kwh82\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.849752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gvfz\" 
(UniqueName: \"kubernetes.io/projected/1ce758eb-1848-4be1-b1d1-65373e1531d9-kube-api-access-8gvfz\") pod \"service-ca-9c57cc56f-j5nkm\" (UID: \"1ce758eb-1848-4be1-b1d1-65373e1531d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.872051 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndr48\" (UniqueName: \"kubernetes.io/projected/27bd3602-b31d-4b17-9902-72cac7b5580f-kube-api-access-ndr48\") pod \"machine-config-controller-84d6567774-7rcws\" (UID: \"27bd3602-b31d-4b17-9902-72cac7b5580f\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.874602 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt"] Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884287 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:39 crc kubenswrapper[4743]: E1122 08:24:39.884466 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.384441462 +0000 UTC m=+154.090802514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-csi-data-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-plugins-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef4673e8-a499-4df4-8ee9-987b41ee501f-metrics-tls\") pod \"dns-default-qvtpl\" (UID: \"ef4673e8-a499-4df4-8ee9-987b41ee501f\") " pod="openshift-dns/dns-default-qvtpl" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884765 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7491ded8-c8fd-4838-b3b7-7a8ce3946e07-node-bootstrap-token\") pod 
\"machine-config-server-fbhcz\" (UID: \"7491ded8-c8fd-4838-b3b7-7a8ce3946e07\") " pod="openshift-machine-config-operator/machine-config-server-fbhcz" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884788 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-csi-data-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef4673e8-a499-4df4-8ee9-987b41ee501f-config-volume\") pod \"dns-default-qvtpl\" (UID: \"ef4673e8-a499-4df4-8ee9-987b41ee501f\") " pod="openshift-dns/dns-default-qvtpl" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884866 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v5jx\" (UniqueName: \"kubernetes.io/projected/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-kube-api-access-9v5jx\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884888 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-mountpoint-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884920 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7491ded8-c8fd-4838-b3b7-7a8ce3946e07-certs\") pod \"machine-config-server-fbhcz\" (UID: \"7491ded8-c8fd-4838-b3b7-7a8ce3946e07\") " pod="openshift-machine-config-operator/machine-config-server-fbhcz" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-registration-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.884989 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-socket-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.885030 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-plugins-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.885071 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.885094 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnwq4\" (UniqueName: \"kubernetes.io/projected/ef4673e8-a499-4df4-8ee9-987b41ee501f-kube-api-access-fnwq4\") pod \"dns-default-qvtpl\" (UID: \"ef4673e8-a499-4df4-8ee9-987b41ee501f\") " pod="openshift-dns/dns-default-qvtpl" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.885116 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhdln\" (UniqueName: \"kubernetes.io/projected/7491ded8-c8fd-4838-b3b7-7a8ce3946e07-kube-api-access-jhdln\") pod \"machine-config-server-fbhcz\" (UID: \"7491ded8-c8fd-4838-b3b7-7a8ce3946e07\") " pod="openshift-machine-config-operator/machine-config-server-fbhcz" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.885116 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-mountpoint-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.885619 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-registration-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.885912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-socket-dir\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.886059 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef4673e8-a499-4df4-8ee9-987b41ee501f-config-volume\") pod \"dns-default-qvtpl\" (UID: \"ef4673e8-a499-4df4-8ee9-987b41ee501f\") " pod="openshift-dns/dns-default-qvtpl" Nov 22 08:24:39 crc kubenswrapper[4743]: E1122 08:24:39.887326 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.387297738 +0000 UTC m=+154.093658790 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.888976 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ef4673e8-a499-4df4-8ee9-987b41ee501f-metrics-tls\") pod \"dns-default-qvtpl\" (UID: \"ef4673e8-a499-4df4-8ee9-987b41ee501f\") " pod="openshift-dns/dns-default-qvtpl" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.891449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7491ded8-c8fd-4838-b3b7-7a8ce3946e07-certs\") pod \"machine-config-server-fbhcz\" (UID: \"7491ded8-c8fd-4838-b3b7-7a8ce3946e07\") " pod="openshift-machine-config-operator/machine-config-server-fbhcz" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.891605 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7491ded8-c8fd-4838-b3b7-7a8ce3946e07-node-bootstrap-token\") pod \"machine-config-server-fbhcz\" (UID: \"7491ded8-c8fd-4838-b3b7-7a8ce3946e07\") " pod="openshift-machine-config-operator/machine-config-server-fbhcz" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.905243 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtx8r\" (UniqueName: \"kubernetes.io/projected/5609b092-3caf-445b-99e0-edaff20d65a2-kube-api-access-qtx8r\") pod \"packageserver-d55dfcdfc-4kn72\" (UID: \"5609b092-3caf-445b-99e0-edaff20d65a2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.906593 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-6p2jm"] Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.907541 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9"] Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.910252 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rmjqq"] Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.914633 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/555c98c2-8078-4171-b6ec-1c2d0df9ae90-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.946165 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-bound-sa-token\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.970439 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a8f0d578-d25e-4f48-bca6-389c9b4fbd37-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nc42b\" (UID: \"a8f0d578-d25e-4f48-bca6-389c9b4fbd37\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.982901 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6"] Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.986591 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:39 crc kubenswrapper[4743]: E1122 08:24:39.986805 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.486789884 +0000 UTC m=+154.193150936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.988675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:39 crc kubenswrapper[4743]: E1122 08:24:39.989057 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.489048122 +0000 UTC m=+154.195409174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.990975 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-qst5g"] Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.992662 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wvm5\" (UniqueName: \"kubernetes.io/projected/c0ee3fe1-7f79-4f58-91ae-94a3f046401f-kube-api-access-8wvm5\") pod \"router-default-5444994796-v99cs\" (UID: \"c0ee3fe1-7f79-4f58-91ae-94a3f046401f\") " pod="openshift-ingress/router-default-5444994796-v99cs" Nov 22 08:24:39 crc kubenswrapper[4743]: I1122 08:24:39.993923 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.004128 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8plkl\" (UniqueName: \"kubernetes.io/projected/6ecdcc2c-1d03-46ec-96e6-da1e04437140-kube-api-access-8plkl\") pod \"control-plane-machine-set-operator-78cbb6b69f-pg7xr\" (UID: \"6ecdcc2c-1d03-46ec-96e6-da1e04437140\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.004333 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.005126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1ce758eb-1848-4be1-b1d1-65373e1531d9-signing-key\") pod \"service-ca-9c57cc56f-j5nkm\" (UID: \"1ce758eb-1848-4be1-b1d1-65373e1531d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.021719 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzz2p\" (UniqueName: \"kubernetes.io/projected/45f0877c-996f-4f4c-aa19-970fc7cd0459-kube-api-access-pzz2p\") pod \"olm-operator-6b444d44fb-hn6c7\" (UID: \"45f0877c-996f-4f4c-aa19-970fc7cd0459\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.032624 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxthw\" (UniqueName: \"kubernetes.io/projected/6f847f14-06dc-46ee-ac13-08e1f8c86ae4-kube-api-access-xxthw\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzpfd\" (UID: \"6f847f14-06dc-46ee-ac13-08e1f8c86ae4\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.035115 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.035733 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dm8fj"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.046790 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sb8ph"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.047111 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.053926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f24b\" (UniqueName: \"kubernetes.io/projected/2619251a-fbe2-425c-bf25-5c0fafb3965a-kube-api-access-8f24b\") pod \"package-server-manager-789f6589d5-wbn2s\" (UID: \"2619251a-fbe2-425c-bf25-5c0fafb3965a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.053937 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hhpxp"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.054085 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.061385 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.070958 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.072867 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-729zq\" (UniqueName: \"kubernetes.io/projected/555c98c2-8078-4171-b6ec-1c2d0df9ae90-kube-api-access-729zq\") pod \"ingress-operator-5b745b69d9-lbrsm\" (UID: \"555c98c2-8078-4171-b6ec-1c2d0df9ae90\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.075897 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.094018 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.094569 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.594545468 +0000 UTC m=+154.300906510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.094976 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.097709 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.107980 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.129776 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.129847 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-9fhcv"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.130301 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-v99cs" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.130595 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.135179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnwq4\" (UniqueName: \"kubernetes.io/projected/ef4673e8-a499-4df4-8ee9-987b41ee501f-kube-api-access-fnwq4\") pod \"dns-default-qvtpl\" (UID: \"ef4673e8-a499-4df4-8ee9-987b41ee501f\") " pod="openshift-dns/dns-default-qvtpl" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.137650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdrzb\" (UniqueName: \"kubernetes.io/projected/f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba-kube-api-access-rdrzb\") pod \"machine-config-operator-74547568cd-j2xkp\" (UID: \"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.160928 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.161105 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhdln\" (UniqueName: \"kubernetes.io/projected/7491ded8-c8fd-4838-b3b7-7a8ce3946e07-kube-api-access-jhdln\") pod \"machine-config-server-fbhcz\" (UID: \"7491ded8-c8fd-4838-b3b7-7a8ce3946e07\") " pod="openshift-machine-config-operator/machine-config-server-fbhcz" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.169735 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qvtpl" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.171124 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" event={"ID":"9e14fb50-5723-489d-acc2-c5ca42234b73","Type":"ContainerStarted","Data":"b5730219dd95aed9076dd39d5d85bbe4c233cd353f153cc89cafefc248932de2"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.175756 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.175794 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" event={"ID":"506d451b-5cf3-44fe-be73-9d43abbbf9a8","Type":"ContainerStarted","Data":"b8efadd3509a477a5638c13ad50876f18495afa57bff0c416647ae3fc3fdb399"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.177246 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" event={"ID":"b65df5f7-3f27-422f-acde-42d3029cd963","Type":"ContainerStarted","Data":"55d14daad91114ee59c758ef225276f420e626aeea2935cb9e98c92f6a2070b8"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.190965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5tsjj" event={"ID":"01b809b1-7b62-4043-9411-7194d6e96e47","Type":"ContainerStarted","Data":"615faf6ed985fe140d6b8a92bf126c051c28f0974976cbe06b85ab0c6df5c48c"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.191023 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5tsjj" event={"ID":"01b809b1-7b62-4043-9411-7194d6e96e47","Type":"ContainerStarted","Data":"8a28a88c33fb5f534366e3887bc3af5978fcd92b0f6ea009b4585cf4c3488193"} Nov 22 08:24:40 crc kubenswrapper[4743]: W1122 08:24:40.204150 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0adcbd21_d1d6_4beb_8b26_74d16b534b91.slice/crio-d7a2fa2a98811d00c51f23b3eb256faeac265d0b297bc29c847830bcdc8e5f32 WatchSource:0}: Error finding container d7a2fa2a98811d00c51f23b3eb256faeac265d0b297bc29c847830bcdc8e5f32: Status 404 returned error can't find the container with id d7a2fa2a98811d00c51f23b3eb256faeac265d0b297bc29c847830bcdc8e5f32 Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.208432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.208842 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.708830478 +0000 UTC m=+154.415191530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.238092 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" event={"ID":"193a50c2-2643-4267-b722-a92cc83d3d40","Type":"ContainerStarted","Data":"b3171ebcf6c8631fa626fbfefacd6cf1a22cc17f6f1707a6243a1a1d05566f75"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.242708 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v5jx\" (UniqueName: \"kubernetes.io/projected/ca1bf802-2f8f-4de6-9d36-d0b3e6440865-kube-api-access-9v5jx\") pod \"csi-hostpathplugin-858dv\" (UID: \"ca1bf802-2f8f-4de6-9d36-d0b3e6440865\") " pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.246756 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" event={"ID":"16da58ac-db10-4cb2-a7e9-330ac883a480","Type":"ContainerStarted","Data":"cbf452af6e9aac91ece24d031b21826bd77a6d30fa51fb3215255ea34dc0dbe9"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.246820 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" event={"ID":"16da58ac-db10-4cb2-a7e9-330ac883a480","Type":"ContainerStarted","Data":"253c2821bce5bac3ec7c8e600cbc0aee3ec70a2ac2cbeb449f0cf4f9d4b4a09b"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.248831 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" event={"ID":"42e31b43-9e9c-4ab0-a6bf-3857b755b87a","Type":"ContainerStarted","Data":"227ba7bcb1066c00572daf163f3bf5c9a9555e322e18eb9212493511120c7c44"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.250619 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" event={"ID":"1b40a4ea-3227-449c-b318-8e47b7eeefc4","Type":"ContainerStarted","Data":"64aec063836816e724404c827f8d4687960a8643cea6070b790262b6b372d45e"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.253017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" event={"ID":"71f63f00-6812-4f35-ba1e-d1ea01a27a19","Type":"ContainerStarted","Data":"6eeea9a14b0c1807fc82fb11a9836cd1ccc104d7522d9467f790d236e700029c"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.275205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" event={"ID":"7352b0d1-4af7-49c9-8029-5a97c3cdf450","Type":"ContainerStarted","Data":"89a5eabb6fe817dc4324ec588f84a9fb7d3baa534a49470db166681429cbcde3"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.275256 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" 
event={"ID":"7352b0d1-4af7-49c9-8029-5a97c3cdf450","Type":"ContainerStarted","Data":"e8de8782ba7fd17f9d0071ec9f59bfad41f9cf3694cda58f988970b9604b3ca6"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.278593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" event={"ID":"3a221ced-5d72-41e5-8b49-d93ec52d53f5","Type":"ContainerStarted","Data":"15aa5d5514879e29dfede13c21710811c05fe3646d6ea88947401a14274fcaac"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.280453 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.288304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" event={"ID":"c7c71cb3-54e2-471f-a91f-50c146c4e3c8","Type":"ContainerStarted","Data":"f1ee2c8d74626c838fbf2cd9276144e3424218e1453ec6997256ecc308907aae"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.289552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rmjqq" event={"ID":"73592dc4-d2b3-42f7-9bec-346286516f23","Type":"ContainerStarted","Data":"1e6e300cbbc332f04e9977ff593d88c9c4edc71c3fe8d7e417fcfd090e51ba6a"} Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.310875 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.311044 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.811015175 +0000 UTC m=+154.517376227 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.311234 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.311541 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.81152945 +0000 UTC m=+154.517890502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.340173 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5kn9v"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.413254 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.413370 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.913348476 +0000 UTC m=+154.619709548 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.413848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.413546 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp" Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.414169 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:40.914156731 +0000 UTC m=+154.620517783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.446087 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-fbhcz" Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.463276 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-858dv" Nov 22 08:24:40 crc kubenswrapper[4743]: W1122 08:24:40.495356 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28f586ec_7a65_4c1e_9f09_845b812246b0.slice/crio-5388f12bb92ed66906ad47991f5bdf8a688fc4df0c5293fa382bc54aed76ff95 WatchSource:0}: Error finding container 5388f12bb92ed66906ad47991f5bdf8a688fc4df0c5293fa382bc54aed76ff95: Status 404 returned error can't find the container with id 5388f12bb92ed66906ad47991f5bdf8a688fc4df0c5293fa382bc54aed76ff95 Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.499420 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt"] Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.514747 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.014717619 +0000 UTC m=+154.721078671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.515162 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.515538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.515945 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.015935145 +0000 UTC m=+154.722296197 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.532362 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.618104 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.618359 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.118334079 +0000 UTC m=+154.824695131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.618718 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.619369 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.119352359 +0000 UTC m=+154.825713411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.646075 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b"] Nov 22 08:24:40 crc kubenswrapper[4743]: W1122 08:24:40.658097 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaff02729_0197_4f76_b43e_594c908f8312.slice/crio-2dedb15f18bfad7b88815151fb4cb92174f17f5f175aaba123a2091cf1eb18ac WatchSource:0}: Error finding container 2dedb15f18bfad7b88815151fb4cb92174f17f5f175aaba123a2091cf1eb18ac: Status 404 returned error can't find the container with id 2dedb15f18bfad7b88815151fb4cb92174f17f5f175aaba123a2091cf1eb18ac Nov 22 08:24:40 crc kubenswrapper[4743]: W1122 08:24:40.659462 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5564388b_e6dd_409f_a137_b34700967f4a.slice/crio-1286b581b3035331b4019abd3aa1c10c47b76fbb557fdebacd759baa7e89adac WatchSource:0}: Error finding container 1286b581b3035331b4019abd3aa1c10c47b76fbb557fdebacd759baa7e89adac: Status 404 returned error can't find the container with id 1286b581b3035331b4019abd3aa1c10c47b76fbb557fdebacd759baa7e89adac Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.677867 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72"] Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.720203 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.720605 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.220590518 +0000 UTC m=+154.926951570 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.823918 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.829166 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.329144586 +0000 UTC m=+155.035505638 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.836142 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr"] Nov 22 08:24:40 crc kubenswrapper[4743]: W1122 08:24:40.908913 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ee3fe1_7f79_4f58_91ae_94a3f046401f.slice/crio-07610b837a25750ca4b5d9f7b88527a8f222e4b2b7dc115b4a863d01c6c22e4b WatchSource:0}: Error finding container 07610b837a25750ca4b5d9f7b88527a8f222e4b2b7dc115b4a863d01c6c22e4b: Status 404 returned error can't find the container with id 07610b837a25750ca4b5d9f7b88527a8f222e4b2b7dc115b4a863d01c6c22e4b Nov 22 08:24:40 crc kubenswrapper[4743]: W1122 08:24:40.911707 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8f0d578_d25e_4f48_bca6_389c9b4fbd37.slice/crio-03e1367f631adee2e4e8785ada831a325d719d12de560b09e0fd5adc6193088c WatchSource:0}: Error finding container 03e1367f631adee2e4e8785ada831a325d719d12de560b09e0fd5adc6193088c: Status 404 returned error can't find the container with id 03e1367f631adee2e4e8785ada831a325d719d12de560b09e0fd5adc6193088c Nov 22 08:24:40 crc kubenswrapper[4743]: I1122 08:24:40.925845 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:40 crc kubenswrapper[4743]: E1122 08:24:40.926124 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.426108856 +0000 UTC m=+155.132469898 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:40 crc kubenswrapper[4743]: W1122 08:24:40.978293 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ecdcc2c_1d03_46ec_96e6_da1e04437140.slice/crio-94493246f3b4c214a497f5f223095c6ce935eadd1baae58d15e1a0ebaab776ee WatchSource:0}: Error finding container 94493246f3b4c214a497f5f223095c6ce935eadd1baae58d15e1a0ebaab776ee: Status 404 returned error can't find the container with id 94493246f3b4c214a497f5f223095c6ce935eadd1baae58d15e1a0ebaab776ee Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.028288 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.037161 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.537111897 +0000 UTC m=+155.243472949 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.089451 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7"] Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.128956 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.129512 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.6294966 +0000 UTC m=+155.335857652 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.133373 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm"] Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.195256 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-j5nkm"] Nov 22 08:24:41 crc kubenswrapper[4743]: W1122 08:24:41.217310 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod555c98c2_8078_4171_b6ec_1c2d0df9ae90.slice/crio-2294771f7ea4c5c55d45fb4709908275bf8e6150a0e59bd8861d61f4850ba97d WatchSource:0}: Error finding container 2294771f7ea4c5c55d45fb4709908275bf8e6150a0e59bd8861d61f4850ba97d: Status 404 returned error can't find the container with id 2294771f7ea4c5c55d45fb4709908275bf8e6150a0e59bd8861d61f4850ba97d Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.230346 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.230642 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.730626075 +0000 UTC m=+155.436987127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.230840 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qvtpl"] Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.253724 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s"] Nov 22 08:24:41 crc kubenswrapper[4743]: W1122 08:24:41.260983 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ce758eb_1848_4be1_b1d1_65373e1531d9.slice/crio-d61decc0ef94c91d7609bf26554a16d4918be0568b00c51692bfaa2b03cb87d7 WatchSource:0}: Error finding container d61decc0ef94c91d7609bf26554a16d4918be0568b00c51692bfaa2b03cb87d7: Status 404 returned error can't find the container with id d61decc0ef94c91d7609bf26554a16d4918be0568b00c51692bfaa2b03cb87d7 Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.264524 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws"] Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.305459 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd"] Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.320423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" event={"ID":"c7c71cb3-54e2-471f-a91f-50c146c4e3c8","Type":"ContainerStarted","Data":"8f9f51382444a2c2ec19d1e8c4cb75911928e72de0030413e8a01344933775b6"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.327062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" event={"ID":"28f586ec-7a65-4c1e-9f09-845b812246b0","Type":"ContainerStarted","Data":"5388f12bb92ed66906ad47991f5bdf8a688fc4df0c5293fa382bc54aed76ff95"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.330609 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" event={"ID":"71f63f00-6812-4f35-ba1e-d1ea01a27a19","Type":"ContainerStarted","Data":"a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.330921 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.331132 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.331342 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.831322547 +0000 UTC m=+155.537683589 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.332386 4743 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8c5mq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.332428 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" podUID="71f63f00-6812-4f35-ba1e-d1ea01a27a19" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.333633 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" event={"ID":"fa343393-45bb-4bde-a13b-1686db2c1979","Type":"ContainerStarted","Data":"11e6d9d6f7f15d4861e7173ba5cbffd8f98ff7b285af96ca698f83dcaacfdda2"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.333681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" event={"ID":"fa343393-45bb-4bde-a13b-1686db2c1979","Type":"ContainerStarted","Data":"d3cb00954c18a50719119688e43d663f1769e2010a826543800a0df4b5661d1e"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.341683 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" event={"ID":"9e14fb50-5723-489d-acc2-c5ca42234b73","Type":"ContainerStarted","Data":"672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.353030 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" event={"ID":"aff02729-0197-4f76-b43e-594c908f8312","Type":"ContainerStarted","Data":"2dedb15f18bfad7b88815151fb4cb92174f17f5f175aaba123a2091cf1eb18ac"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.361101 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" event={"ID":"1b40a4ea-3227-449c-b318-8e47b7eeefc4","Type":"ContainerStarted","Data":"3a3abd1e9e7ca1b0406431c5c59057ffc948ffe0bf4b9fb5290c0f6ecbf2d4ac"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.364480 4743 generic.go:334] "Generic (PLEG): container finished" podID="7352b0d1-4af7-49c9-8029-5a97c3cdf450" containerID="89a5eabb6fe817dc4324ec588f84a9fb7d3baa534a49470db166681429cbcde3" exitCode=0 Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.364737 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" event={"ID":"7352b0d1-4af7-49c9-8029-5a97c3cdf450","Type":"ContainerDied","Data":"89a5eabb6fe817dc4324ec588f84a9fb7d3baa534a49470db166681429cbcde3"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.369838 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-fbhcz" event={"ID":"7491ded8-c8fd-4838-b3b7-7a8ce3946e07","Type":"ContainerStarted","Data":"df846d8b2fd1475d65dbcca9ffff8e498a2e4f8883cb7be910f19e77a1a8d93b"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.372263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" event={"ID":"aabe2e04-2ba1-4719-a4e9-8e763c6e3659","Type":"ContainerStarted","Data":"51fb716db3385df0b6217fd0b135e530cde656926a5af5569970730ab5a51409"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.372298 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" event={"ID":"aabe2e04-2ba1-4719-a4e9-8e763c6e3659","Type":"ContainerStarted","Data":"6f5a8139a6c918df02702ab47ded2a511c6eb6b858b378a0c51515d5d4309caa"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.373996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hhpxp" event={"ID":"bead015e-e8e8-44f2-8dae-41047cd66706","Type":"ContainerStarted","Data":"3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.374052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hhpxp" event={"ID":"bead015e-e8e8-44f2-8dae-41047cd66706","Type":"ContainerStarted","Data":"1febf60a2946ceb6e5b87daa709f1e3e29f258386c8dd565b74601d96a835da7"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.375058 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" event={"ID":"45f0877c-996f-4f4c-aa19-970fc7cd0459","Type":"ContainerStarted","Data":"51ba4013505c1e5d377a3704a9427b8bb544a02df0278f075c92b6ef547c3baa"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.377162 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" event={"ID":"b72706e5-e53f-4c1f-81fa-6b850a062076","Type":"ContainerStarted","Data":"7c4d9a53be746b646a98c38fff4d45ced6d9a2a2d3b586cc4ba1b9a03c3d8072"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.379010 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm" event={"ID":"555c98c2-8078-4171-b6ec-1c2d0df9ae90","Type":"ContainerStarted","Data":"2294771f7ea4c5c55d45fb4709908275bf8e6150a0e59bd8861d61f4850ba97d"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.383008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" event={"ID":"3a221ced-5d72-41e5-8b49-d93ec52d53f5","Type":"ContainerStarted","Data":"25006d34ad37270860ad26b65ff8b8aab474c12d553af77feb2d86178a5dfb97"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.383222 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:41 crc 
kubenswrapper[4743]: I1122 08:24:41.384420 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b" event={"ID":"a8f0d578-d25e-4f48-bca6-389c9b4fbd37","Type":"ContainerStarted","Data":"03e1367f631adee2e4e8785ada831a325d719d12de560b09e0fd5adc6193088c"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.385018 4743 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-r6rj6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.385054 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" podUID="3a221ced-5d72-41e5-8b49-d93ec52d53f5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.386678 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72" event={"ID":"5609b092-3caf-445b-99e0-edaff20d65a2","Type":"ContainerStarted","Data":"307318acfa37dfa0f962bd526ff6203dfb5c71c733903dc242db5f3332f41f84"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.388163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sb8ph" event={"ID":"1eb0462a-95bc-435f-825a-59fc93898e81","Type":"ContainerStarted","Data":"687459df353e2ba88128341703c8ee8c97e9defb868fc0e753c794c2c795dc6c"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.388216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sb8ph" event={"ID":"1eb0462a-95bc-435f-825a-59fc93898e81","Type":"ContainerStarted","Data":"01084e6a4ff7c08523fa4c35bd7bd6fe79ab8f012cf36c9623d4d71be8222e7b"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.398414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm" event={"ID":"1ce758eb-1848-4be1-b1d1-65373e1531d9","Type":"ContainerStarted","Data":"d61decc0ef94c91d7609bf26554a16d4918be0568b00c51692bfaa2b03cb87d7"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.402320 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" event={"ID":"0adcbd21-d1d6-4beb-8b26-74d16b534b91","Type":"ContainerStarted","Data":"d7a2fa2a98811d00c51f23b3eb256faeac265d0b297bc29c847830bcdc8e5f32"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.404342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr" event={"ID":"6ecdcc2c-1d03-46ec-96e6-da1e04437140","Type":"ContainerStarted","Data":"94493246f3b4c214a497f5f223095c6ce935eadd1baae58d15e1a0ebaab776ee"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.405990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" event={"ID":"506d451b-5cf3-44fe-be73-9d43abbbf9a8","Type":"ContainerStarted","Data":"b129e856437b39de4e8636ccf5d5d2b5de93a1b8a797ec74c8d285c831ff015e"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.407247 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" event={"ID":"42e31b43-9e9c-4ab0-a6bf-3857b755b87a","Type":"ContainerStarted","Data":"f2449e0504750d747d8b378b329ab5794e77efbc79c2b43c8f2ec649b379ace2"} Nov 22 08:24:41 crc kubenswrapper[4743]: W1122 08:24:41.410388 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27bd3602_b31d_4b17_9902_72cac7b5580f.slice/crio-5988385e8831bf18b1e81c59b55cdc3d726424a4bdc6f23b7e28faba5bbfd4dd WatchSource:0}: Error finding container 5988385e8831bf18b1e81c59b55cdc3d726424a4bdc6f23b7e28faba5bbfd4dd: Status 404 returned error can't find the container with id 5988385e8831bf18b1e81c59b55cdc3d726424a4bdc6f23b7e28faba5bbfd4dd Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.431840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" event={"ID":"352175cc-5065-4a1f-ac24-7ae82d39b87d","Type":"ContainerStarted","Data":"a5fc8feb6c5aae7fca76e55b6350d33dbc6dda46663242feca52dbf98e92165d"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.432399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.432879 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:41.932861074 +0000 UTC m=+155.639222126 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.433567 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl" event={"ID":"49cb4faf-6b21-4097-8f6c-24a310cff149","Type":"ContainerStarted","Data":"598bef6fbd711a61521217da9ade529669c4954b3472e85b8bb32771e51fca71"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.442712 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" event={"ID":"193a50c2-2643-4267-b722-a92cc83d3d40","Type":"ContainerStarted","Data":"a1c5eefe27a90532794e3d3ccfb91d7e1a83e8047dcab1fc3e48bca594848eff"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.444375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv" event={"ID":"2bbdf845-aa39-41ed-a45c-75ac4ba8e894","Type":"ContainerStarted","Data":"405bc010d1042f70d17ed3b45d0bb6e68ccd7074ddbd52268d49f21e04903cd7"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.446553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" event={"ID":"5564388b-e6dd-409f-a137-b34700967f4a","Type":"ContainerStarted","Data":"1286b581b3035331b4019abd3aa1c10c47b76fbb557fdebacd759baa7e89adac"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.449433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-v99cs" event={"ID":"c0ee3fe1-7f79-4f58-91ae-94a3f046401f","Type":"ContainerStarted","Data":"07610b837a25750ca4b5d9f7b88527a8f222e4b2b7dc115b4a863d01c6c22e4b"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.455236 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" event={"ID":"07f6c2e0-4230-40e0-ad71-2f652546cd38","Type":"ContainerStarted","Data":"80de222ade7a1064bb751e5a3896645a34501e9fefd15d6925d827424e705fef"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.467904 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" event={"ID":"b65df5f7-3f27-422f-acde-42d3029cd963","Type":"ContainerStarted","Data":"d7e84cf4456b971606029e37b5d67ec22b1d9032c52f5886d284420573c4362d"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.483255 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-858dv"] Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.491287 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v" event={"ID":"a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5","Type":"ContainerStarted","Data":"8c09e6dc0f2368dbb2fef7d354eaa6046636091be257977ed6746a7771000453"} Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.491622 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5tsjj" Nov 22 08:24:41 crc 
kubenswrapper[4743]: I1122 08:24:41.493683 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.493717 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.533643 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.536108 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.036078282 +0000 UTC m=+155.742439344 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.536394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.539426 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.039412532 +0000 UTC m=+155.745773584 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.543448 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp"] Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.637972 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.639382 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.139366522 +0000 UTC m=+155.845727574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: W1122 08:24:41.679200 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf33a5b12_a9fd_4e23_ac78_9ddf2a4c50ba.slice/crio-841c518c14941c81e8d1e42417377cf7fdfb29551f436b8541fe3209ccbb504f WatchSource:0}: Error finding container 841c518c14941c81e8d1e42417377cf7fdfb29551f436b8541fe3209ccbb504f: Status 404 returned error can't find the container with id 841c518c14941c81e8d1e42417377cf7fdfb29551f436b8541fe3209ccbb504f Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.686433 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-5tsjj" podStartSLOduration=133.686406164 podStartE2EDuration="2m13.686406164s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:41.679458145 +0000 UTC m=+155.385819197" watchObservedRunningTime="2025-11-22 08:24:41.686406164 +0000 UTC m=+155.392767216" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.721832 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fwvd8" podStartSLOduration=133.721811737 podStartE2EDuration="2m13.721811737s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:41.719824907 +0000 UTC m=+155.426185959" 
watchObservedRunningTime="2025-11-22 08:24:41.721811737 +0000 UTC m=+155.428172809" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.740697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.741064 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.241047804 +0000 UTC m=+155.947408856 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.771038 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zqhp9" podStartSLOduration=133.771002373 podStartE2EDuration="2m13.771002373s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:41.766761046 +0000 UTC m=+155.473122098" watchObservedRunningTime="2025-11-22 08:24:41.771002373 +0000 UTC m=+155.477363425" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.843097 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" podStartSLOduration=133.843077746 podStartE2EDuration="2m13.843077746s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:41.799330753 +0000 UTC m=+155.505691805" watchObservedRunningTime="2025-11-22 08:24:41.843077746 +0000 UTC m=+155.549438788" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.845947 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.846182 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.346168589 +0000 UTC m=+156.052529641 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.881886 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-cq9j5" podStartSLOduration=133.88186575 podStartE2EDuration="2m13.88186575s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:41.845510249 +0000 UTC m=+155.551871301" watchObservedRunningTime="2025-11-22 08:24:41.88186575 +0000 UTC m=+155.588226802" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.882429 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-6p2jm" podStartSLOduration=133.882423257 podStartE2EDuration="2m13.882423257s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:41.881270452 +0000 UTC m=+155.587631514" watchObservedRunningTime="2025-11-22 08:24:41.882423257 +0000 UTC m=+155.588784309" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.922047 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" podStartSLOduration=133.922027736 podStartE2EDuration="2m13.922027736s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:41.919468649 +0000 UTC m=+155.625829711" watchObservedRunningTime="2025-11-22 08:24:41.922027736 +0000 UTC m=+155.628388788" Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.947330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:41 crc kubenswrapper[4743]: E1122 08:24:41.947666 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.447652905 +0000 UTC m=+156.154013947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:41 crc kubenswrapper[4743]: I1122 08:24:41.963911 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-qst5g" podStartSLOduration=132.963839501 podStartE2EDuration="2m12.963839501s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:41.961193261 +0000 UTC m=+155.667554313" watchObservedRunningTime="2025-11-22 08:24:41.963839501 +0000 UTC m=+155.670200553" Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.048605 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.049129 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.54911224 +0000 UTC m=+156.255473292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.149839 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.150244 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.650227605 +0000 UTC m=+156.356588657 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.251952 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.252217 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.752172904 +0000 UTC m=+156.458533976 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.252625 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.253057 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.75303398 +0000 UTC m=+156.459395042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.353894 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.354340 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.85430403 +0000 UTC m=+156.560665082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.354392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.354752 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.854744373 +0000 UTC m=+156.561105425 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.456073 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.456893 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:42.956874928 +0000 UTC m=+156.663235980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.500684 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s" event={"ID":"2619251a-fbe2-425c-bf25-5c0fafb3965a","Type":"ContainerStarted","Data":"3f115512940d7746ba8043127569ddc48a8431112d0af635196732adab5b5ca0"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.502990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws" event={"ID":"27bd3602-b31d-4b17-9902-72cac7b5580f","Type":"ContainerStarted","Data":"5988385e8831bf18b1e81c59b55cdc3d726424a4bdc6f23b7e28faba5bbfd4dd"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.503959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qvtpl" event={"ID":"ef4673e8-a499-4df4-8ee9-987b41ee501f","Type":"ContainerStarted","Data":"60512cca737444dc88e4049f53ea8c41c0ddd8e8c8c7ba850cea6d1572d64fb2"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.505238 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" event={"ID":"07f6c2e0-4230-40e0-ad71-2f652546cd38","Type":"ContainerStarted","Data":"5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.506442 4743 generic.go:334] "Generic (PLEG): container finished" podID="c7c71cb3-54e2-471f-a91f-50c146c4e3c8" containerID="8f9f51382444a2c2ec19d1e8c4cb75911928e72de0030413e8a01344933775b6" exitCode=0 Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.506497 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" 
event={"ID":"c7c71cb3-54e2-471f-a91f-50c146c4e3c8","Type":"ContainerDied","Data":"8f9f51382444a2c2ec19d1e8c4cb75911928e72de0030413e8a01344933775b6"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.508697 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rmjqq" event={"ID":"73592dc4-d2b3-42f7-9bec-346286516f23","Type":"ContainerStarted","Data":"586bbe4c252a2f54d6139cbaa98760b7b9edaf49794f5426f5ca90ff1a7b22b4"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.510866 4743 generic.go:334] "Generic (PLEG): container finished" podID="b72706e5-e53f-4c1f-81fa-6b850a062076" containerID="27052275693bbd28bf4652f697fedb4577a9006a442d9d1361b2e52fec9c79dc" exitCode=0 Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.510909 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" event={"ID":"b72706e5-e53f-4c1f-81fa-6b850a062076","Type":"ContainerDied","Data":"27052275693bbd28bf4652f697fedb4577a9006a442d9d1361b2e52fec9c79dc"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.511687 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp" event={"ID":"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba","Type":"ContainerStarted","Data":"841c518c14941c81e8d1e42417377cf7fdfb29551f436b8541fe3209ccbb504f"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.512388 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd" event={"ID":"6f847f14-06dc-46ee-ac13-08e1f8c86ae4","Type":"ContainerStarted","Data":"0167bc897af72530d693ee046981e8ec696504381e8cf8ea8d725c69486014ab"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.513042 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-858dv" event={"ID":"ca1bf802-2f8f-4de6-9d36-d0b3e6440865","Type":"ContainerStarted","Data":"87ac8077af106b746669d83393c48498a9eda9922d3190a71239d8b58060bdc1"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.514361 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl" event={"ID":"49cb4faf-6b21-4097-8f6c-24a310cff149","Type":"ContainerStarted","Data":"30c34fb14dc107d562a804a7005ace279d1334788ec102235cd0f64e88e069fa"} Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.518228 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.518311 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.518341 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.518667 4743 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8c5mq container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.518687 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" podUID="71f63f00-6812-4f35-ba1e-d1ea01a27a19" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.518746 4743 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-r6rj6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.518765 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" podUID="3a221ced-5d72-41e5-8b49-d93ec52d53f5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.536511 4743 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-42kzd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.536664 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" podUID="9e14fb50-5723-489d-acc2-c5ca42234b73" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.560418 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.562408 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.062396655 +0000 UTC m=+156.768757707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.585482 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" podStartSLOduration=134.585463688 podStartE2EDuration="2m14.585463688s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:42.583318543 +0000 UTC m=+156.289679595" watchObservedRunningTime="2025-11-22 08:24:42.585463688 +0000 UTC m=+156.291824730" Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.662215 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.662528 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.1625099 +0000 UTC m=+156.868870952 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.763334 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.763747 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.263731338 +0000 UTC m=+156.970092390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.863905 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.864120 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.36409243 +0000 UTC m=+157.070453482 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.864528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.864811 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.364799981 +0000 UTC m=+157.071161033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.965338 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.965444 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.465415741 +0000 UTC m=+157.171776803 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:42 crc kubenswrapper[4743]: I1122 08:24:42.965535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:42 crc kubenswrapper[4743]: E1122 08:24:42.965891 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.465879875 +0000 UTC m=+157.172240927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.069972 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:43 crc kubenswrapper[4743]: E1122 08:24:43.070463 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.570447344 +0000 UTC m=+157.276808396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.172043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:43 crc kubenswrapper[4743]: E1122 08:24:43.172447 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.672428244 +0000 UTC m=+157.378789356 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.273030 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:43 crc kubenswrapper[4743]: E1122 08:24:43.273706 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.773686784 +0000 UTC m=+157.480047836 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.374125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:43 crc kubenswrapper[4743]: E1122 08:24:43.374434 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.874422797 +0000 UTC m=+157.580783849 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.474927 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:43 crc kubenswrapper[4743]: E1122 08:24:43.475191 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:43.975164521 +0000 UTC m=+157.681525573 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.535489 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" event={"ID":"16da58ac-db10-4cb2-a7e9-330ac883a480","Type":"ContainerStarted","Data":"713bd061f9f3cd1aa99b0971865974cac0073223532ceb3e6e03352a5eab0e70"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.542765 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" event={"ID":"28f586ec-7a65-4c1e-9f09-845b812246b0","Type":"ContainerStarted","Data":"2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.543972 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.550601 4743 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rrn7z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.550634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl" event={"ID":"49cb4faf-6b21-4097-8f6c-24a310cff149","Type":"ContainerStarted","Data":"a9bef365a400cd1e219e01d6d5d7466d877c93f35dd7ca6cc26ce364912be9da"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.550662 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" 
podUID="28f586ec-7a65-4c1e-9f09-845b812246b0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.553006 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qvtpl" event={"ID":"ef4673e8-a499-4df4-8ee9-987b41ee501f","Type":"ContainerStarted","Data":"317959a860b64e37c04b2df5bf0ee4d72da18568127b549e60cc9675879d3b68"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.558456 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gw2gb" podStartSLOduration=135.55843802 podStartE2EDuration="2m15.55843802s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.557138331 +0000 UTC m=+157.263499383" watchObservedRunningTime="2025-11-22 08:24:43.55843802 +0000 UTC m=+157.264799072" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.573900 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm" event={"ID":"555c98c2-8078-4171-b6ec-1c2d0df9ae90","Type":"ContainerStarted","Data":"ef32ab410781c79add825825e7a3ab1a9fd322f4d06a197e2b9ef9d04b6bf19d"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.577650 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:43 crc kubenswrapper[4743]: E1122 08:24:43.585388 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.085370728 +0000 UTC m=+157.791731780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.587288 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-gnsjl" podStartSLOduration=135.587270065 podStartE2EDuration="2m15.587270065s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.585047158 +0000 UTC m=+157.291408211" watchObservedRunningTime="2025-11-22 08:24:43.587270065 +0000 UTC m=+157.293631117" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.592113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" event={"ID":"aff02729-0197-4f76-b43e-594c908f8312","Type":"ContainerStarted","Data":"11a9001841dce6a7a53cc3fa7f9184dfef5217088521bea5a2111cf00f5c17be"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.597375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" event={"ID":"7352b0d1-4af7-49c9-8029-5a97c3cdf450","Type":"ContainerStarted","Data":"d014e5af6311fc16b78492c26ab7ba9cb4d6dd4b9ddd62ddd54160781041dddf"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.597635 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.600020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s" event={"ID":"2619251a-fbe2-425c-bf25-5c0fafb3965a","Type":"ContainerStarted","Data":"5de822d5c0a569a6fc7f99ff1163fd279d62bbdb1f49e3291891af6557f1fc08"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.608017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" event={"ID":"c7c71cb3-54e2-471f-a91f-50c146c4e3c8","Type":"ContainerStarted","Data":"bc2dbce4bc3e58f527f507575a3c81ca0eb9b3423ec11f8d14ff63cb71836b3e"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.608804 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" podStartSLOduration=134.608786111 podStartE2EDuration="2m14.608786111s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.606161992 +0000 UTC m=+157.312523054" watchObservedRunningTime="2025-11-22 08:24:43.608786111 +0000 UTC m=+157.315147163" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.611689 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd" 
event={"ID":"6f847f14-06dc-46ee-ac13-08e1f8c86ae4","Type":"ContainerStarted","Data":"98060f26899cf3f9296fa0652bf9e4fa287e48ea67ac02f9994e0015c8238564"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.614607 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm" event={"ID":"1ce758eb-1848-4be1-b1d1-65373e1531d9","Type":"ContainerStarted","Data":"62b5199199a9eca7cbe859d2719a52094c6d355aab5ff023f1f88d67ddf54174"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.617460 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws" event={"ID":"27bd3602-b31d-4b17-9902-72cac7b5580f","Type":"ContainerStarted","Data":"563c993394686d1106be332a6724f57eb7579962967ddbf8eca066b36ca74822"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.624025 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-6r56v" podStartSLOduration=135.624005008 podStartE2EDuration="2m15.624005008s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.623287706 +0000 UTC m=+157.329648758" watchObservedRunningTime="2025-11-22 08:24:43.624005008 +0000 UTC m=+157.330366060" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.626177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" event={"ID":"5564388b-e6dd-409f-a137-b34700967f4a","Type":"ContainerStarted","Data":"9aadc3a0986e123bcf0a362e1c9ed8598762bf3912aa6ade14a91b173e16fc04"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.629154 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp" event={"ID":"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba","Type":"ContainerStarted","Data":"d3ca083d5c7be286486b70a03dd96026a3018f0a2223932f6530e8d536f5c1ad"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.631309 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-v99cs" event={"ID":"c0ee3fe1-7f79-4f58-91ae-94a3f046401f","Type":"ContainerStarted","Data":"051a054b47f4a2e92356c09337740d7f3623a9d61fb91467b18a89fb7a9d375c"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.632514 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" event={"ID":"0adcbd21-d1d6-4beb-8b26-74d16b534b91","Type":"ContainerStarted","Data":"49097e65fbf4edf26122f196d9f0c9527fdfde65d785f68309c0b4de58f9d1cb"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.639723 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" podStartSLOduration=135.639705619 podStartE2EDuration="2m15.639705619s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.638797662 +0000 UTC m=+157.345158714" watchObservedRunningTime="2025-11-22 08:24:43.639705619 +0000 UTC m=+157.346066671" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.640626 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-server-fbhcz" event={"ID":"7491ded8-c8fd-4838-b3b7-7a8ce3946e07","Type":"ContainerStarted","Data":"72257d74301e71cd1f66d307e5e16fae012f7f16de508f7c16694bcec61a9811"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.642719 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v" event={"ID":"a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5","Type":"ContainerStarted","Data":"30708d4f5d49e36dee4ecc8fdd0754ccfd1a4ced2d4076bd984c97139cd66872"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.644666 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" event={"ID":"352175cc-5065-4a1f-ac24-7ae82d39b87d","Type":"ContainerStarted","Data":"06787c5a01ca1c1d4a04ffdb41521d91182050e3dcd86af22e41f15afae71079"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.644698 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" event={"ID":"352175cc-5065-4a1f-ac24-7ae82d39b87d","Type":"ContainerStarted","Data":"4e63df7bc77758d5abf397712f72556ffae67622408b0aaf7b2aa386ae56d48b"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.645822 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr" event={"ID":"6ecdcc2c-1d03-46ec-96e6-da1e04437140","Type":"ContainerStarted","Data":"deef9c064125aaf4a5bb52957d3b472e21a2a53eaa4466fdb79272cdc756cbd2"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.648299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" event={"ID":"506d451b-5cf3-44fe-be73-9d43abbbf9a8","Type":"ContainerStarted","Data":"50a90ce70ec023c3d44b82d4f3874cc7ebf9860f22bede94fae3af9f6dc830a8"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.655038 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b" event={"ID":"a8f0d578-d25e-4f48-bca6-389c9b4fbd37","Type":"ContainerStarted","Data":"6503c2c45dfd18678477dfc854200ea9e51236e7808ffbc457d658a4d2dd7445"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.683870 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:43 crc kubenswrapper[4743]: E1122 08:24:43.685355 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.185339969 +0000 UTC m=+157.891701021 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.685934 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qrt7s" podStartSLOduration=135.685920686 podStartE2EDuration="2m15.685920686s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.655662938 +0000 UTC m=+157.362024010" watchObservedRunningTime="2025-11-22 08:24:43.685920686 +0000 UTC m=+157.392281748" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.687915 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72" event={"ID":"5609b092-3caf-445b-99e0-edaff20d65a2","Type":"ContainerStarted","Data":"a7d45fdcb7a2c0deca57dcf89ee142fe345f84b212218c351ff55889b495c7e0"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.689135 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.691998 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" podStartSLOduration=135.691984318 podStartE2EDuration="2m15.691984318s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.685205965 +0000 UTC m=+157.391567017" watchObservedRunningTime="2025-11-22 08:24:43.691984318 +0000 UTC m=+157.398345370" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.694775 4743 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4kn72 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.694836 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72" podUID="5609b092-3caf-445b-99e0-edaff20d65a2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.699493 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv" event={"ID":"2bbdf845-aa39-41ed-a45c-75ac4ba8e894","Type":"ContainerStarted","Data":"5b489a4b6c2accc15eb1228fdc33cf25c7b3a3ff0fae3cc2c7d59047ddbbe742"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.704675 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-j5nkm" podStartSLOduration=134.704657508 
podStartE2EDuration="2m14.704657508s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.698985658 +0000 UTC m=+157.405346720" watchObservedRunningTime="2025-11-22 08:24:43.704657508 +0000 UTC m=+157.411018560" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.705106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" event={"ID":"45f0877c-996f-4f4c-aa19-970fc7cd0459","Type":"ContainerStarted","Data":"56c02e0a807ad1dd9e8a21ac6c40e18cec5253f220f925727ca175d98bd1dfe7"} Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.705160 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.706639 4743 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-hn6c7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.706681 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" podUID="45f0877c-996f-4f4c-aa19-970fc7cd0459" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.706873 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.707204 4743 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-42kzd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.707233 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" podUID="9e14fb50-5723-489d-acc2-c5ca42234b73" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.707268 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.707532 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dm8fj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.707563 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" podUID="07f6c2e0-4230-40e0-ad71-2f652546cd38" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Nov 22 
08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.710728 4743 patch_prober.go:28] interesting pod/console-operator-58897d9998-rmjqq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.710782 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rmjqq" podUID="73592dc4-d2b3-42f7-9bec-346286516f23" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.723254 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzpfd" podStartSLOduration=135.723217615 podStartE2EDuration="2m15.723217615s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.722183094 +0000 UTC m=+157.428544146" watchObservedRunningTime="2025-11-22 08:24:43.723217615 +0000 UTC m=+157.429578667" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.778007 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-v99cs" podStartSLOduration=135.777993239 podStartE2EDuration="2m15.777993239s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.749924017 +0000 UTC m=+157.456285079" watchObservedRunningTime="2025-11-22 08:24:43.777993239 +0000 UTC m=+157.484354291" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.779070 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hhpxp" podStartSLOduration=135.779062992 podStartE2EDuration="2m15.779062992s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.777539756 +0000 UTC m=+157.483900808" watchObservedRunningTime="2025-11-22 08:24:43.779062992 +0000 UTC m=+157.485424044" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.787199 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:43 crc kubenswrapper[4743]: E1122 08:24:43.788624 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.288607498 +0000 UTC m=+157.994968550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.862821 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nc42b" podStartSLOduration=135.862796685 podStartE2EDuration="2m15.862796685s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.807304699 +0000 UTC m=+157.513665751" watchObservedRunningTime="2025-11-22 08:24:43.862796685 +0000 UTC m=+157.569157737" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.882279 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sb8ph" podStartSLOduration=7.882261649 podStartE2EDuration="7.882261649s" podCreationTimestamp="2025-11-22 08:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.829779594 +0000 UTC m=+157.536140636" watchObservedRunningTime="2025-11-22 08:24:43.882261649 +0000 UTC m=+157.588622691" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.888695 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:43 crc kubenswrapper[4743]: E1122 08:24:43.889087 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.389064573 +0000 UTC m=+158.095425625 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.929891 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-m8pkf" podStartSLOduration=134.929865598 podStartE2EDuration="2m14.929865598s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.90096963 +0000 UTC m=+157.607330682" watchObservedRunningTime="2025-11-22 08:24:43.929865598 +0000 UTC m=+157.636226650" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.966357 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" podStartSLOduration=135.966339442 podStartE2EDuration="2m15.966339442s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.931817766 +0000 UTC m=+157.638178818" watchObservedRunningTime="2025-11-22 08:24:43.966339442 +0000 UTC m=+157.672700494" Nov 22 08:24:43 crc kubenswrapper[4743]: I1122 08:24:43.990197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:43 crc kubenswrapper[4743]: E1122 08:24:43.990522 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.490510858 +0000 UTC m=+158.196871910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.000144 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" podStartSLOduration=135.000131967 podStartE2EDuration="2m15.000131967s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.999402005 +0000 UTC m=+157.705763057" watchObservedRunningTime="2025-11-22 08:24:44.000131967 +0000 UTC m=+157.706493019" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.001023 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72" podStartSLOduration=135.001015953 podStartE2EDuration="2m15.001015953s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:43.967862798 +0000 UTC m=+157.674223850" watchObservedRunningTime="2025-11-22 08:24:44.001015953 +0000 UTC m=+157.707377005" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.024801 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-fbhcz" podStartSLOduration=8.024784516 podStartE2EDuration="8.024784516s" podCreationTimestamp="2025-11-22 08:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.0222528 +0000 UTC m=+157.728613852" watchObservedRunningTime="2025-11-22 08:24:44.024784516 +0000 UTC m=+157.731145558" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.050332 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-rmjqq" podStartSLOduration=136.050317533 podStartE2EDuration="2m16.050317533s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.050103886 +0000 UTC m=+157.756464938" watchObservedRunningTime="2025-11-22 08:24:44.050317533 +0000 UTC m=+157.756678585" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.068535 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-pg7xr" podStartSLOduration=135.068515279 podStartE2EDuration="2m15.068515279s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.066077696 +0000 UTC m=+157.772438748" watchObservedRunningTime="2025-11-22 08:24:44.068515279 +0000 UTC m=+157.774876331" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.091759 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.092046 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.592031735 +0000 UTC m=+158.298392787 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.132053 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-v99cs" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.134317 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.134374 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.192910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.193337 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.693322895 +0000 UTC m=+158.399683947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.294125 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.294704 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.794677237 +0000 UTC m=+158.501038289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.395799 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.396325 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.896302937 +0000 UTC m=+158.602663989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.497889 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.498286 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.998247667 +0000 UTC m=+158.704608729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.498428 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.499020 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:44.999000669 +0000 UTC m=+158.705361721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.599933 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.600150 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.100105044 +0000 UTC m=+158.806466096 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.600309 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.600717 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.100707762 +0000 UTC m=+158.807068814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.701640 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.702400 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.202376563 +0000 UTC m=+158.908737615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.720171 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s" event={"ID":"2619251a-fbe2-425c-bf25-5c0fafb3965a","Type":"ContainerStarted","Data":"efd07954fb757a6adfb016529204669b7ec002b066d6191133ac50e1e7584833"}
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.720332 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s"
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.723154 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws" event={"ID":"27bd3602-b31d-4b17-9902-72cac7b5580f","Type":"ContainerStarted","Data":"9e0e09313c8c0daa69b9f6f58d7960049d468708aee0e23d0d3c7bb07d520173"}
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.725597 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qvtpl" event={"ID":"ef4673e8-a499-4df4-8ee9-987b41ee501f","Type":"ContainerStarted","Data":"dcdd22e21b150197e1a50bed73466dacfc7fb953e129fb0a4a7f6b495c21ec41"}
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.726234 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-qvtpl"
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.727764 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v" event={"ID":"a2e27dbc-68a8-4dd0-93e7-cfd3c8f11ad5","Type":"ContainerStarted","Data":"090ab194eb260086d770e2ba84583c2fede87e186bb256035f9e0db194b2d70e"}
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.729497 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv" event={"ID":"2bbdf845-aa39-41ed-a45c-75ac4ba8e894","Type":"ContainerStarted","Data":"94a2a333068b85e749a03089bddf0ec22a7229580450906528a09f1c819476e1"}
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.731887 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" event={"ID":"b72706e5-e53f-4c1f-81fa-6b850a062076","Type":"ContainerStarted","Data":"df9257f2941c099f347e6614e8da068c444b77376ae089d32afe18a10f47ae6c"}
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.733959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm" event={"ID":"555c98c2-8078-4171-b6ec-1c2d0df9ae90","Type":"ContainerStarted","Data":"b98c6541ab7141233bd800f6ae92e79542bb28003ab424bc1e97c49c5c5affe3"}
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.736486 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp" event={"ID":"f33a5b12-a9fd-4e23-ac78-9ddf2a4c50ba","Type":"ContainerStarted","Data":"e7c3849c78b717948aab226a4b0276b28d7c552f84cfad6bb24c0504dfbc4665"}
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.737534 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-dgk7t" podStartSLOduration=136.737521998 podStartE2EDuration="2m16.737521998s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.108948543 +0000 UTC m=+157.815309595" watchObservedRunningTime="2025-11-22 08:24:44.737521998 +0000 UTC m=+158.443883050"
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.739652 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" event={"ID":"c7c71cb3-54e2-471f-a91f-50c146c4e3c8","Type":"ContainerStarted","Data":"6d5b0c42585243241b38c32ddbba1ad0ddfeca93b5b4d0f6f2d6fe330f67df47"}
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.740275 4743 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-4kn72 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body=
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.740314 4743 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rrn7z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.740320 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72" podUID="5609b092-3caf-445b-99e0-edaff20d65a2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused"
Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.740353 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" podUID="28f586ec-7a65-4c1e-9f09-845b812246b0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused"
podUID="28f586ec-7a65-4c1e-9f09-845b812246b0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.740315 4743 patch_prober.go:28] interesting pod/console-operator-58897d9998-rmjqq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.740393 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rmjqq" podUID="73592dc4-d2b3-42f7-9bec-346286516f23" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.740530 4743 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-hn6c7 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.740591 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" podUID="45f0877c-996f-4f4c-aa19-970fc7cd0459" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.740605 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s" podStartSLOduration=135.74059107 podStartE2EDuration="2m15.74059107s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.734775786 +0000 UTC m=+158.441136838" watchObservedRunningTime="2025-11-22 08:24:44.74059107 +0000 UTC m=+158.446952122" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.746071 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.758213 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lbrsm" podStartSLOduration=136.758193458 podStartE2EDuration="2m16.758193458s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.756027393 +0000 UTC m=+158.462388445" watchObservedRunningTime="2025-11-22 08:24:44.758193458 +0000 UTC m=+158.464554510" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.796897 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" podStartSLOduration=135.796874719 podStartE2EDuration="2m15.796874719s" podCreationTimestamp="2025-11-22 08:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-22 08:24:44.779861059 +0000 UTC m=+158.486222111" watchObservedRunningTime="2025-11-22 08:24:44.796874719 +0000 UTC m=+158.503235761" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.798204 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5kn9v" podStartSLOduration=136.798196859 podStartE2EDuration="2m16.798196859s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.795527749 +0000 UTC m=+158.501888801" watchObservedRunningTime="2025-11-22 08:24:44.798196859 +0000 UTC m=+158.504557911" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.804202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.807177 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.307160908 +0000 UTC m=+159.013521960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.829761 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qvtpl" podStartSLOduration=7.829737745 podStartE2EDuration="7.829737745s" podCreationTimestamp="2025-11-22 08:24:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.813811877 +0000 UTC m=+158.520172939" watchObservedRunningTime="2025-11-22 08:24:44.829737745 +0000 UTC m=+158.536098807" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.853411 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-9fhcv" podStartSLOduration=136.853393745 podStartE2EDuration="2m16.853393745s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.850829418 +0000 UTC m=+158.557190470" watchObservedRunningTime="2025-11-22 08:24:44.853393745 +0000 UTC m=+158.559754797" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.901217 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7rcws" podStartSLOduration=136.901194429 podStartE2EDuration="2m16.901194429s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.899843169 +0000 UTC m=+158.606204221" watchObservedRunningTime="2025-11-22 08:24:44.901194429 +0000 UTC m=+158.607555481" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.914967 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:44 crc kubenswrapper[4743]: E1122 08:24:44.915352 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.415336094 +0000 UTC m=+159.121697146 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.959545 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" podStartSLOduration=136.95952457 podStartE2EDuration="2m16.95952457s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.959229311 +0000 UTC m=+158.665590353" watchObservedRunningTime="2025-11-22 08:24:44.95952457 +0000 UTC m=+158.665885642" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.960304 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-68cdt" podStartSLOduration=136.960299533 podStartE2EDuration="2m16.960299533s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.932729346 +0000 UTC m=+158.639090398" watchObservedRunningTime="2025-11-22 08:24:44.960299533 +0000 UTC m=+158.666660585" Nov 22 08:24:44 crc kubenswrapper[4743]: I1122 08:24:44.982405 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j2xkp" podStartSLOduration=136.982384656 podStartE2EDuration="2m16.982384656s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:44.978289013 +0000 UTC m=+158.684650095" watchObservedRunningTime="2025-11-22 08:24:44.982384656 +0000 UTC m=+158.688745708" Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.016442 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.016730 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.516717367 +0000 UTC m=+159.223078419 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.117127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.117339 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.617314116 +0000 UTC m=+159.323675168 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.117420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.117817 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.617804871 +0000 UTC m=+159.324165993 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.132476 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.132552 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.218632 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.219090 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.71907281 +0000 UTC m=+159.425433862 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.219177 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.219614 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.719571235 +0000 UTC m=+159.425932287 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.321216 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.321321 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.821300758 +0000 UTC m=+159.527661810 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.321451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.321772 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.821762462 +0000 UTC m=+159.528123514 (durationBeforeRetry 500ms). 
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.422812 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.422967 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.922945629 +0000 UTC m=+159.629306691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.423072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.423411 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:45.923401233 +0000 UTC m=+159.629762285 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.524408 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.524660 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.024629031 +0000 UTC m=+159.730990083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.525029 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.525461 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.025448925 +0000 UTC m=+159.731809977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.625962 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.626179 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.126121267 +0000 UTC m=+159.832482319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.626531 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.626859 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.126847459 +0000 UTC m=+159.833208511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.651996 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-krx5n"]
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.653193 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.656063 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.665753 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krx5n"]
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.727641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.727827 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78xrd\" (UniqueName: \"kubernetes.io/projected/ffd1a20f-f616-4301-8c3c-546e5e3d349d-kube-api-access-78xrd\") pod \"certified-operators-krx5n\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.727861 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.227829559 +0000 UTC m=+159.934190621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.727926 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.728020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-utilities\") pod \"certified-operators-krx5n\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.728179 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-catalog-content\") pod \"certified-operators-krx5n\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.728207 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.22819202 +0000 UTC m=+159.934553082 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.745080 4743 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-rrn7z container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.745163 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" podUID="28f586ec-7a65-4c1e-9f09-845b812246b0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.829175 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.829326 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.329295815 +0000 UTC m=+160.035656867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.829437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78xrd\" (UniqueName: \"kubernetes.io/projected/ffd1a20f-f616-4301-8c3c-546e5e3d349d-kube-api-access-78xrd\") pod \"certified-operators-krx5n\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.829505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.829605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-utilities\") pod \"certified-operators-krx5n\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.829743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-catalog-content\") pod \"certified-operators-krx5n\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.830315 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.330303415 +0000 UTC m=+160.036664467 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.830986 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-catalog-content\") pod \"certified-operators-krx5n\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.831115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-utilities\") pod \"certified-operators-krx5n\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.850463 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dvccz"]
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.851632 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.855558 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.868757 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dvccz"]
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.871900 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78xrd\" (UniqueName: \"kubernetes.io/projected/ffd1a20f-f616-4301-8c3c-546e5e3d349d-kube-api-access-78xrd\") pod \"certified-operators-krx5n\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.933630 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.933829 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvktx\" (UniqueName: \"kubernetes.io/projected/0ac4d9cc-a76c-4061-b234-91ceaa669957-kube-api-access-gvktx\") pod \"community-operators-dvccz\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:24:45 crc kubenswrapper[4743]: E1122 08:24:45.933872 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.433842563 +0000 UTC m=+160.140203615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.933945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-utilities\") pod \"community-operators-dvccz\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.934009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-catalog-content\") pod \"community-operators-dvccz\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:24:45 crc kubenswrapper[4743]: I1122 08:24:45.966344 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.035526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-utilities\") pod \"community-operators-dvccz\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.035633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-catalog-content\") pod \"community-operators-dvccz\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.035705 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.035740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvktx\" (UniqueName: \"kubernetes.io/projected/0ac4d9cc-a76c-4061-b234-91ceaa669957-kube-api-access-gvktx\") pod \"community-operators-dvccz\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:24:46 crc kubenswrapper[4743]: E1122 08:24:46.036046 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.53603488 +0000 UTC m=+160.242395932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.036115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-utilities\") pod \"community-operators-dvccz\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.036193 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-catalog-content\") pod \"community-operators-dvccz\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.042151 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d27kb"]
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.043276 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d27kb"
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.053249 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d27kb"]
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.066648 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvktx\" (UniqueName: \"kubernetes.io/projected/0ac4d9cc-a76c-4061-b234-91ceaa669957-kube-api-access-gvktx\") pod \"community-operators-dvccz\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.133347 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.133416 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.136594 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.136839 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-utilities\") pod \"certified-operators-d27kb\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " pod="openshift-marketplace/certified-operators-d27kb"
\"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-utilities\") pod \"certified-operators-d27kb\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.136961 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-catalog-content\") pod \"certified-operators-d27kb\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.137009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glrvz\" (UniqueName: \"kubernetes.io/projected/60be236b-0a63-4c71-9e90-3d78e811f956-kube-api-access-glrvz\") pod \"certified-operators-d27kb\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:24:46 crc kubenswrapper[4743]: E1122 08:24:46.137199 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.637185446 +0000 UTC m=+160.343546498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.182902 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dvccz" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.242183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glrvz\" (UniqueName: \"kubernetes.io/projected/60be236b-0a63-4c71-9e90-3d78e811f956-kube-api-access-glrvz\") pod \"certified-operators-d27kb\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.242254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-utilities\") pod \"certified-operators-d27kb\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.242282 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.242323 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-catalog-content\") pod \"certified-operators-d27kb\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.242719 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-catalog-content\") pod \"certified-operators-d27kb\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.244786 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-utilities\") pod \"certified-operators-d27kb\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:24:46 crc kubenswrapper[4743]: E1122 08:24:46.245054 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.745040763 +0000 UTC m=+160.451401815 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.249962 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-djmhx"] Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.250902 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.265275 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djmhx"] Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.270656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glrvz\" (UniqueName: \"kubernetes.io/projected/60be236b-0a63-4c71-9e90-3d78e811f956-kube-api-access-glrvz\") pod \"certified-operators-d27kb\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.320868 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krx5n"] Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.344097 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:46 crc kubenswrapper[4743]: E1122 08:24:46.344375 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.844335923 +0000 UTC m=+160.550696975 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.344704 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.344862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-catalog-content\") pod \"community-operators-djmhx\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.345017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5666p\" (UniqueName: \"kubernetes.io/projected/26d26a53-22e4-4d36-9e75-872a43d2a7cc-kube-api-access-5666p\") pod \"community-operators-djmhx\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: E1122 08:24:46.345100 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.845086505 +0000 UTC m=+160.551447547 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.345236 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-utilities\") pod \"community-operators-djmhx\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.355986 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.446921 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:46 crc kubenswrapper[4743]: E1122 08:24:46.447264 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:46.947232771 +0000 UTC m=+160.653593833 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.447330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-catalog-content\") pod \"community-operators-djmhx\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.447426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5666p\" (UniqueName: \"kubernetes.io/projected/26d26a53-22e4-4d36-9e75-872a43d2a7cc-kube-api-access-5666p\") pod \"community-operators-djmhx\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.447536 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-utilities\") pod \"community-operators-djmhx\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.465702 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dvccz"] Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.547871 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-catalog-content\") pod \"community-operators-djmhx\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.548507 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:46 crc 
kubenswrapper[4743]: E1122 08:24:46.548845 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.04883088 +0000 UTC m=+160.755191932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.548848 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5666p\" (UniqueName: \"kubernetes.io/projected/26d26a53-22e4-4d36-9e75-872a43d2a7cc-kube-api-access-5666p\") pod \"community-operators-djmhx\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.549097 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-utilities\") pod \"community-operators-djmhx\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.570492 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.611972 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d27kb"] Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.649419 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:46 crc kubenswrapper[4743]: E1122 08:24:46.649697 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.149670197 +0000 UTC m=+160.856031239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.752483 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:46 crc kubenswrapper[4743]: E1122 08:24:46.753366 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.253342868 +0000 UTC m=+160.959703920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.763768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krx5n" event={"ID":"ffd1a20f-f616-4301-8c3c-546e5e3d349d","Type":"ContainerStarted","Data":"781e7e7569dbd368989d36674a1dd96a06d8a8e941a2edbbfae68c286998af7b"} Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.770147 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvccz" event={"ID":"0ac4d9cc-a76c-4061-b234-91ceaa669957","Type":"ContainerStarted","Data":"4e8ac2ab254185494eb31c963fac4df31558d86da7007f788477a6c029f985be"} Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.774419 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d27kb" event={"ID":"60be236b-0a63-4c71-9e90-3d78e811f956","Type":"ContainerStarted","Data":"4b64437fb424b2f06d11142b99a9045e6abcb9a292591f5737f8bec623b37635"} Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.855100 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:46 crc kubenswrapper[4743]: E1122 08:24:46.856250 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.356231216 +0000 UTC m=+161.062592278 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.951556 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-djmhx"] Nov 22 08:24:46 crc kubenswrapper[4743]: I1122 08:24:46.956827 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:46 crc kubenswrapper[4743]: E1122 08:24:46.957213 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.457199887 +0000 UTC m=+161.163560939 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.058212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.058771 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.558739584 +0000 UTC m=+161.265100646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.140514 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:24:47 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:24:47 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:24:47 crc kubenswrapper[4743]: healthz check failed Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.141382 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.159399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.159752 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.659737866 +0000 UTC m=+161.366098918 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.260428 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.260661 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.760622954 +0000 UTC m=+161.466984006 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.262125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.262549 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.762529381 +0000 UTC m=+161.468890433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.363269 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.363511 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.86345903 +0000 UTC m=+161.569820082 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.363613 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.364460 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.86444809 +0000 UTC m=+161.570809142 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.465326 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.465521 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.965482002 +0000 UTC m=+161.671843064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.465616 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.466033 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:47.966020028 +0000 UTC m=+161.672381260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.567150 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.567486 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:48.067468283 +0000 UTC m=+161.773829335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.668862 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.669234 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:48.169218687 +0000 UTC m=+161.875579739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.770291 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.771829 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:48.271805426 +0000 UTC m=+161.978166478 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.779544 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djmhx" event={"ID":"26d26a53-22e4-4d36-9e75-872a43d2a7cc","Type":"ContainerStarted","Data":"c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30"} Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.779617 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djmhx" event={"ID":"26d26a53-22e4-4d36-9e75-872a43d2a7cc","Type":"ContainerStarted","Data":"06c35630f088395ea4d6e24d9aba4bf06c05d1dd24764db44e4000aecc732005"} Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.781646 4743 generic.go:334] "Generic (PLEG): container finished" podID="60be236b-0a63-4c71-9e90-3d78e811f956" containerID="cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079" exitCode=0 Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.781703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d27kb" event={"ID":"60be236b-0a63-4c71-9e90-3d78e811f956","Type":"ContainerDied","Data":"cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079"} Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.784078 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.784648 4743 generic.go:334] "Generic (PLEG): container finished" podID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerID="1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48" exitCode=0 Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.784811 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krx5n" event={"ID":"ffd1a20f-f616-4301-8c3c-546e5e3d349d","Type":"ContainerDied","Data":"1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48"} Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.786673 4743 generic.go:334] "Generic (PLEG): container finished" podID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerID="8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc" exitCode=0 Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.786764 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvccz" event={"ID":"0ac4d9cc-a76c-4061-b234-91ceaa669957","Type":"ContainerDied","Data":"8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc"} Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.788612 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-858dv" event={"ID":"ca1bf802-2f8f-4de6-9d36-d0b3e6440865","Type":"ContainerStarted","Data":"4fa7af14d16c7904f27c2bdfc02623370884e1a9286535392123f2a7758a5b2c"} Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.853764 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqk4"] Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.855011 4743 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.857985 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.867447 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqk4"] Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.871820 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.872283 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:48.372266631 +0000 UTC m=+162.078627683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.946881 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.947727 4743 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-wtpmb container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.947784 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" podUID="7352b0d1-4af7-49c9-8029-5a97c3cdf450" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.947871 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.948040 4743 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-wtpmb container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.948059 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb" podUID="7352b0d1-4af7-49c9-8029-5a97c3cdf450" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.953053 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.953253 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.972885 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.973192 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.973655 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-utilities\") pod \"redhat-marketplace-wgqk4\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.973713 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-catalog-content\") pod \"redhat-marketplace-wgqk4\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:47 crc kubenswrapper[4743]: I1122 08:24:47.973736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhnq9\" (UniqueName: \"kubernetes.io/projected/50a7a114-710f-439d-8c79-58a4ba712cda-kube-api-access-nhnq9\") pod \"redhat-marketplace-wgqk4\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:47 crc kubenswrapper[4743]: E1122 08:24:47.973863 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:48.47384038 +0000 UTC m=+162.180201432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.074669 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.074714 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ad768e0-3532-44b1-a3fb-d5db53e76bdf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.074744 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ad768e0-3532-44b1-a3fb-d5db53e76bdf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.074768 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-utilities\") pod \"redhat-marketplace-wgqk4\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.074799 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-catalog-content\") pod \"redhat-marketplace-wgqk4\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.074815 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhnq9\" (UniqueName: \"kubernetes.io/projected/50a7a114-710f-439d-8c79-58a4ba712cda-kube-api-access-nhnq9\") pod \"redhat-marketplace-wgqk4\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.075234 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:48.575210052 +0000 UTC m=+162.281571104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.075435 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-utilities\") pod \"redhat-marketplace-wgqk4\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.075519 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-catalog-content\") pod \"redhat-marketplace-wgqk4\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.148569 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhnq9\" (UniqueName: \"kubernetes.io/projected/50a7a114-710f-439d-8c79-58a4ba712cda-kube-api-access-nhnq9\") pod \"redhat-marketplace-wgqk4\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.151721 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:24:48 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:24:48 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:24:48 crc kubenswrapper[4743]: healthz check failed Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.151779 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.169897 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.177191 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.177516 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ad768e0-3532-44b1-a3fb-d5db53e76bdf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.177663 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ad768e0-3532-44b1-a3fb-d5db53e76bdf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.177738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ad768e0-3532-44b1-a3fb-d5db53e76bdf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.177822 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:48.677804291 +0000 UTC m=+162.384165343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.223014 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ad768e0-3532-44b1-a3fb-d5db53e76bdf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.257477 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9dbk7"] Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.258458 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.273560 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.277372 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dbk7"] Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.279785 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.280071 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:48.78006019 +0000 UTC m=+162.486421232 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.360229 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.382105 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.382305 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmz8d\" (UniqueName: \"kubernetes.io/projected/812538df-82a4-49d5-b50e-b99315f995ca-kube-api-access-bmz8d\") pod \"redhat-marketplace-9dbk7\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.382332 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-utilities\") pod \"redhat-marketplace-9dbk7\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.382359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-catalog-content\") pod \"redhat-marketplace-9dbk7\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.382509 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:48.882490855 +0000 UTC m=+162.588851907 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.484628 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.484674 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmz8d\" (UniqueName: \"kubernetes.io/projected/812538df-82a4-49d5-b50e-b99315f995ca-kube-api-access-bmz8d\") pod \"redhat-marketplace-9dbk7\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.484701 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-utilities\") pod \"redhat-marketplace-9dbk7\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.484745 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-catalog-content\") pod \"redhat-marketplace-9dbk7\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.486055 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:48.986042471 +0000 UTC m=+162.692403523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.486660 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-utilities\") pod \"redhat-marketplace-9dbk7\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.487105 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-catalog-content\") pod \"redhat-marketplace-9dbk7\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.520868 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmz8d\" (UniqueName: \"kubernetes.io/projected/812538df-82a4-49d5-b50e-b99315f995ca-kube-api-access-bmz8d\") pod \"redhat-marketplace-9dbk7\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.577689 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.587712 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.588037 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.088021142 +0000 UTC m=+162.794382194 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.637135 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqk4"] Nov 22 08:24:48 crc kubenswrapper[4743]: W1122 08:24:48.648541 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50a7a114_710f_439d_8c79_58a4ba712cda.slice/crio-5ed74c1f6980f84ee651657f07a21f4f75dbe6f4397f326d113b9abf7dedc841 WatchSource:0}: Error finding container 5ed74c1f6980f84ee651657f07a21f4f75dbe6f4397f326d113b9abf7dedc841: Status 404 returned error can't find the container with id 5ed74c1f6980f84ee651657f07a21f4f75dbe6f4397f326d113b9abf7dedc841 Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.689448 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.689738 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.189722945 +0000 UTC m=+162.896083997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.746475 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.790935 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.791354 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.291324224 +0000 UTC m=+162.997685276 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.791970 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.792395 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.292378676 +0000 UTC m=+162.998739728 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.803381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ad768e0-3532-44b1-a3fb-d5db53e76bdf","Type":"ContainerStarted","Data":"3148f67a38a38cca223a1e8cd340eec358c49594c171a8a518f3752e4202acf8"} Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.804729 4743 generic.go:334] "Generic (PLEG): container finished" podID="5564388b-e6dd-409f-a137-b34700967f4a" containerID="9aadc3a0986e123bcf0a362e1c9ed8598762bf3912aa6ade14a91b173e16fc04" exitCode=0 Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.804784 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" event={"ID":"5564388b-e6dd-409f-a137-b34700967f4a","Type":"ContainerDied","Data":"9aadc3a0986e123bcf0a362e1c9ed8598762bf3912aa6ade14a91b173e16fc04"} Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.805993 4743 generic.go:334] "Generic (PLEG): container finished" podID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerID="c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30" exitCode=0 Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.806039 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djmhx" event={"ID":"26d26a53-22e4-4d36-9e75-872a43d2a7cc","Type":"ContainerDied","Data":"c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30"} Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.820145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqk4" event={"ID":"50a7a114-710f-439d-8c79-58a4ba712cda","Type":"ContainerStarted","Data":"5ed74c1f6980f84ee651657f07a21f4f75dbe6f4397f326d113b9abf7dedc841"} Nov 22 
08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.860751 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dlhns"] Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.862523 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.865751 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.868552 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dlhns"] Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.893879 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.894457 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.394430569 +0000 UTC m=+163.100791621 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.977195 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.977610 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.977230 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.977690 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.995650 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-utilities\") pod \"redhat-operators-dlhns\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.995758 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-catalog-content\") pod \"redhat-operators-dlhns\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.995789 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:48 crc kubenswrapper[4743]: I1122 08:24:48.995810 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpx4c\" (UniqueName: \"kubernetes.io/projected/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-kube-api-access-vpx4c\") pod \"redhat-operators-dlhns\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:48 crc kubenswrapper[4743]: E1122 08:24:48.996085 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.496073189 +0000 UTC m=+163.202434241 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.066256 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dbk7"] Nov 22 08:24:49 crc kubenswrapper[4743]: W1122 08:24:49.080534 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod812538df_82a4_49d5_b50e_b99315f995ca.slice/crio-84949486679eb90221803590c54ba157c606785204fad779afa5829b6be0e676 WatchSource:0}: Error finding container 84949486679eb90221803590c54ba157c606785204fad779afa5829b6be0e676: Status 404 returned error can't find the container with id 84949486679eb90221803590c54ba157c606785204fad779afa5829b6be0e676 Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.084117 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.086985 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.093789 4743 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gwf6h container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.093838 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" podUID="c7c71cb3-54e2-471f-a91f-50c146c4e3c8" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.097075 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.097207 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.597188804 +0000 UTC m=+163.303549856 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.097259 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-utilities\") pod \"redhat-operators-dlhns\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.097367 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-catalog-content\") pod \"redhat-operators-dlhns\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.097403 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpx4c\" (UniqueName: \"kubernetes.io/projected/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-kube-api-access-vpx4c\") pod \"redhat-operators-dlhns\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.097430 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.097724 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.59771435 +0000 UTC m=+163.304075402 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.097775 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-utilities\") pod \"redhat-operators-dlhns\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.098004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-catalog-content\") pod \"redhat-operators-dlhns\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.121047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpx4c\" (UniqueName: \"kubernetes.io/projected/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-kube-api-access-vpx4c\") pod \"redhat-operators-dlhns\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.129061 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.135041 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:24:49 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:24:49 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:24:49 crc kubenswrapper[4743]: healthz check failed Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.135104 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.197151 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.198106 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.198594 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-11-22 08:24:49.698543766 +0000 UTC m=+163.404904818 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.201943 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.202327 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.702318619 +0000 UTC m=+163.408679671 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.216901 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.219141 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.219181 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.220724 4743 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-67bbq container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.16:8443/livez\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.220750 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" podUID="b72706e5-e53f-4c1f-81fa-6b850a062076" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.16:8443/livez\": dial tcp 10.217.0.16:8443: connect: connection refused" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.263322 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v4w46"] Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.265105 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.286025 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-rmjqq" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.288895 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v4w46"] Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.302651 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.304131 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.804113115 +0000 UTC m=+163.510474167 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.404940 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-catalog-content\") pod \"redhat-operators-v4w46\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.405058 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-utilities\") pod \"redhat-operators-v4w46\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.405215 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.405240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvs72\" (UniqueName: \"kubernetes.io/projected/bdaf46ea-41b9-4db9-8b88-43f5a406a910-kube-api-access-qvs72\") pod \"redhat-operators-v4w46\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.406877 4743 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:49.906864578 +0000 UTC m=+163.613225630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.507126 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.507456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvs72\" (UniqueName: \"kubernetes.io/projected/bdaf46ea-41b9-4db9-8b88-43f5a406a910-kube-api-access-qvs72\") pod \"redhat-operators-v4w46\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.507520 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-catalog-content\") pod \"redhat-operators-v4w46\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.507551 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-utilities\") pod \"redhat-operators-v4w46\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.508285 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-utilities\") pod \"redhat-operators-v4w46\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.508410 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.008389435 +0000 UTC m=+163.714750487 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.508964 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-catalog-content\") pod \"redhat-operators-v4w46\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.509222 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dlhns"] Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.509293 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-r6rj6" Nov 22 08:24:49 crc kubenswrapper[4743]: W1122 08:24:49.523778 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba69292_a0c6_4ab8_8fba_1144f4d1e88b.slice/crio-17673f772362d64fe98f44c37b0213c5e59896d03a5282f5e1fe7b8ddaabf0e1 WatchSource:0}: Error finding container 17673f772362d64fe98f44c37b0213c5e59896d03a5282f5e1fe7b8ddaabf0e1: Status 404 returned error can't find the container with id 17673f772362d64fe98f44c37b0213c5e59896d03a5282f5e1fe7b8ddaabf0e1 Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.526251 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvs72\" (UniqueName: \"kubernetes.io/projected/bdaf46ea-41b9-4db9-8b88-43f5a406a910-kube-api-access-qvs72\") pod \"redhat-operators-v4w46\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.608691 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.608987 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.108974794 +0000 UTC m=+163.815335846 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.613316 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.653784 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.654875 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.665830 4743 patch_prober.go:28] interesting pod/console-f9d7485db-hhpxp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.665889 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hhpxp" podUID="bead015e-e8e8-44f2-8dae-41047cd66706" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.670950 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.709519 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.711115 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.211095449 +0000 UTC m=+163.917456501 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.811450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.812040 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.312027518 +0000 UTC m=+164.018388570 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.833910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ad768e0-3532-44b1-a3fb-d5db53e76bdf","Type":"ContainerStarted","Data":"2248c48807098e74a988ba35b8249cf452c6278e0ae4d700603ab2266c6b3e42"} Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.840936 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhns" event={"ID":"cba69292-a0c6-4ab8-8fba-1144f4d1e88b","Type":"ContainerStarted","Data":"17673f772362d64fe98f44c37b0213c5e59896d03a5282f5e1fe7b8ddaabf0e1"} Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.859496 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dbk7" event={"ID":"812538df-82a4-49d5-b50e-b99315f995ca","Type":"ContainerStarted","Data":"84949486679eb90221803590c54ba157c606785204fad779afa5829b6be0e676"} Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.871958 4743 generic.go:334] "Generic (PLEG): container finished" podID="50a7a114-710f-439d-8c79-58a4ba712cda" containerID="d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd" exitCode=0 Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.872037 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqk4" event={"ID":"50a7a114-710f-439d-8c79-58a4ba712cda","Type":"ContainerDied","Data":"d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd"} Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.891491 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v4w46"] Nov 22 08:24:49 crc kubenswrapper[4743]: I1122 08:24:49.914985 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:49 crc kubenswrapper[4743]: E1122 08:24:49.915260 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.415246516 +0000 UTC m=+164.121607568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.017737 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.018870 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.518855116 +0000 UTC m=+164.225216168 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.052280 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-4kn72" Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.062064 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.071812 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-hn6c7" Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.118647 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjx28\" (UniqueName: \"kubernetes.io/projected/5564388b-e6dd-409f-a137-b34700967f4a-kube-api-access-fjx28\") pod \"5564388b-e6dd-409f-a137-b34700967f4a\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.118902 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.119094 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5564388b-e6dd-409f-a137-b34700967f4a-config-volume\") pod \"5564388b-e6dd-409f-a137-b34700967f4a\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.119155 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5564388b-e6dd-409f-a137-b34700967f4a-secret-volume\") pod \"5564388b-e6dd-409f-a137-b34700967f4a\" (UID: \"5564388b-e6dd-409f-a137-b34700967f4a\") " Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.120174 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5564388b-e6dd-409f-a137-b34700967f4a-config-volume" (OuterVolumeSpecName: "config-volume") pod "5564388b-e6dd-409f-a137-b34700967f4a" (UID: "5564388b-e6dd-409f-a137-b34700967f4a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.120282 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.620259929 +0000 UTC m=+164.326621081 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.131186 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-v99cs" Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.133917 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:24:50 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:24:50 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:24:50 crc kubenswrapper[4743]: healthz check failed Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.133989 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.137066 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5564388b-e6dd-409f-a137-b34700967f4a-kube-api-access-fjx28" (OuterVolumeSpecName: "kube-api-access-fjx28") pod "5564388b-e6dd-409f-a137-b34700967f4a" (UID: "5564388b-e6dd-409f-a137-b34700967f4a"). InnerVolumeSpecName "kube-api-access-fjx28". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.154050 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5564388b-e6dd-409f-a137-b34700967f4a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5564388b-e6dd-409f-a137-b34700967f4a" (UID: "5564388b-e6dd-409f-a137-b34700967f4a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.220663 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.220721 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5564388b-e6dd-409f-a137-b34700967f4a-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.220735 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjx28\" (UniqueName: \"kubernetes.io/projected/5564388b-e6dd-409f-a137-b34700967f4a-kube-api-access-fjx28\") on node \"crc\" DevicePath \"\"" Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.220743 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5564388b-e6dd-409f-a137-b34700967f4a-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.221029 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.720987572 +0000 UTC m=+164.427348614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.322281 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.322423 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.822397356 +0000 UTC m=+164.528758398 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.322473 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.322843 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.822825209 +0000 UTC m=+164.529186261 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.438498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.438726 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.938691966 +0000 UTC m=+164.645053018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.439303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.441324 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:50.941296214 +0000 UTC m=+164.647657446 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.542113 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.542543 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.042523022 +0000 UTC m=+164.748884084 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.644229 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.644602 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.144588396 +0000 UTC m=+164.850949448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.745480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.745665 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.245640279 +0000 UTC m=+164.952001331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.745691 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.746036 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.24602557 +0000 UTC m=+164.952386622 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.846495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.846692 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.346673821 +0000 UTC m=+165.053034883 (durationBeforeRetry 500ms). 
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.847156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.847527 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.347508186 +0000 UTC m=+165.053869238 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.880159 4743 generic.go:334] "Generic (PLEG): container finished" podID="812538df-82a4-49d5-b50e-b99315f995ca" containerID="5c0ae3e67df091bc50b00a0142160a924deedfa7a7ed56ed7c050c31d2267190" exitCode=0
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.880690 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dbk7" event={"ID":"812538df-82a4-49d5-b50e-b99315f995ca","Type":"ContainerDied","Data":"5c0ae3e67df091bc50b00a0142160a924deedfa7a7ed56ed7c050c31d2267190"}
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.884745 4743 generic.go:334] "Generic (PLEG): container finished" podID="3ad768e0-3532-44b1-a3fb-d5db53e76bdf" containerID="2248c48807098e74a988ba35b8249cf452c6278e0ae4d700603ab2266c6b3e42" exitCode=0
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.884819 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ad768e0-3532-44b1-a3fb-d5db53e76bdf","Type":"ContainerDied","Data":"2248c48807098e74a988ba35b8249cf452c6278e0ae4d700603ab2266c6b3e42"}
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.886053 4743 generic.go:334] "Generic (PLEG): container finished" podID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerID="f35c9152254195699523489d96332a3ec727ffde12b11a8d084e4704c0f8c619" exitCode=0
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.886086 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4w46" event={"ID":"bdaf46ea-41b9-4db9-8b88-43f5a406a910","Type":"ContainerDied","Data":"f35c9152254195699523489d96332a3ec727ffde12b11a8d084e4704c0f8c619"}
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.886129 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4w46" event={"ID":"bdaf46ea-41b9-4db9-8b88-43f5a406a910","Type":"ContainerStarted","Data":"48dba3bd780286d43ec600e3e9e85a8bd99dca953592f252d877fe13c02d1a26"}
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.888655 4743 generic.go:334] "Generic (PLEG): container finished" podID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerID="4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f" exitCode=0
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.888716 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhns" event={"ID":"cba69292-a0c6-4ab8-8fba-1144f4d1e88b","Type":"ContainerDied","Data":"4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f"}
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.924049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt" event={"ID":"5564388b-e6dd-409f-a137-b34700967f4a","Type":"ContainerDied","Data":"1286b581b3035331b4019abd3aa1c10c47b76fbb557fdebacd759baa7e89adac"}
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.924090 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1286b581b3035331b4019abd3aa1c10c47b76fbb557fdebacd759baa7e89adac"
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.924182 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt"
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.947251 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-wtpmb"
Nov 22 08:24:50 crc kubenswrapper[4743]: I1122 08:24:50.948209 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:50 crc kubenswrapper[4743]: E1122 08:24:50.948971 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.448952881 +0000 UTC m=+165.155313933 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.049784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.050104 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.550090406 +0000 UTC m=+165.256451458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.137268 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 08:24:51 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld
Nov 22 08:24:51 crc kubenswrapper[4743]: [+]process-running ok
Nov 22 08:24:51 crc kubenswrapper[4743]: healthz check failed
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.137373 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.150954 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.151179 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.651149579 +0000 UTC m=+165.357510651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.151324 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.151665 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.651653254 +0000 UTC m=+165.358014306 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.229149 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.233510 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5564388b-e6dd-409f-a137-b34700967f4a" containerName="collect-profiles"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.233537 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5564388b-e6dd-409f-a137-b34700967f4a" containerName="collect-profiles"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.233673 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5564388b-e6dd-409f-a137-b34700967f4a" containerName="collect-profiles"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.234002 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.234080 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.236046 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.236439 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.252318 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.252789 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.752771719 +0000 UTC m=+165.459132771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.353809 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3317794e-757c-471c-ac8b-adad390e622d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3317794e-757c-471c-ac8b-adad390e622d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.353914 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.354039 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3317794e-757c-471c-ac8b-adad390e622d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3317794e-757c-471c-ac8b-adad390e622d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.354531 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.854497052 +0000 UTC m=+165.560858104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.485850 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.486023 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.985993219 +0000 UTC m=+165.692354271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.486161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3317794e-757c-471c-ac8b-adad390e622d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3317794e-757c-471c-ac8b-adad390e622d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.486205 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.486267 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3317794e-757c-471c-ac8b-adad390e622d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3317794e-757c-471c-ac8b-adad390e622d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.486281 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3317794e-757c-471c-ac8b-adad390e622d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"3317794e-757c-471c-ac8b-adad390e622d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.486539 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:51.986528675 +0000 UTC m=+165.692889727 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.511052 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3317794e-757c-471c-ac8b-adad390e622d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"3317794e-757c-471c-ac8b-adad390e622d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.552924 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.586674 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.587008 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.08696732 +0000 UTC m=+165.793328362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.587080 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.587792 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.087773124 +0000 UTC m=+165.794134336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
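The E1122 nestedpendingoperations.go:348 lines show the retry gating at work: a failed volume operation may not start again until its backoff deadline passes (here durationBeforeRetry 500ms), while the volume reconciler wakes up far more often, so most wake-ups are rejected with "No retries permitted until ...". A small Go sketch of that gate under those assumptions (illustrative, not the kubelet's actual implementation):

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // backoffGate mimics the "No retries permitted until ..." behaviour:
    // an operation that failed may not be retried before its deadline.
    type backoffGate struct {
        lastError  error
        retryAfter time.Time
        step       time.Duration
    }

    func (g *backoffGate) tryStart(now time.Time) error {
        if g.lastError != nil && now.Before(g.retryAfter) {
            return fmt.Errorf("no retries permitted until %s (durationBeforeRetry %s)",
                g.retryAfter.Format(time.RFC3339Nano), g.step)
        }
        return nil
    }

    func (g *backoffGate) recordError(now time.Time, err error) {
        g.lastError = err
        g.retryAfter = now.Add(g.step)
    }

    func main() {
        gate := &backoffGate{step: 500 * time.Millisecond}
        mountErr := errors.New("driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers")

        // The reconciler re-queues much more often than the backoff window,
        // so most wake-ups are rejected by the gate, as in the log above.
        for i := 0; i < 4; i++ {
            now := time.Now()
            if err := gate.tryStart(now); err != nil {
                fmt.Println("E:", err)
            } else {
                fmt.Println("I: attempting operation; it fails again")
                gate.recordError(now, mountErr)
            }
            time.Sleep(100 * time.Millisecond)
        }
    }

This is why the same "operationExecutor ... started" / "Operation ... failed" pair recurs several times per second without the underlying error ever changing.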
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.689345 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.690041 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.190008312 +0000 UTC m=+165.896369364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.792100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.792588 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4"
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.793046 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.293022654 +0000 UTC m=+165.999383706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.797282 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8426c723-9bfa-4856-b445-b01251484a35-metrics-certs\") pod \"network-metrics-daemon-4vkc4\" (UID: \"8426c723-9bfa-4856-b445-b01251484a35\") " pod="openshift-multus/network-metrics-daemon-4vkc4"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.888307 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4vkc4"
Nov 22 08:24:51 crc kubenswrapper[4743]: I1122 08:24:51.894357 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:51 crc kubenswrapper[4743]: E1122 08:24:51.894743 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.394723636 +0000 UTC m=+166.101084688 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:51.999313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.002261 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.502230032 +0000 UTC m=+166.208591084 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.086362 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Nov 22 08:24:52 crc kubenswrapper[4743]: W1122 08:24:52.096373 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3317794e_757c_471c_ac8b_adad390e622d.slice/crio-77bf5fe83c1af840892a30aa2f1e2c9467cd01fdee8f62d6e9e6da4cb9a0df08 WatchSource:0}: Error finding container 77bf5fe83c1af840892a30aa2f1e2c9467cd01fdee8f62d6e9e6da4cb9a0df08: Status 404 returned error can't find the container with id 77bf5fe83c1af840892a30aa2f1e2c9467cd01fdee8f62d6e9e6da4cb9a0df08
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.099792 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.100241 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.600216053 +0000 UTC m=+166.306577095 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.135395 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 08:24:52 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld
Nov 22 08:24:52 crc kubenswrapper[4743]: [+]process-running ok
Nov 22 08:24:52 crc kubenswrapper[4743]: healthz check failed
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.135477 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.204634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.205277 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.705233425 +0000 UTC m=+166.411594477 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.243065 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
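The router startup-probe output above is the Kubernetes healthz aggregated-check format: one [+]/[-] line per named check, a trailing verdict, and HTTP 500 when any check fails; the prober logs the first body line after start-of-body= and the remaining lines arrive as the bare continuation lines seen here. A hedged Go sketch that reproduces the shape (check names taken from the log, logic illustrative only):

    package main

    import (
        "fmt"
        "strings"
    )

    // healthzBody renders the aggregated-check format seen in the probe
    // output: "[-]name failed: reason withheld" / "[+]name ok" plus a
    // final verdict line.
    func healthzBody(checks map[string]bool) string {
        var b strings.Builder
        failed := false
        for _, name := range []string{"backend-http", "has-synced", "process-running"} {
            if checks[name] {
                fmt.Fprintf(&b, "[+]%s ok\n", name)
            } else {
                failed = true
                fmt.Fprintf(&b, "[-]%s failed: reason withheld\n", name)
            }
        }
        if failed {
            b.WriteString("healthz check failed\n") // served with HTTP 500
        } else {
            b.WriteString("healthz check passed\n")
        }
        return b.String()
    }

    func main() {
        // Matches the log: only process-running is passing.
        fmt.Print(healthzBody(map[string]bool{"process-running": true}))
    }

The router here is simply not ready yet (backend-http and has-synced still failing), which is unrelated to the CSI retries interleaved around it.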
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.292028 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4vkc4"]
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.306910 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.307134 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.807104292 +0000 UTC m=+166.513465354 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.307267 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.307649 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.807636178 +0000 UTC m=+166.513997240 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: W1122 08:24:52.322553 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8426c723_9bfa_4856_b445_b01251484a35.slice/crio-62d0a76afebbe8db509b9c70a914c69a4dc747b269d3510fc876b7b62a090433 WatchSource:0}: Error finding container 62d0a76afebbe8db509b9c70a914c69a4dc747b269d3510fc876b7b62a090433: Status 404 returned error can't find the container with id 62d0a76afebbe8db509b9c70a914c69a4dc747b269d3510fc876b7b62a090433
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.408465 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kubelet-dir\") pod \"3ad768e0-3532-44b1-a3fb-d5db53e76bdf\" (UID: \"3ad768e0-3532-44b1-a3fb-d5db53e76bdf\") "
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.408618 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.408606 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3ad768e0-3532-44b1-a3fb-d5db53e76bdf" (UID: "3ad768e0-3532-44b1-a3fb-d5db53e76bdf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.408645 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kube-api-access\") pod \"3ad768e0-3532-44b1-a3fb-d5db53e76bdf\" (UID: \"3ad768e0-3532-44b1-a3fb-d5db53e76bdf\") "
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.408746 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.908721842 +0000 UTC m=+166.615082924 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.408832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.408885 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kubelet-dir\") on node \"crc\" DevicePath \"\""
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.409176 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:52.909163626 +0000 UTC m=+166.615524678 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.417883 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3ad768e0-3532-44b1-a3fb-d5db53e76bdf" (UID: "3ad768e0-3532-44b1-a3fb-d5db53e76bdf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.515070 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.515372 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ad768e0-3532-44b1-a3fb-d5db53e76bdf-kube-api-access\") on node \"crc\" DevicePath \"\""
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.515445 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.015429215 +0000 UTC m=+166.721790267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.618626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.618999 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.118986803 +0000 UTC m=+166.825347855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.720279 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.720817 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.220801109 +0000 UTC m=+166.927162161 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.823881 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.824448 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.3244371 +0000 UTC m=+167.030798152 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.928372 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.928518 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.428497303 +0000 UTC m=+167.134858355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.928751 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:52 crc kubenswrapper[4743]: E1122 08:24:52.931105 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.43108636 +0000 UTC m=+167.137447412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.970767 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" event={"ID":"8426c723-9bfa-4856-b445-b01251484a35","Type":"ContainerStarted","Data":"bb5353c73caad2a1e013dd9232bc514d0a3de95fe7343842f1dcc413b061ccdb"}
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.970821 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" event={"ID":"8426c723-9bfa-4856-b445-b01251484a35","Type":"ContainerStarted","Data":"62d0a76afebbe8db509b9c70a914c69a4dc747b269d3510fc876b7b62a090433"}
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.974449 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3317794e-757c-471c-ac8b-adad390e622d","Type":"ContainerStarted","Data":"c0f05882b2b3472bcff1f5e28bd2a2ca5d08eb82906b9afecf4c42cfd0f4c703"}
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.974494 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3317794e-757c-471c-ac8b-adad390e622d","Type":"ContainerStarted","Data":"77bf5fe83c1af840892a30aa2f1e2c9467cd01fdee8f62d6e9e6da4cb9a0df08"}
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.976905 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ad768e0-3532-44b1-a3fb-d5db53e76bdf","Type":"ContainerDied","Data":"3148f67a38a38cca223a1e8cd340eec358c49594c171a8a518f3752e4202acf8"}
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.976935 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3148f67a38a38cca223a1e8cd340eec358c49594c171a8a518f3752e4202acf8"
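The "Generic (PLEG): container finished" / "SyncLoop (PLEG): event for pod" pairs throughout this stretch come from the pod lifecycle event generator: a periodic relist diffs old and new container states and emits ContainerStarted/ContainerDied events, which the sync loop then handles. A compact Go sketch of that diffing idea (types and names illustrative only, not the kubelet's actual PLEG):

    package main

    import "fmt"

    type containerState int

    const (
        running containerState = iota
        exited
    )

    type plegEvent struct {
        ID   string // pod UID
        Type string // "ContainerStarted" or "ContainerDied"
        Data string // container ID
    }

    // relist mimics the PLEG's generic relist: diff the previous and
    // current container states and emit one event per transition seen.
    func relist(before, after map[string]containerState, podID string) []plegEvent {
        var events []plegEvent
        for id, s := range after {
            prev, seen := before[id]
            switch {
            case !seen && s == running:
                events = append(events, plegEvent{podID, "ContainerStarted", id})
            case seen && prev == running && s == exited:
                events = append(events, plegEvent{podID, "ContainerDied", id})
            }
        }
        return events
    }

    func main() {
        before := map[string]containerState{"3148f67a": running}
        after := map[string]containerState{"3148f67a": exited}
        for _, ev := range relist(before, after, "3ad768e0-3532-44b1-a3fb-d5db53e76bdf") {
            fmt.Printf("SyncLoop (PLEG): event for pod %s: %+v\n", ev.ID, ev)
        }
    }

The following "Container not found in pod's containers" and "No ready sandbox" lines are the normal cleanup/restart path after such a ContainerDied event for a completed revision-pruner pod.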
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.977012 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Nov 22 08:24:52 crc kubenswrapper[4743]: I1122 08:24:52.995102 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.995080671 podStartE2EDuration="1.995080671s" podCreationTimestamp="2025-11-22 08:24:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:52.991102352 +0000 UTC m=+166.697463404" watchObservedRunningTime="2025-11-22 08:24:52.995080671 +0000 UTC m=+166.701441723"
Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.034067 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:53 crc kubenswrapper[4743]: E1122 08:24:53.034919 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.534890886 +0000 UTC m=+167.241251978 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.136434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:53 crc kubenswrapper[4743]: E1122 08:24:53.136900 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.636885397 +0000 UTC m=+167.343246449 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.138236 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Nov 22 08:24:53 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld
Nov 22 08:24:53 crc kubenswrapper[4743]: [+]process-running ok
Nov 22 08:24:53 crc kubenswrapper[4743]: healthz check failed
Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.138258 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.237940 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:53 crc kubenswrapper[4743]: E1122 08:24:53.238539 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.738502957 +0000 UTC m=+167.444864009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.339684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:53 crc kubenswrapper[4743]: E1122 08:24:53.340103 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.840084726 +0000 UTC m=+167.546445778 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.440948 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:53 crc kubenswrapper[4743]: E1122 08:24:53.441249 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:53.941234492 +0000 UTC m=+167.647595544 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.542711 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:53 crc kubenswrapper[4743]: E1122 08:24:53.543534 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:54.043511861 +0000 UTC m=+167.749872913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.644242 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Nov 22 08:24:53 crc kubenswrapper[4743]: E1122 08:24:53.645126 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:54.145109781 +0000 UTC m=+167.851470833 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.749564 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz"
Nov 22 08:24:53 crc kubenswrapper[4743]: E1122 08:24:53.754867 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:54.254846484 +0000 UTC m=+167.961207546 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.858937 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:53 crc kubenswrapper[4743]: E1122 08:24:53.859507 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:54.359484435 +0000 UTC m=+168.065845487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.959908 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:53 crc kubenswrapper[4743]: E1122 08:24:53.960171 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:54.460160816 +0000 UTC m=+168.166521868 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.994340 4743 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Nov 22 08:24:53 crc kubenswrapper[4743]: I1122 08:24:53.994470 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4vkc4" event={"ID":"8426c723-9bfa-4856-b445-b01251484a35","Type":"ContainerStarted","Data":"dadabfa684de91ce3fa3fb3827df86919f268cb80a5e51441b42a38027f7e194"} Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.007199 4743 generic.go:334] "Generic (PLEG): container finished" podID="3317794e-757c-471c-ac8b-adad390e622d" containerID="c0f05882b2b3472bcff1f5e28bd2a2ca5d08eb82906b9afecf4c42cfd0f4c703" exitCode=0 Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.007281 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3317794e-757c-471c-ac8b-adad390e622d","Type":"ContainerDied","Data":"c0f05882b2b3472bcff1f5e28bd2a2ca5d08eb82906b9afecf4c42cfd0f4c703"} Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.012404 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4vkc4" podStartSLOduration=146.012387734 podStartE2EDuration="2m26.012387734s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:54.008353123 +0000 UTC m=+167.714714175" watchObservedRunningTime="2025-11-22 08:24:54.012387734 +0000 UTC m=+167.718748786" Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.061043 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:54 crc kubenswrapper[4743]: E1122 08:24:54.062052 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:54.562036334 +0000 UTC m=+168.268397386 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.091724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-858dv" event={"ID":"ca1bf802-2f8f-4de6-9d36-d0b3e6440865","Type":"ContainerStarted","Data":"013c84d6654c1252cea12f54c5fffc8f8f3172b7582400892ae37edc3c696cb2"} Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.094492 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.100178 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gwf6h" Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.137210 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:24:54 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:24:54 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:24:54 crc kubenswrapper[4743]: healthz check failed Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.137257 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.161946 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:54 crc kubenswrapper[4743]: E1122 08:24:54.164259 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:54.664243711 +0000 UTC m=+168.370604763 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.231717 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.247896 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-67bbq" Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.263021 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:54 crc kubenswrapper[4743]: E1122 08:24:54.263154 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-11-22 08:24:54.763128419 +0000 UTC m=+168.469489471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.263463 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:54 crc kubenswrapper[4743]: E1122 08:24:54.265684 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-11-22 08:24:54.765664515 +0000 UTC m=+168.472025647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-592fz" (UID: "63019b95-c8f5-4782-85ba-def26be394f0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.354152 4743 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-11-22T08:24:53.994364263Z","Handler":null,"Name":""} Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.358139 4743 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.358180 4743 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.364344 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.372975 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.465343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.472650 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
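The exchange above resolves the race behind every earlier MountVolume/UnmountVolume failure in this log: the kubelet retried both operations on a 500ms backoff with "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" until the plugin announced itself over /var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock, after which RegisterPlugin validated the driver and the pending TearDown and MountDevice both succeeded. Note that the "list of registered CSI drivers" in those errors is the kubelet's node-local plugin registry, not the cluster-scoped CSIDriver API objects. As a minimal illustrative sketch (not taken from this log), the cluster-side view of registered drivers can be listed with client-go, assuming a standard kubeconfig at the default path:

    // list_csidrivers.go: print the CSIDriver objects the API server knows about.
    // This is the cluster-scoped view; the kubelet's node-local registry (the
    // plugins_registry socket seen above) is what the mount path consults.
    package main

    import (
        "context"
        "fmt"
        "path/filepath"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
        "k8s.io/client-go/util/homedir"
    )

    func main() {
        kubeconfig := filepath.Join(homedir.HomeDir(), ".kube", "config")
        cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        drivers, err := cs.StorageV1().CSIDrivers().List(context.TODO(), metav1.ListOptions{})
        if err != nil {
            panic(err)
        }
        for _, d := range drivers.Items {
            fmt.Println(d.Name) // e.g. kubevirt.io.hostpath-provisioner, once deployed
        }
    }

A driver missing from both views usually means its node plugin pod (here hostpath-provisioner/csi-hostpathplugin-858dv) has not started yet, which matches the ContainerStarted events for that pod immediately surrounding the successful registration and mount.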
Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.472699 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.553490 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-592fz\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:54 crc kubenswrapper[4743]: I1122 08:24:54.699274 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.114564 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-858dv" event={"ID":"ca1bf802-2f8f-4de6-9d36-d0b3e6440865","Type":"ContainerStarted","Data":"db9a3e737a463f4a6d6a379f2eea96b53f7e7cfa1fc3b53ddf3e296982ef5e40"} Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.155299 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:24:55 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:24:55 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:24:55 crc kubenswrapper[4743]: healthz check failed Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.155353 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.218369 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.222151 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qvtpl" Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.222187 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-592fz"] Nov 22 08:24:55 crc kubenswrapper[4743]: W1122 08:24:55.266822 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63019b95_c8f5_4782_85ba_def26be394f0.slice/crio-d11eb4e4e74c9184f354a4d245e061236d7cc26ac1bf6fc64a44b4ddcfe5643d WatchSource:0}: Error finding container d11eb4e4e74c9184f354a4d245e061236d7cc26ac1bf6fc64a44b4ddcfe5643d: Status 404 returned error can't find the container with id 
d11eb4e4e74c9184f354a4d245e061236d7cc26ac1bf6fc64a44b4ddcfe5643d Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.626230 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.798340 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3317794e-757c-471c-ac8b-adad390e622d-kubelet-dir\") pod \"3317794e-757c-471c-ac8b-adad390e622d\" (UID: \"3317794e-757c-471c-ac8b-adad390e622d\") " Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.798922 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3317794e-757c-471c-ac8b-adad390e622d-kube-api-access\") pod \"3317794e-757c-471c-ac8b-adad390e622d\" (UID: \"3317794e-757c-471c-ac8b-adad390e622d\") " Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.798642 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3317794e-757c-471c-ac8b-adad390e622d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3317794e-757c-471c-ac8b-adad390e622d" (UID: "3317794e-757c-471c-ac8b-adad390e622d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.803501 4743 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3317794e-757c-471c-ac8b-adad390e622d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.818197 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3317794e-757c-471c-ac8b-adad390e622d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3317794e-757c-471c-ac8b-adad390e622d" (UID: "3317794e-757c-471c-ac8b-adad390e622d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:24:55 crc kubenswrapper[4743]: I1122 08:24:55.904188 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3317794e-757c-471c-ac8b-adad390e622d-kube-api-access\") on node \"crc\" DevicePath \"\"" Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.131346 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"3317794e-757c-471c-ac8b-adad390e622d","Type":"ContainerDied","Data":"77bf5fe83c1af840892a30aa2f1e2c9467cd01fdee8f62d6e9e6da4cb9a0df08"} Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.131412 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77bf5fe83c1af840892a30aa2f1e2c9467cd01fdee8f62d6e9e6da4cb9a0df08" Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.131519 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.137054 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:24:56 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:24:56 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:24:56 crc kubenswrapper[4743]: healthz check failed Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.137442 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.141984 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-858dv" event={"ID":"ca1bf802-2f8f-4de6-9d36-d0b3e6440865","Type":"ContainerStarted","Data":"f2d996f1bf0bf87e5e7b60d62118dcf09dbf9a0a4a3e26d5ccca2cfe8bf0e0ab"} Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.157713 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" event={"ID":"63019b95-c8f5-4782-85ba-def26be394f0","Type":"ContainerStarted","Data":"393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807"} Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.157895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" event={"ID":"63019b95-c8f5-4782-85ba-def26be394f0","Type":"ContainerStarted","Data":"d11eb4e4e74c9184f354a4d245e061236d7cc26ac1bf6fc64a44b4ddcfe5643d"} Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.157989 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.171023 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-858dv" podStartSLOduration=20.17100375 podStartE2EDuration="20.17100375s" podCreationTimestamp="2025-11-22 08:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:56.170944368 +0000 UTC m=+169.877305430" watchObservedRunningTime="2025-11-22 08:24:56.17100375 +0000 UTC m=+169.877364802" Nov 22 08:24:56 crc kubenswrapper[4743]: I1122 08:24:56.196147 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" podStartSLOduration=148.196126024 podStartE2EDuration="2m28.196126024s" podCreationTimestamp="2025-11-22 08:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:24:56.194271008 +0000 UTC m=+169.900632070" watchObservedRunningTime="2025-11-22 08:24:56.196126024 +0000 UTC m=+169.902487076" Nov 22 08:24:57 crc kubenswrapper[4743]: I1122 08:24:57.135337 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: 
reason withheld Nov 22 08:24:57 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:24:57 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:24:57 crc kubenswrapper[4743]: healthz check failed Nov 22 08:24:57 crc kubenswrapper[4743]: I1122 08:24:57.136281 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:24:58 crc kubenswrapper[4743]: I1122 08:24:58.133650 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:24:58 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:24:58 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:24:58 crc kubenswrapper[4743]: healthz check failed Nov 22 08:24:58 crc kubenswrapper[4743]: I1122 08:24:58.133743 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:24:58 crc kubenswrapper[4743]: I1122 08:24:58.977716 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:24:58 crc kubenswrapper[4743]: I1122 08:24:58.978074 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:24:58 crc kubenswrapper[4743]: I1122 08:24:58.978705 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:24:58 crc kubenswrapper[4743]: I1122 08:24:58.978732 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:24:59 crc kubenswrapper[4743]: I1122 08:24:59.133065 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:24:59 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:24:59 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:24:59 crc kubenswrapper[4743]: healthz check failed Nov 22 08:24:59 crc kubenswrapper[4743]: I1122 08:24:59.133113 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Nov 22 08:24:59 crc kubenswrapper[4743]: I1122 08:24:59.653754 4743 patch_prober.go:28] interesting pod/console-f9d7485db-hhpxp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Nov 22 08:24:59 crc kubenswrapper[4743]: I1122 08:24:59.653811 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hhpxp" podUID="bead015e-e8e8-44f2-8dae-41047cd66706" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Nov 22 08:25:00 crc kubenswrapper[4743]: I1122 08:25:00.133409 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:25:00 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:25:00 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:25:00 crc kubenswrapper[4743]: healthz check failed Nov 22 08:25:00 crc kubenswrapper[4743]: I1122 08:25:00.133519 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:25:01 crc kubenswrapper[4743]: I1122 08:25:01.137004 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:25:01 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:25:01 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:25:01 crc kubenswrapper[4743]: healthz check failed Nov 22 08:25:01 crc kubenswrapper[4743]: I1122 08:25:01.137604 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:25:01 crc kubenswrapper[4743]: I1122 08:25:01.245744 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:25:01 crc kubenswrapper[4743]: I1122 08:25:01.245805 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:25:02 crc kubenswrapper[4743]: I1122 08:25:02.133420 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:25:02 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:25:02 crc 
kubenswrapper[4743]: [+]process-running ok Nov 22 08:25:02 crc kubenswrapper[4743]: healthz check failed Nov 22 08:25:02 crc kubenswrapper[4743]: I1122 08:25:02.133485 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:25:03 crc kubenswrapper[4743]: I1122 08:25:03.136504 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:25:03 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:25:03 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:25:03 crc kubenswrapper[4743]: healthz check failed Nov 22 08:25:03 crc kubenswrapper[4743]: I1122 08:25:03.136665 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:25:04 crc kubenswrapper[4743]: I1122 08:25:04.134463 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:25:04 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:25:04 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:25:04 crc kubenswrapper[4743]: healthz check failed Nov 22 08:25:04 crc kubenswrapper[4743]: I1122 08:25:04.134776 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:25:05 crc kubenswrapper[4743]: I1122 08:25:05.136177 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:25:05 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:25:05 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:25:05 crc kubenswrapper[4743]: healthz check failed Nov 22 08:25:05 crc kubenswrapper[4743]: I1122 08:25:05.136253 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:25:06 crc kubenswrapper[4743]: I1122 08:25:06.134630 4743 patch_prober.go:28] interesting pod/router-default-5444994796-v99cs container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Nov 22 08:25:06 crc kubenswrapper[4743]: [-]has-synced failed: reason withheld Nov 22 08:25:06 crc kubenswrapper[4743]: [+]process-running ok Nov 22 08:25:06 crc kubenswrapper[4743]: healthz check failed Nov 22 08:25:06 crc kubenswrapper[4743]: I1122 08:25:06.134708 4743 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-v99cs" podUID="c0ee3fe1-7f79-4f58-91ae-94a3f046401f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Nov 22 08:25:06 crc kubenswrapper[4743]: I1122 08:25:06.278003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Nov 22 08:25:07 crc kubenswrapper[4743]: I1122 08:25:07.135049 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-v99cs" Nov 22 08:25:07 crc kubenswrapper[4743]: I1122 08:25:07.137785 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-v99cs" Nov 22 08:25:08 crc kubenswrapper[4743]: I1122 08:25:08.977874 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:25:08 crc kubenswrapper[4743]: I1122 08:25:08.978277 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:25:08 crc kubenswrapper[4743]: I1122 08:25:08.978993 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:25:08 crc kubenswrapper[4743]: I1122 08:25:08.979053 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:25:08 crc kubenswrapper[4743]: I1122 08:25:08.979098 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-5tsjj" Nov 22 08:25:08 crc kubenswrapper[4743]: I1122 08:25:08.979498 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:25:08 crc kubenswrapper[4743]: I1122 08:25:08.979521 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:25:08 crc kubenswrapper[4743]: I1122 08:25:08.979697 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"615faf6ed985fe140d6b8a92bf126c051c28f0974976cbe06b85ab0c6df5c48c"} pod="openshift-console/downloads-7954f5f757-5tsjj" containerMessage="Container download-server failed liveness probe, will be restarted" Nov 22 08:25:08 crc kubenswrapper[4743]: I1122 08:25:08.979799 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" containerID="cri-o://615faf6ed985fe140d6b8a92bf126c051c28f0974976cbe06b85ab0c6df5c48c" gracePeriod=2 Nov 22 08:25:09 crc kubenswrapper[4743]: I1122 08:25:09.653872 4743 patch_prober.go:28] interesting pod/console-f9d7485db-hhpxp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Nov 22 08:25:09 crc kubenswrapper[4743]: I1122 08:25:09.653951 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hhpxp" podUID="bead015e-e8e8-44f2-8dae-41047cd66706" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Nov 22 08:25:10 crc kubenswrapper[4743]: I1122 08:25:10.265202 4743 generic.go:334] "Generic (PLEG): container finished" podID="01b809b1-7b62-4043-9411-7194d6e96e47" containerID="615faf6ed985fe140d6b8a92bf126c051c28f0974976cbe06b85ab0c6df5c48c" exitCode=0 Nov 22 08:25:10 crc kubenswrapper[4743]: I1122 08:25:10.265306 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5tsjj" event={"ID":"01b809b1-7b62-4043-9411-7194d6e96e47","Type":"ContainerDied","Data":"615faf6ed985fe140d6b8a92bf126c051c28f0974976cbe06b85ab0c6df5c48c"} Nov 22 08:25:14 crc kubenswrapper[4743]: I1122 08:25:14.707693 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:25:18 crc kubenswrapper[4743]: I1122 08:25:18.977656 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:25:18 crc kubenswrapper[4743]: I1122 08:25:18.977730 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:25:19 crc kubenswrapper[4743]: I1122 08:25:19.660167 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:25:19 crc kubenswrapper[4743]: I1122 08:25:19.666178 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:25:20 crc kubenswrapper[4743]: I1122 08:25:20.099921 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-wbn2s" Nov 22 08:25:26 crc kubenswrapper[4743]: E1122 08:25:26.794724 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 22 08:25:26 crc kubenswrapper[4743]: E1122 08:25:26.795294 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78xrd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-krx5n_openshift-marketplace(ffd1a20f-f616-4301-8c3c-546e5e3d349d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 08:25:26 crc kubenswrapper[4743]: E1122 08:25:26.796634 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-krx5n" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" Nov 22 08:25:27 crc kubenswrapper[4743]: E1122 08:25:27.023344 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Nov 22 08:25:27 crc kubenswrapper[4743]: E1122 08:25:27.023483 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-glrvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-d27kb_openshift-marketplace(60be236b-0a63-4c71-9e90-3d78e811f956): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 08:25:27 crc kubenswrapper[4743]: E1122 08:25:27.024794 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-d27kb" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" Nov 22 08:25:27 crc kubenswrapper[4743]: E1122 08:25:27.122989 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 22 08:25:27 crc kubenswrapper[4743]: E1122 08:25:27.123152 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qvs72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-v4w46_openshift-marketplace(bdaf46ea-41b9-4db9-8b88-43f5a406a910): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 08:25:27 crc kubenswrapper[4743]: E1122 08:25:27.124303 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-v4w46" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" Nov 22 08:25:28 crc kubenswrapper[4743]: E1122 08:25:28.434529 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-d27kb" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" Nov 22 08:25:28 crc kubenswrapper[4743]: E1122 08:25:28.434904 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-v4w46" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" Nov 22 08:25:28 crc kubenswrapper[4743]: E1122 08:25:28.435055 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-krx5n" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" Nov 22 08:25:28 crc kubenswrapper[4743]: I1122 08:25:28.978467 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:25:28 crc kubenswrapper[4743]: I1122 08:25:28.978869 4743 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:25:30 crc kubenswrapper[4743]: E1122 08:25:30.173285 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Nov 22 08:25:30 crc kubenswrapper[4743]: E1122 08:25:30.174824 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpx4c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-dlhns_openshift-marketplace(cba69292-a0c6-4ab8-8fba-1144f4d1e88b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 08:25:30 crc kubenswrapper[4743]: E1122 08:25:30.176029 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-dlhns" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" Nov 22 08:25:31 crc kubenswrapper[4743]: I1122 08:25:31.241392 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:25:31 crc kubenswrapper[4743]: I1122 08:25:31.241460 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:25:31 crc kubenswrapper[4743]: I1122 08:25:31.241512 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:25:31 crc kubenswrapper[4743]: I1122 08:25:31.242141 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 08:25:31 crc kubenswrapper[4743]: I1122 08:25:31.242200 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202" gracePeriod=600 Nov 22 08:25:35 crc kubenswrapper[4743]: E1122 08:25:35.898779 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-dlhns" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" Nov 22 08:25:36 crc kubenswrapper[4743]: I1122 08:25:36.405334 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202" exitCode=0 Nov 22 08:25:36 crc kubenswrapper[4743]: I1122 08:25:36.405411 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202"} Nov 22 08:25:36 crc kubenswrapper[4743]: E1122 08:25:36.687697 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 22 08:25:36 crc kubenswrapper[4743]: E1122 08:25:36.687846 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gvktx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-dvccz_openshift-marketplace(0ac4d9cc-a76c-4061-b234-91ceaa669957): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 08:25:36 crc kubenswrapper[4743]: E1122 08:25:36.689058 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-dvccz" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" Nov 22 08:25:37 crc kubenswrapper[4743]: E1122 08:25:37.581255 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-dvccz" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" Nov 22 08:25:38 crc kubenswrapper[4743]: I1122 08:25:38.979867 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:25:38 crc kubenswrapper[4743]: I1122 08:25:38.980230 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:25:39 crc kubenswrapper[4743]: I1122 08:25:39.423352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-5tsjj" event={"ID":"01b809b1-7b62-4043-9411-7194d6e96e47","Type":"ContainerStarted","Data":"1a09bc05ccdf751ec7e8174c07dc9544c566d59ecb3fda3a97ac3052e1076696"} Nov 22 08:25:39 crc kubenswrapper[4743]: E1122 08:25:39.841716 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying 
system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Nov 22 08:25:39 crc kubenswrapper[4743]: E1122 08:25:39.841976 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5666p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-djmhx_openshift-marketplace(26d26a53-22e4-4d36-9e75-872a43d2a7cc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Nov 22 08:25:39 crc kubenswrapper[4743]: E1122 08:25:39.843124 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-djmhx" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" Nov 22 08:25:40 crc kubenswrapper[4743]: I1122 08:25:40.431225 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dbk7" event={"ID":"812538df-82a4-49d5-b50e-b99315f995ca","Type":"ContainerStarted","Data":"9fe7b8f831743819b77b283e34c2f977f778ba544108d62694c366b44ee7d8ef"} Nov 22 08:25:40 crc kubenswrapper[4743]: I1122 08:25:40.433752 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqk4" event={"ID":"50a7a114-710f-439d-8c79-58a4ba712cda","Type":"ContainerStarted","Data":"0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24"} Nov 22 08:25:40 crc kubenswrapper[4743]: I1122 08:25:40.437168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"b533a206a803330d2ceef7d65a61f0f3c6cea04ce828dfcfa9e6b7b17b4b0e77"} Nov 22 08:25:40 crc kubenswrapper[4743]: E1122 08:25:40.439206 4743 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-djmhx" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" Nov 22 08:25:40 crc kubenswrapper[4743]: I1122 08:25:40.439196 4743 patch_prober.go:28] interesting pod/downloads-7954f5f757-5tsjj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Nov 22 08:25:40 crc kubenswrapper[4743]: I1122 08:25:40.439256 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-5tsjj" podUID="01b809b1-7b62-4043-9411-7194d6e96e47" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Nov 22 08:25:41 crc kubenswrapper[4743]: I1122 08:25:41.448451 4743 generic.go:334] "Generic (PLEG): container finished" podID="812538df-82a4-49d5-b50e-b99315f995ca" containerID="9fe7b8f831743819b77b283e34c2f977f778ba544108d62694c366b44ee7d8ef" exitCode=0 Nov 22 08:25:41 crc kubenswrapper[4743]: I1122 08:25:41.448652 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dbk7" event={"ID":"812538df-82a4-49d5-b50e-b99315f995ca","Type":"ContainerDied","Data":"9fe7b8f831743819b77b283e34c2f977f778ba544108d62694c366b44ee7d8ef"} Nov 22 08:25:41 crc kubenswrapper[4743]: I1122 08:25:41.453154 4743 generic.go:334] "Generic (PLEG): container finished" podID="50a7a114-710f-439d-8c79-58a4ba712cda" containerID="0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24" exitCode=0 Nov 22 08:25:41 crc kubenswrapper[4743]: I1122 08:25:41.453207 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqk4" event={"ID":"50a7a114-710f-439d-8c79-58a4ba712cda","Type":"ContainerDied","Data":"0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24"} Nov 22 08:25:48 crc kubenswrapper[4743]: I1122 08:25:48.984020 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-5tsjj" Nov 22 08:25:48 crc kubenswrapper[4743]: I1122 08:25:48.988479 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-5tsjj" Nov 22 08:26:04 crc kubenswrapper[4743]: I1122 08:26:04.590877 4743 generic.go:334] "Generic (PLEG): container finished" podID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerID="7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb" exitCode=0 Nov 22 08:26:04 crc kubenswrapper[4743]: I1122 08:26:04.590963 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krx5n" event={"ID":"ffd1a20f-f616-4301-8c3c-546e5e3d349d","Type":"ContainerDied","Data":"7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb"} Nov 22 08:26:04 crc kubenswrapper[4743]: I1122 08:26:04.606525 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dbk7" event={"ID":"812538df-82a4-49d5-b50e-b99315f995ca","Type":"ContainerStarted","Data":"06a0f818294a9a734d4421269d73183ab53ae3da18a0d58599496046bb9176cb"} Nov 22 08:26:04 crc kubenswrapper[4743]: I1122 08:26:04.608695 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-wgqk4" event={"ID":"50a7a114-710f-439d-8c79-58a4ba712cda","Type":"ContainerStarted","Data":"1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845"} Nov 22 08:26:04 crc kubenswrapper[4743]: I1122 08:26:04.643533 4743 generic.go:334] "Generic (PLEG): container finished" podID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerID="80cd1f903a5fecc063754b4da3c249a01d786be806e3b6825fe9d16e9d70b4d5" exitCode=0 Nov 22 08:26:04 crc kubenswrapper[4743]: I1122 08:26:04.643603 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4w46" event={"ID":"bdaf46ea-41b9-4db9-8b88-43f5a406a910","Type":"ContainerDied","Data":"80cd1f903a5fecc063754b4da3c249a01d786be806e3b6825fe9d16e9d70b4d5"} Nov 22 08:26:04 crc kubenswrapper[4743]: I1122 08:26:04.647244 4743 generic.go:334] "Generic (PLEG): container finished" podID="60be236b-0a63-4c71-9e90-3d78e811f956" containerID="aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b" exitCode=0 Nov 22 08:26:04 crc kubenswrapper[4743]: I1122 08:26:04.647291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d27kb" event={"ID":"60be236b-0a63-4c71-9e90-3d78e811f956","Type":"ContainerDied","Data":"aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b"} Nov 22 08:26:04 crc kubenswrapper[4743]: I1122 08:26:04.685743 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9dbk7" podStartSLOduration=3.889296175 podStartE2EDuration="1m16.685721068s" podCreationTimestamp="2025-11-22 08:24:48 +0000 UTC" firstStartedPulling="2025-11-22 08:24:50.882718063 +0000 UTC m=+164.589079115" lastFinishedPulling="2025-11-22 08:26:03.679142956 +0000 UTC m=+237.385504008" observedRunningTime="2025-11-22 08:26:04.652012062 +0000 UTC m=+238.358373124" watchObservedRunningTime="2025-11-22 08:26:04.685721068 +0000 UTC m=+238.392082120" Nov 22 08:26:04 crc kubenswrapper[4743]: I1122 08:26:04.708544 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wgqk4" podStartSLOduration=3.884072615 podStartE2EDuration="1m17.708527628s" podCreationTimestamp="2025-11-22 08:24:47 +0000 UTC" firstStartedPulling="2025-11-22 08:24:49.877093971 +0000 UTC m=+163.583455013" lastFinishedPulling="2025-11-22 08:26:03.701548974 +0000 UTC m=+237.407910026" observedRunningTime="2025-11-22 08:26:04.707635921 +0000 UTC m=+238.413996973" watchObservedRunningTime="2025-11-22 08:26:04.708527628 +0000 UTC m=+238.414888670" Nov 22 08:26:05 crc kubenswrapper[4743]: I1122 08:26:05.654177 4743 generic.go:334] "Generic (PLEG): container finished" podID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerID="fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5" exitCode=0 Nov 22 08:26:05 crc kubenswrapper[4743]: I1122 08:26:05.654247 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvccz" event={"ID":"0ac4d9cc-a76c-4061-b234-91ceaa669957","Type":"ContainerDied","Data":"fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5"} Nov 22 08:26:05 crc kubenswrapper[4743]: I1122 08:26:05.657053 4743 generic.go:334] "Generic (PLEG): container finished" podID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerID="e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2" exitCode=0 Nov 22 08:26:05 crc kubenswrapper[4743]: I1122 08:26:05.657130 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhns" event={"ID":"cba69292-a0c6-4ab8-8fba-1144f4d1e88b","Type":"ContainerDied","Data":"e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2"} Nov 22 08:26:05 crc kubenswrapper[4743]: I1122 08:26:05.658889 4743 generic.go:334] "Generic (PLEG): container finished" podID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerID="53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010" exitCode=0 Nov 22 08:26:05 crc kubenswrapper[4743]: I1122 08:26:05.658919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djmhx" event={"ID":"26d26a53-22e4-4d36-9e75-872a43d2a7cc","Type":"ContainerDied","Data":"53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010"} Nov 22 08:26:06 crc kubenswrapper[4743]: I1122 08:26:06.667935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4w46" event={"ID":"bdaf46ea-41b9-4db9-8b88-43f5a406a910","Type":"ContainerStarted","Data":"2ef367e50492f25728c621a644c2d65cebc3970d32798a0eb039a4543808e248"} Nov 22 08:26:06 crc kubenswrapper[4743]: I1122 08:26:06.670196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhns" event={"ID":"cba69292-a0c6-4ab8-8fba-1144f4d1e88b","Type":"ContainerStarted","Data":"c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c"} Nov 22 08:26:06 crc kubenswrapper[4743]: I1122 08:26:06.672499 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djmhx" event={"ID":"26d26a53-22e4-4d36-9e75-872a43d2a7cc","Type":"ContainerStarted","Data":"80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e"} Nov 22 08:26:06 crc kubenswrapper[4743]: I1122 08:26:06.677922 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d27kb" event={"ID":"60be236b-0a63-4c71-9e90-3d78e811f956","Type":"ContainerStarted","Data":"59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0"} Nov 22 08:26:06 crc kubenswrapper[4743]: I1122 08:26:06.687794 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v4w46" podStartSLOduration=2.815579397 podStartE2EDuration="1m17.68777728s" podCreationTimestamp="2025-11-22 08:24:49 +0000 UTC" firstStartedPulling="2025-11-22 08:24:50.887254979 +0000 UTC m=+164.593616031" lastFinishedPulling="2025-11-22 08:26:05.759452862 +0000 UTC m=+239.465813914" observedRunningTime="2025-11-22 08:26:06.68676845 +0000 UTC m=+240.393129502" watchObservedRunningTime="2025-11-22 08:26:06.68777728 +0000 UTC m=+240.394138332" Nov 22 08:26:06 crc kubenswrapper[4743]: I1122 08:26:06.692645 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krx5n" event={"ID":"ffd1a20f-f616-4301-8c3c-546e5e3d349d","Type":"ContainerStarted","Data":"dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255"} Nov 22 08:26:06 crc kubenswrapper[4743]: I1122 08:26:06.705525 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d27kb" podStartSLOduration=2.309107054 podStartE2EDuration="1m20.705502369s" podCreationTimestamp="2025-11-22 08:24:46 +0000 UTC" firstStartedPulling="2025-11-22 08:24:47.783777555 +0000 UTC m=+161.490138607" lastFinishedPulling="2025-11-22 08:26:06.18017287 +0000 UTC m=+239.886533922" observedRunningTime="2025-11-22 
08:26:06.703727716 +0000 UTC m=+240.410088768" watchObservedRunningTime="2025-11-22 08:26:06.705502369 +0000 UTC m=+240.411863421" Nov 22 08:26:06 crc kubenswrapper[4743]: I1122 08:26:06.726274 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dlhns" podStartSLOduration=3.179829587 podStartE2EDuration="1m18.726256228s" podCreationTimestamp="2025-11-22 08:24:48 +0000 UTC" firstStartedPulling="2025-11-22 08:24:50.89429303 +0000 UTC m=+164.600654172" lastFinishedPulling="2025-11-22 08:26:06.440719761 +0000 UTC m=+240.147080813" observedRunningTime="2025-11-22 08:26:06.725811044 +0000 UTC m=+240.432172116" watchObservedRunningTime="2025-11-22 08:26:06.726256228 +0000 UTC m=+240.432617280" Nov 22 08:26:06 crc kubenswrapper[4743]: I1122 08:26:06.746842 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-djmhx" podStartSLOduration=3.266981756 podStartE2EDuration="1m20.746829521s" podCreationTimestamp="2025-11-22 08:24:46 +0000 UTC" firstStartedPulling="2025-11-22 08:24:48.820117868 +0000 UTC m=+162.526478930" lastFinishedPulling="2025-11-22 08:26:06.299965643 +0000 UTC m=+240.006326695" observedRunningTime="2025-11-22 08:26:06.745908674 +0000 UTC m=+240.452269726" watchObservedRunningTime="2025-11-22 08:26:06.746829521 +0000 UTC m=+240.453190573" Nov 22 08:26:06 crc kubenswrapper[4743]: I1122 08:26:06.768111 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-krx5n" podStartSLOduration=3.849977535 podStartE2EDuration="1m21.768088645s" podCreationTimestamp="2025-11-22 08:24:45 +0000 UTC" firstStartedPulling="2025-11-22 08:24:47.786245069 +0000 UTC m=+161.492606121" lastFinishedPulling="2025-11-22 08:26:05.704356179 +0000 UTC m=+239.410717231" observedRunningTime="2025-11-22 08:26:06.763420166 +0000 UTC m=+240.469781218" watchObservedRunningTime="2025-11-22 08:26:06.768088645 +0000 UTC m=+240.474449717" Nov 22 08:26:07 crc kubenswrapper[4743]: I1122 08:26:07.700019 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvccz" event={"ID":"0ac4d9cc-a76c-4061-b234-91ceaa669957","Type":"ContainerStarted","Data":"c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3"} Nov 22 08:26:07 crc kubenswrapper[4743]: I1122 08:26:07.725386 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dvccz" podStartSLOduration=3.838468511 podStartE2EDuration="1m22.725365585s" podCreationTimestamp="2025-11-22 08:24:45 +0000 UTC" firstStartedPulling="2025-11-22 08:24:47.790883488 +0000 UTC m=+161.497244540" lastFinishedPulling="2025-11-22 08:26:06.677780562 +0000 UTC m=+240.384141614" observedRunningTime="2025-11-22 08:26:07.723805939 +0000 UTC m=+241.430167011" watchObservedRunningTime="2025-11-22 08:26:07.725365585 +0000 UTC m=+241.431726637" Nov 22 08:26:08 crc kubenswrapper[4743]: I1122 08:26:08.170626 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:26:08 crc kubenswrapper[4743]: I1122 08:26:08.170701 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:26:08 crc kubenswrapper[4743]: I1122 08:26:08.452561 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wgqk4" 
Nov 22 08:26:08 crc kubenswrapper[4743]: I1122 08:26:08.578846 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:26:08 crc kubenswrapper[4743]: I1122 08:26:08.578919 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:26:08 crc kubenswrapper[4743]: I1122 08:26:08.623995 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:26:08 crc kubenswrapper[4743]: I1122 08:26:08.747559 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:26:08 crc kubenswrapper[4743]: I1122 08:26:08.759552 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:26:09 crc kubenswrapper[4743]: I1122 08:26:09.197389 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:26:09 crc kubenswrapper[4743]: I1122 08:26:09.197785 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:26:09 crc kubenswrapper[4743]: I1122 08:26:09.614566 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:26:09 crc kubenswrapper[4743]: I1122 08:26:09.615007 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:26:10 crc kubenswrapper[4743]: I1122 08:26:10.247336 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dlhns" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerName="registry-server" probeResult="failure" output=< Nov 22 08:26:10 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 08:26:10 crc kubenswrapper[4743]: > Nov 22 08:26:10 crc kubenswrapper[4743]: I1122 08:26:10.651404 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v4w46" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerName="registry-server" probeResult="failure" output=< Nov 22 08:26:10 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 08:26:10 crc kubenswrapper[4743]: > Nov 22 08:26:12 crc kubenswrapper[4743]: I1122 08:26:12.824188 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dbk7"] Nov 22 08:26:12 crc kubenswrapper[4743]: I1122 08:26:12.824564 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9dbk7" podUID="812538df-82a4-49d5-b50e-b99315f995ca" containerName="registry-server" containerID="cri-o://06a0f818294a9a734d4421269d73183ab53ae3da18a0d58599496046bb9176cb" gracePeriod=2 Nov 22 08:26:13 crc kubenswrapper[4743]: I1122 08:26:13.733270 4743 generic.go:334] "Generic (PLEG): container finished" podID="812538df-82a4-49d5-b50e-b99315f995ca" containerID="06a0f818294a9a734d4421269d73183ab53ae3da18a0d58599496046bb9176cb" exitCode=0 Nov 22 08:26:13 crc kubenswrapper[4743]: I1122 08:26:13.733332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dbk7" 
event={"ID":"812538df-82a4-49d5-b50e-b99315f995ca","Type":"ContainerDied","Data":"06a0f818294a9a734d4421269d73183ab53ae3da18a0d58599496046bb9176cb"} Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.239529 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.320722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-catalog-content\") pod \"812538df-82a4-49d5-b50e-b99315f995ca\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.320832 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-utilities\") pod \"812538df-82a4-49d5-b50e-b99315f995ca\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.320897 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmz8d\" (UniqueName: \"kubernetes.io/projected/812538df-82a4-49d5-b50e-b99315f995ca-kube-api-access-bmz8d\") pod \"812538df-82a4-49d5-b50e-b99315f995ca\" (UID: \"812538df-82a4-49d5-b50e-b99315f995ca\") " Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.321794 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-utilities" (OuterVolumeSpecName: "utilities") pod "812538df-82a4-49d5-b50e-b99315f995ca" (UID: "812538df-82a4-49d5-b50e-b99315f995ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.327734 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812538df-82a4-49d5-b50e-b99315f995ca-kube-api-access-bmz8d" (OuterVolumeSpecName: "kube-api-access-bmz8d") pod "812538df-82a4-49d5-b50e-b99315f995ca" (UID: "812538df-82a4-49d5-b50e-b99315f995ca"). InnerVolumeSpecName "kube-api-access-bmz8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.338828 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "812538df-82a4-49d5-b50e-b99315f995ca" (UID: "812538df-82a4-49d5-b50e-b99315f995ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.422132 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.422175 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/812538df-82a4-49d5-b50e-b99315f995ca-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.422191 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmz8d\" (UniqueName: \"kubernetes.io/projected/812538df-82a4-49d5-b50e-b99315f995ca-kube-api-access-bmz8d\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.747723 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dbk7" event={"ID":"812538df-82a4-49d5-b50e-b99315f995ca","Type":"ContainerDied","Data":"84949486679eb90221803590c54ba157c606785204fad779afa5829b6be0e676"} Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.747786 4743 scope.go:117] "RemoveContainer" containerID="06a0f818294a9a734d4421269d73183ab53ae3da18a0d58599496046bb9176cb" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.747841 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dbk7" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.778876 4743 scope.go:117] "RemoveContainer" containerID="9fe7b8f831743819b77b283e34c2f977f778ba544108d62694c366b44ee7d8ef" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.781075 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dbk7"] Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.785033 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dbk7"] Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.808642 4743 scope.go:117] "RemoveContainer" containerID="5c0ae3e67df091bc50b00a0142160a924deedfa7a7ed56ed7c050c31d2267190" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.966743 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-krx5n" Nov 22 08:26:15 crc kubenswrapper[4743]: I1122 08:26:15.967041 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-krx5n" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.010117 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-krx5n" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.183866 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dvccz" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.184107 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dvccz" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.219924 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dvccz" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.357348 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.357410 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.405744 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.571327 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.571992 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.619751 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.788722 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-krx5n" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.796602 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.801370 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:26:16 crc kubenswrapper[4743]: I1122 08:26:16.836561 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dvccz" Nov 22 08:26:17 crc kubenswrapper[4743]: I1122 08:26:17.159866 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="812538df-82a4-49d5-b50e-b99315f995ca" path="/var/lib/kubelet/pods/812538df-82a4-49d5-b50e-b99315f995ca/volumes" Nov 22 08:26:18 crc kubenswrapper[4743]: I1122 08:26:18.425082 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d27kb"] Nov 22 08:26:18 crc kubenswrapper[4743]: I1122 08:26:18.769397 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d27kb" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" containerName="registry-server" containerID="cri-o://59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0" gracePeriod=2 Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.200996 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.260550 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.274920 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-catalog-content\") pod \"60be236b-0a63-4c71-9e90-3d78e811f956\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.275128 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glrvz\" (UniqueName: \"kubernetes.io/projected/60be236b-0a63-4c71-9e90-3d78e811f956-kube-api-access-glrvz\") pod \"60be236b-0a63-4c71-9e90-3d78e811f956\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.275262 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-utilities\") pod \"60be236b-0a63-4c71-9e90-3d78e811f956\" (UID: \"60be236b-0a63-4c71-9e90-3d78e811f956\") " Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.277107 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-utilities" (OuterVolumeSpecName: "utilities") pod "60be236b-0a63-4c71-9e90-3d78e811f956" (UID: "60be236b-0a63-4c71-9e90-3d78e811f956"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.285411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60be236b-0a63-4c71-9e90-3d78e811f956-kube-api-access-glrvz" (OuterVolumeSpecName: "kube-api-access-glrvz") pod "60be236b-0a63-4c71-9e90-3d78e811f956" (UID: "60be236b-0a63-4c71-9e90-3d78e811f956"). InnerVolumeSpecName "kube-api-access-glrvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.302098 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.376840 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.376887 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glrvz\" (UniqueName: \"kubernetes.io/projected/60be236b-0a63-4c71-9e90-3d78e811f956-kube-api-access-glrvz\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.497730 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60be236b-0a63-4c71-9e90-3d78e811f956" (UID: "60be236b-0a63-4c71-9e90-3d78e811f956"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.579304 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60be236b-0a63-4c71-9e90-3d78e811f956-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.666155 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.721873 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.775624 4743 generic.go:334] "Generic (PLEG): container finished" podID="60be236b-0a63-4c71-9e90-3d78e811f956" containerID="59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0" exitCode=0 Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.775701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d27kb" event={"ID":"60be236b-0a63-4c71-9e90-3d78e811f956","Type":"ContainerDied","Data":"59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0"} Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.775715 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d27kb" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.775762 4743 scope.go:117] "RemoveContainer" containerID="59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.775750 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d27kb" event={"ID":"60be236b-0a63-4c71-9e90-3d78e811f956","Type":"ContainerDied","Data":"4b64437fb424b2f06d11142b99a9045e6abcb9a292591f5737f8bec623b37635"} Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.793705 4743 scope.go:117] "RemoveContainer" containerID="aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.813227 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d27kb"] Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.813640 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d27kb"] Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.825909 4743 scope.go:117] "RemoveContainer" containerID="cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.842698 4743 scope.go:117] "RemoveContainer" containerID="59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0" Nov 22 08:26:19 crc kubenswrapper[4743]: E1122 08:26:19.843157 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0\": container with ID starting with 59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0 not found: ID does not exist" containerID="59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.843195 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0"} err="failed to get container status \"59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0\": rpc error: code = NotFound desc = could not find container \"59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0\": container with ID starting with 59f0584bcccffabd8382651fa816cd4afebd8bdd853367d910b11deba0f118b0 not found: ID does not exist" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.843217 4743 scope.go:117] "RemoveContainer" containerID="aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b" Nov 22 08:26:19 crc kubenswrapper[4743]: E1122 08:26:19.843524 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b\": container with ID starting with aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b not found: ID does not exist" containerID="aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.843548 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b"} err="failed to get container status \"aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b\": rpc error: code = NotFound desc = could not find container \"aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b\": container with ID starting with aedfe05ae35dd8c990286197208bb8ffddbf1c1135dab58077d191d4d0e96a6b not found: ID does not exist" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.843563 4743 scope.go:117] "RemoveContainer" containerID="cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079" Nov 22 08:26:19 crc kubenswrapper[4743]: E1122 08:26:19.843834 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079\": container with ID starting with cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079 not found: ID does not exist" containerID="cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079" Nov 22 08:26:19 crc kubenswrapper[4743]: I1122 08:26:19.843854 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079"} err="failed to get container status \"cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079\": rpc error: code = NotFound desc = could not find container \"cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079\": container with ID starting with cd284823a1467f9d09fcb07f5ea2afc8957497e2b2f7fd5cd7637ce2adcc3079 not found: ID does not exist" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.223543 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djmhx"] Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.223837 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-djmhx" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerName="registry-server" containerID="cri-o://80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e" gracePeriod=2 Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.599837 4743 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.696295 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-utilities\") pod \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.696402 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5666p\" (UniqueName: \"kubernetes.io/projected/26d26a53-22e4-4d36-9e75-872a43d2a7cc-kube-api-access-5666p\") pod \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.696556 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-catalog-content\") pod \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\" (UID: \"26d26a53-22e4-4d36-9e75-872a43d2a7cc\") " Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.697415 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-utilities" (OuterVolumeSpecName: "utilities") pod "26d26a53-22e4-4d36-9e75-872a43d2a7cc" (UID: "26d26a53-22e4-4d36-9e75-872a43d2a7cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.703187 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d26a53-22e4-4d36-9e75-872a43d2a7cc-kube-api-access-5666p" (OuterVolumeSpecName: "kube-api-access-5666p") pod "26d26a53-22e4-4d36-9e75-872a43d2a7cc" (UID: "26d26a53-22e4-4d36-9e75-872a43d2a7cc"). InnerVolumeSpecName "kube-api-access-5666p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.751179 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26d26a53-22e4-4d36-9e75-872a43d2a7cc" (UID: "26d26a53-22e4-4d36-9e75-872a43d2a7cc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.783150 4743 generic.go:334] "Generic (PLEG): container finished" podID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerID="80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e" exitCode=0 Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.783227 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djmhx" event={"ID":"26d26a53-22e4-4d36-9e75-872a43d2a7cc","Type":"ContainerDied","Data":"80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e"} Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.783254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-djmhx" event={"ID":"26d26a53-22e4-4d36-9e75-872a43d2a7cc","Type":"ContainerDied","Data":"06c35630f088395ea4d6e24d9aba4bf06c05d1dd24764db44e4000aecc732005"} Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.783273 4743 scope.go:117] "RemoveContainer" containerID="80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.783212 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-djmhx" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.797669 4743 scope.go:117] "RemoveContainer" containerID="53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.798311 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5666p\" (UniqueName: \"kubernetes.io/projected/26d26a53-22e4-4d36-9e75-872a43d2a7cc-kube-api-access-5666p\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.798338 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.798352 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d26a53-22e4-4d36-9e75-872a43d2a7cc-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.811812 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-djmhx"] Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.816102 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-djmhx"] Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.823256 4743 scope.go:117] "RemoveContainer" containerID="c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.834340 4743 scope.go:117] "RemoveContainer" containerID="80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e" Nov 22 08:26:20 crc kubenswrapper[4743]: E1122 08:26:20.834738 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e\": container with ID starting with 80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e not found: ID does not exist" containerID="80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.834784 
4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e"} err="failed to get container status \"80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e\": rpc error: code = NotFound desc = could not find container \"80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e\": container with ID starting with 80b386ada381d9dbce471942915a7a3bf07d82714a37c1c389d3dc33d4029c9e not found: ID does not exist" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.834815 4743 scope.go:117] "RemoveContainer" containerID="53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010" Nov 22 08:26:20 crc kubenswrapper[4743]: E1122 08:26:20.835266 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010\": container with ID starting with 53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010 not found: ID does not exist" containerID="53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.835306 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010"} err="failed to get container status \"53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010\": rpc error: code = NotFound desc = could not find container \"53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010\": container with ID starting with 53c7a2fdb216b3071bdb8908c5345a6aa7edaeef8e39a29c2ea9fad066184010 not found: ID does not exist" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.835330 4743 scope.go:117] "RemoveContainer" containerID="c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30" Nov 22 08:26:20 crc kubenswrapper[4743]: E1122 08:26:20.835607 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30\": container with ID starting with c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30 not found: ID does not exist" containerID="c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30" Nov 22 08:26:20 crc kubenswrapper[4743]: I1122 08:26:20.835640 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30"} err="failed to get container status \"c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30\": rpc error: code = NotFound desc = could not find container \"c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30\": container with ID starting with c35b5f91238f5da38b75ee2f03e113b19c344213e4066e36580e533857083e30 not found: ID does not exist" Nov 22 08:26:21 crc kubenswrapper[4743]: I1122 08:26:21.159236 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" path="/var/lib/kubelet/pods/26d26a53-22e4-4d36-9e75-872a43d2a7cc/volumes" Nov 22 08:26:21 crc kubenswrapper[4743]: I1122 08:26:21.159909 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" path="/var/lib/kubelet/pods/60be236b-0a63-4c71-9e90-3d78e811f956/volumes" Nov 22 08:26:21 crc kubenswrapper[4743]: I1122 
08:26:21.793141 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-42kzd"] Nov 22 08:26:22 crc kubenswrapper[4743]: I1122 08:26:22.819565 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v4w46"] Nov 22 08:26:22 crc kubenswrapper[4743]: I1122 08:26:22.819778 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v4w46" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerName="registry-server" containerID="cri-o://2ef367e50492f25728c621a644c2d65cebc3970d32798a0eb039a4543808e248" gracePeriod=2 Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.811069 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v4w46" Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.811246 4743 generic.go:334] "Generic (PLEG): container finished" podID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerID="2ef367e50492f25728c621a644c2d65cebc3970d32798a0eb039a4543808e248" exitCode=0 Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.811269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4w46" event={"ID":"bdaf46ea-41b9-4db9-8b88-43f5a406a910","Type":"ContainerDied","Data":"2ef367e50492f25728c621a644c2d65cebc3970d32798a0eb039a4543808e248"} Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.811609 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v4w46" event={"ID":"bdaf46ea-41b9-4db9-8b88-43f5a406a910","Type":"ContainerDied","Data":"48dba3bd780286d43ec600e3e9e85a8bd99dca953592f252d877fe13c02d1a26"} Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.811639 4743 scope.go:117] "RemoveContainer" containerID="2ef367e50492f25728c621a644c2d65cebc3970d32798a0eb039a4543808e248" Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.836488 4743 scope.go:117] "RemoveContainer" containerID="80cd1f903a5fecc063754b4da3c249a01d786be806e3b6825fe9d16e9d70b4d5" Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.852758 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvs72\" (UniqueName: \"kubernetes.io/projected/bdaf46ea-41b9-4db9-8b88-43f5a406a910-kube-api-access-qvs72\") pod \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.852802 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-catalog-content\") pod \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.852838 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-utilities\") pod \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\" (UID: \"bdaf46ea-41b9-4db9-8b88-43f5a406a910\") " Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.853809 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-utilities" (OuterVolumeSpecName: "utilities") pod "bdaf46ea-41b9-4db9-8b88-43f5a406a910" (UID: "bdaf46ea-41b9-4db9-8b88-43f5a406a910"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.861873 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdaf46ea-41b9-4db9-8b88-43f5a406a910-kube-api-access-qvs72" (OuterVolumeSpecName: "kube-api-access-qvs72") pod "bdaf46ea-41b9-4db9-8b88-43f5a406a910" (UID: "bdaf46ea-41b9-4db9-8b88-43f5a406a910"). InnerVolumeSpecName "kube-api-access-qvs72". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.865875 4743 scope.go:117] "RemoveContainer" containerID="f35c9152254195699523489d96332a3ec727ffde12b11a8d084e4704c0f8c619"
Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.954652 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 08:26:24 crc kubenswrapper[4743]: I1122 08:26:24.954688 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvs72\" (UniqueName: \"kubernetes.io/projected/bdaf46ea-41b9-4db9-8b88-43f5a406a910-kube-api-access-qvs72\") on node \"crc\" DevicePath \"\""
Nov 22 08:26:25 crc kubenswrapper[4743]: I1122 08:26:25.009604 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdaf46ea-41b9-4db9-8b88-43f5a406a910" (UID: "bdaf46ea-41b9-4db9-8b88-43f5a406a910"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:26:25 crc kubenswrapper[4743]: I1122 08:26:25.055747 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdaf46ea-41b9-4db9-8b88-43f5a406a910-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 08:26:25 crc kubenswrapper[4743]: I1122 08:26:25.817487 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v4w46"
Nov 22 08:26:25 crc kubenswrapper[4743]: I1122 08:26:25.834018 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v4w46"]
Nov 22 08:26:25 crc kubenswrapper[4743]: I1122 08:26:25.836921 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v4w46"]
Nov 22 08:26:27 crc kubenswrapper[4743]: I1122 08:26:27.160033 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" path="/var/lib/kubelet/pods/bdaf46ea-41b9-4db9-8b88-43f5a406a910/volumes"
Nov 22 08:26:46 crc kubenswrapper[4743]: I1122 08:26:46.822552 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" podUID="9e14fb50-5723-489d-acc2-c5ca42234b73" containerName="oauth-openshift" containerID="cri-o://672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede" gracePeriod=15
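The "Killing container with a grace period" entries here (gracePeriod=2 for the registry-server above, gracePeriod=15 for oauth-openshift) follow the standard termination sequence: the runtime delivers SIGTERM, waits out the grace period, and escalates to SIGKILL only if the process is still alive. A rough Go illustration of that pattern, under the stated assumption that we signal a local child process directly; the kubelet actually drives this through the CRI rather than raw PIDs:

```go
package main

import (
	"fmt"
	"os"
	"syscall"
	"time"
)

// stopWithGrace sketches the "Killing container with a grace period"
// sequence: deliver SIGTERM, wait up to grace for exit, then SIGKILL.
// pid and grace are illustrative values, not anything read from this log.
func stopWithGrace(pid int, grace time.Duration) error {
	proc, err := os.FindProcess(pid)
	if err != nil {
		return err
	}
	if err := proc.Signal(syscall.SIGTERM); err != nil {
		return err
	}
	done := make(chan error, 1)
	go func() {
		_, err := proc.Wait() // Wait only works for child processes
		done <- err
	}()
	select {
	case err := <-done:
		return err // exited within the grace period (cf. exitCode=0 above)
	case <-time.After(grace):
		return proc.Signal(syscall.SIGKILL) // grace expired; force kill
	}
}

func main() {
	// Purely illustrative invocation; 12345 is a made-up PID.
	fmt.Println(stopWithGrace(12345, 15*time.Second))
}
```

In the log, both containers exit within their grace period (the PLEG reports exitCode=0), so the SIGKILL branch never fires.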
Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.193730 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd"
Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.229658 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-679ddc4df6-c9476"]
Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.229861 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerName="extract-content"
Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.229872 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerName="extract-content"
Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.229882 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" containerName="extract-content"
Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.229888 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" containerName="extract-content"
Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.229895 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812538df-82a4-49d5-b50e-b99315f995ca" containerName="extract-utilities"
Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.229901 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="812538df-82a4-49d5-b50e-b99315f995ca" containerName="extract-utilities"
Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.229910 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerName="registry-server"
Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.229916 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerName="registry-server"
Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.229924 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerName="registry-server"
Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.229931 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerName="registry-server"
Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.229938 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" containerName="extract-utilities"
Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.229943 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" containerName="extract-utilities"
Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.229952 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad768e0-3532-44b1-a3fb-d5db53e76bdf" containerName="pruner"
Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.229961 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad768e0-3532-44b1-a3fb-d5db53e76bdf" containerName="pruner"
Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.229972 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerName="extract-utilities"
Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.229980 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerName="extract-utilities"
Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.229991 4743 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" containerName="registry-server" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.229998 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" containerName="registry-server" Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.230007 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerName="extract-utilities" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230014 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerName="extract-utilities" Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.230025 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e14fb50-5723-489d-acc2-c5ca42234b73" containerName="oauth-openshift" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230031 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e14fb50-5723-489d-acc2-c5ca42234b73" containerName="oauth-openshift" Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.230040 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerName="extract-content" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230046 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerName="extract-content" Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.230058 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3317794e-757c-471c-ac8b-adad390e622d" containerName="pruner" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230063 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3317794e-757c-471c-ac8b-adad390e622d" containerName="pruner" Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.230071 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812538df-82a4-49d5-b50e-b99315f995ca" containerName="extract-content" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230076 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="812538df-82a4-49d5-b50e-b99315f995ca" containerName="extract-content" Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.230084 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812538df-82a4-49d5-b50e-b99315f995ca" containerName="registry-server" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230090 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="812538df-82a4-49d5-b50e-b99315f995ca" containerName="registry-server" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230194 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="60be236b-0a63-4c71-9e90-3d78e811f956" containerName="registry-server" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230204 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad768e0-3532-44b1-a3fb-d5db53e76bdf" containerName="pruner" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230211 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="26d26a53-22e4-4d36-9e75-872a43d2a7cc" containerName="registry-server" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230219 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdaf46ea-41b9-4db9-8b88-43f5a406a910" containerName="registry-server" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230227 4743 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3317794e-757c-471c-ac8b-adad390e622d" containerName="pruner" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230236 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="812538df-82a4-49d5-b50e-b99315f995ca" containerName="registry-server" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230244 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e14fb50-5723-489d-acc2-c5ca42234b73" containerName="oauth-openshift" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.230813 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.233881 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-cliconfig\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.235145 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.235347 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h88lm\" (UniqueName: \"kubernetes.io/projected/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-kube-api-access-h88lm\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.235465 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-audit-policies\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.235508 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-router-certs\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.235536 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-audit-dir\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.235562 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-template-login\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.235606 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.235830 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-template-error\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.236227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.236302 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.236332 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.236360 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-session\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.236398 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-service-ca\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.236423 
4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.236513 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.236867 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.246462 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-679ddc4df6-c9476"] Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337359 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-idp-0-file-data\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337399 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-policies\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337416 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-serving-cert\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337432 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-dir\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337447 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-router-certs\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337469 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-trusted-ca-bundle\") pod 
\"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337486 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-ocp-branding-template\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337506 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-provider-selection\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337531 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5zbj\" (UniqueName: \"kubernetes.io/projected/9e14fb50-5723-489d-acc2-c5ca42234b73-kube-api-access-j5zbj\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337550 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-login\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-session\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337600 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-error\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337615 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-service-ca\") pod \"9e14fb50-5723-489d-acc2-c5ca42234b73\" (UID: \"9e14fb50-5723-489d-acc2-c5ca42234b73\") " Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337689 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-audit-policies\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-audit-dir\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " 
pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337727 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-router-certs\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337746 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-template-login\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337761 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-template-error\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337814 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337834 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337866 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-session\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " 
pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337883 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-service-ca\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337927 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337951 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h88lm\" (UniqueName: \"kubernetes.io/projected/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-kube-api-access-h88lm\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.337991 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.338096 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-audit-dir\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.338453 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-audit-policies\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.338783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.338969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.340267 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.341062 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.342494 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-service-ca\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.343243 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.346971 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.347007 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.347114 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-template-login\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.347149 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.347211 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.347805 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.348694 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.348887 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-session\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.349270 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-template-error\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.348936 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.349194 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.349459 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-router-certs\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.348706 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.349818 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.349864 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e14fb50-5723-489d-acc2-c5ca42234b73-kube-api-access-j5zbj" (OuterVolumeSpecName: "kube-api-access-j5zbj") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "kube-api-access-j5zbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.351037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9e14fb50-5723-489d-acc2-c5ca42234b73" (UID: "9e14fb50-5723-489d-acc2-c5ca42234b73"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.352809 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.354325 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h88lm\" (UniqueName: \"kubernetes.io/projected/fe4f3711-a28f-4b1c-a238-c532ab0fbec6-kube-api-access-h88lm\") pod \"oauth-openshift-679ddc4df6-c9476\" (UID: \"fe4f3711-a28f-4b1c-a238-c532ab0fbec6\") " pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438516 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438597 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438619 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438630 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438641 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438653 4743 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-policies\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438688 4743 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9e14fb50-5723-489d-acc2-c5ca42234b73-audit-dir\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438699 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438710 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc 
kubenswrapper[4743]: I1122 08:26:47.438721 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438733 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438776 4743 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9e14fb50-5723-489d-acc2-c5ca42234b73-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.438789 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5zbj\" (UniqueName: \"kubernetes.io/projected/9e14fb50-5723-489d-acc2-c5ca42234b73-kube-api-access-j5zbj\") on node \"crc\" DevicePath \"\"" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.558949 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.932791 4743 generic.go:334] "Generic (PLEG): container finished" podID="9e14fb50-5723-489d-acc2-c5ca42234b73" containerID="672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede" exitCode=0 Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.932856 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" event={"ID":"9e14fb50-5723-489d-acc2-c5ca42234b73","Type":"ContainerDied","Data":"672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede"} Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.932964 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.933072 4743 scope.go:117] "RemoveContainer" containerID="672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.933059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-42kzd" event={"ID":"9e14fb50-5723-489d-acc2-c5ca42234b73","Type":"ContainerDied","Data":"b5730219dd95aed9076dd39d5d85bbe4c233cd353f153cc89cafefc248932de2"} Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.964123 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-679ddc4df6-c9476"] Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.981625 4743 scope.go:117] "RemoveContainer" containerID="672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede" Nov 22 08:26:47 crc kubenswrapper[4743]: E1122 08:26:47.982170 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede\": container with ID starting with 672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede not found: ID does not exist" containerID="672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede" Nov 22 08:26:47 crc kubenswrapper[4743]: I1122 08:26:47.982235 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede"} err="failed to get container status \"672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede\": rpc error: code = NotFound desc = could not find container \"672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede\": container with ID starting with 672a34dc82cc5f5b32fb49b0707438346d4c55c47a7dc60869929b1470b4fede not found: ID does not exist" Nov 22 08:26:48 crc kubenswrapper[4743]: I1122 08:26:48.007602 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-42kzd"] Nov 22 08:26:48 crc kubenswrapper[4743]: I1122 08:26:48.010725 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-42kzd"] Nov 22 08:26:48 crc kubenswrapper[4743]: I1122 08:26:48.944260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" event={"ID":"fe4f3711-a28f-4b1c-a238-c532ab0fbec6","Type":"ContainerStarted","Data":"aa7505b332d1a1575eb9c03c56decab00883465e37f643b67a7e08dcb9773d6e"} Nov 22 08:26:48 crc kubenswrapper[4743]: I1122 08:26:48.944628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" event={"ID":"fe4f3711-a28f-4b1c-a238-c532ab0fbec6","Type":"ContainerStarted","Data":"29b6b31f45b4fdea74b75d4f53714fb9a860cd1c896dad00aceb3bc762fc53f7"} Nov 22 08:26:48 crc kubenswrapper[4743]: I1122 08:26:48.945646 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:48 crc kubenswrapper[4743]: I1122 08:26:48.950038 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" Nov 22 08:26:49 crc kubenswrapper[4743]: I1122 08:26:49.001496 4743 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-authentication/oauth-openshift-679ddc4df6-c9476" podStartSLOduration=28.001469449 podStartE2EDuration="28.001469449s" podCreationTimestamp="2025-11-22 08:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:26:48.968175272 +0000 UTC m=+282.674536324" watchObservedRunningTime="2025-11-22 08:26:49.001469449 +0000 UTC m=+282.707830501" Nov 22 08:26:49 crc kubenswrapper[4743]: I1122 08:26:49.160180 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e14fb50-5723-489d-acc2-c5ca42234b73" path="/var/lib/kubelet/pods/9e14fb50-5723-489d-acc2-c5ca42234b73/volumes" Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.880638 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krx5n"] Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.881486 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-krx5n" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerName="registry-server" containerID="cri-o://dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255" gracePeriod=30 Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.884372 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dvccz"] Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.884606 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dvccz" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerName="registry-server" containerID="cri-o://c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3" gracePeriod=30 Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.894685 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dm8fj"] Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.894954 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" podUID="07f6c2e0-4230-40e0-ad71-2f652546cd38" containerName="marketplace-operator" containerID="cri-o://5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4" gracePeriod=30 Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.905902 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqk4"] Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.906245 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wgqk4" podUID="50a7a114-710f-439d-8c79-58a4ba712cda" containerName="registry-server" containerID="cri-o://1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845" gracePeriod=30 Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.911787 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5zpxd"] Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.912613 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.915717 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dlhns"] Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.916024 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dlhns" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerName="registry-server" containerID="cri-o://c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c" gracePeriod=30 Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.919838 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5zpxd"] Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.942234 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5jt\" (UniqueName: \"kubernetes.io/projected/87ba7bcd-5643-4c3a-a351-554d57e3c8a0-kube-api-access-sj5jt\") pod \"marketplace-operator-79b997595-5zpxd\" (UID: \"87ba7bcd-5643-4c3a-a351-554d57e3c8a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.942279 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87ba7bcd-5643-4c3a-a351-554d57e3c8a0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5zpxd\" (UID: \"87ba7bcd-5643-4c3a-a351-554d57e3c8a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" Nov 22 08:27:08 crc kubenswrapper[4743]: I1122 08:27:08.942483 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87ba7bcd-5643-4c3a-a351-554d57e3c8a0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5zpxd\" (UID: \"87ba7bcd-5643-4c3a-a351-554d57e3c8a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.043346 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87ba7bcd-5643-4c3a-a351-554d57e3c8a0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5zpxd\" (UID: \"87ba7bcd-5643-4c3a-a351-554d57e3c8a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.043411 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5jt\" (UniqueName: \"kubernetes.io/projected/87ba7bcd-5643-4c3a-a351-554d57e3c8a0-kube-api-access-sj5jt\") pod \"marketplace-operator-79b997595-5zpxd\" (UID: \"87ba7bcd-5643-4c3a-a351-554d57e3c8a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.043440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87ba7bcd-5643-4c3a-a351-554d57e3c8a0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5zpxd\" (UID: \"87ba7bcd-5643-4c3a-a351-554d57e3c8a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.045128 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/87ba7bcd-5643-4c3a-a351-554d57e3c8a0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5zpxd\" (UID: \"87ba7bcd-5643-4c3a-a351-554d57e3c8a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd"
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.050450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/87ba7bcd-5643-4c3a-a351-554d57e3c8a0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5zpxd\" (UID: \"87ba7bcd-5643-4c3a-a351-554d57e3c8a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd"
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.063629 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5jt\" (UniqueName: \"kubernetes.io/projected/87ba7bcd-5643-4c3a-a351-554d57e3c8a0-kube-api-access-sj5jt\") pod \"marketplace-operator-79b997595-5zpxd\" (UID: \"87ba7bcd-5643-4c3a-a351-554d57e3c8a0\") " pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd"
Nov 22 08:27:09 crc kubenswrapper[4743]: E1122 08:27:09.198886 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c is running failed: container process not found" containerID="c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c" cmd=["grpc_health_probe","-addr=:50051"]
Nov 22 08:27:09 crc kubenswrapper[4743]: E1122 08:27:09.199395 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c is running failed: container process not found" containerID="c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c" cmd=["grpc_health_probe","-addr=:50051"]
Nov 22 08:27:09 crc kubenswrapper[4743]: E1122 08:27:09.199940 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c is running failed: container process not found" containerID="c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c" cmd=["grpc_health_probe","-addr=:50051"]
Nov 22 08:27:09 crc kubenswrapper[4743]: E1122 08:27:09.199992 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-dlhns" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerName="registry-server"
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.235886 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd"
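The three "MountVolume.SetUp succeeded" entries above close out the usual volume sequence for the replacement pod marketplace-operator-79b997595-5zpxd: the reconciler first logs operationExecutor.VerifyControllerAttachedVolume for each declared volume, then MountVolume, then SetUp for the projected token, the secret, and the configmap. The sketch below is an illustrative model of that desired-state versus actual-state loop, not kubelet source; the volume names and plugin kinds are copied from the entries above, and everything else is invented for the example.

package main

import "fmt"

// Illustrative model (not kubelet code) of the reconcile loop behind the
// VerifyControllerAttachedVolume / "MountVolume.SetUp succeeded" entries.
type volume struct{ name, plugin string }

func main() {
	// Desired state: the volumes the pod spec declares (names taken from the log).
	desired := []volume{
		{"marketplace-trusted-ca", "kubernetes.io/configmap"},
		{"marketplace-operator-metrics", "kubernetes.io/secret"},
		{"kube-api-access-sj5jt", "kubernetes.io/projected"},
	}
	// Actual state: what is currently mounted on the node (empty for a new pod).
	actual := map[string]bool{}

	for _, v := range desired {
		if actual[v.name] {
			continue // already mounted, nothing to reconcile
		}
		fmt.Printf("VerifyControllerAttachedVolume started for %q (%s)\n", v.name, v.plugin)
		fmt.Printf("MountVolume.SetUp succeeded for %q\n", v.name)
		actual[v.name] = true
	}
	// The reverse pass, tearing down anything still in "actual" that is no
	// longer desired, is the UnmountVolume.TearDown and "Volume detached"
	// sequence visible further down in this journal.
}

Running the model prints the same verify-then-set-up order per volume that the journal shows for the new pod; the unmount burst at 08:27:09.95 onward is the same loop run in the opposite direction for the five deleted pods.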
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.486208 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dm8fj container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body=
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.486634 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" podUID="07f6c2e0-4230-40e0-ad71-2f652546cd38" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused"
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.675759 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5zpxd"]
Nov 22 08:27:09 crc kubenswrapper[4743]: W1122 08:27:09.690505 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87ba7bcd_5643_4c3a_a351_554d57e3c8a0.slice/crio-b94c8905973d782d0998d1b1b28bff3967dbeb9ba01e1e8a34c79ff6d23aaaed WatchSource:0}: Error finding container b94c8905973d782d0998d1b1b28bff3967dbeb9ba01e1e8a34c79ff6d23aaaed: Status 404 returned error can't find the container with id b94c8905973d782d0998d1b1b28bff3967dbeb9ba01e1e8a34c79ff6d23aaaed
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.800807 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dvccz"
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.885734 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krx5n"
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.899276 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgqk4"
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.902809 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dlhns"
Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.911350 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.953703 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-catalog-content\") pod \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.953759 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-utilities\") pod \"50a7a114-710f-439d-8c79-58a4ba712cda\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.953778 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-utilities\") pod \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.953795 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-trusted-ca\") pod \"07f6c2e0-4230-40e0-ad71-2f652546cd38\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.953831 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78xrd\" (UniqueName: \"kubernetes.io/projected/ffd1a20f-f616-4301-8c3c-546e5e3d349d-kube-api-access-78xrd\") pod \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.953855 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-operator-metrics\") pod \"07f6c2e0-4230-40e0-ad71-2f652546cd38\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.953883 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhnq9\" (UniqueName: \"kubernetes.io/projected/50a7a114-710f-439d-8c79-58a4ba712cda-kube-api-access-nhnq9\") pod \"50a7a114-710f-439d-8c79-58a4ba712cda\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.953922 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvktx\" (UniqueName: \"kubernetes.io/projected/0ac4d9cc-a76c-4061-b234-91ceaa669957-kube-api-access-gvktx\") pod \"0ac4d9cc-a76c-4061-b234-91ceaa669957\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.953964 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-catalog-content\") pod \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.954000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-catalog-content\") pod \"0ac4d9cc-a76c-4061-b234-91ceaa669957\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.954016 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpx4c\" (UniqueName: \"kubernetes.io/projected/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-kube-api-access-vpx4c\") pod \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\" (UID: \"cba69292-a0c6-4ab8-8fba-1144f4d1e88b\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.954031 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnpd2\" (UniqueName: \"kubernetes.io/projected/07f6c2e0-4230-40e0-ad71-2f652546cd38-kube-api-access-cnpd2\") pod \"07f6c2e0-4230-40e0-ad71-2f652546cd38\" (UID: \"07f6c2e0-4230-40e0-ad71-2f652546cd38\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.954079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-catalog-content\") pod \"50a7a114-710f-439d-8c79-58a4ba712cda\" (UID: \"50a7a114-710f-439d-8c79-58a4ba712cda\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.954102 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-utilities\") pod \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\" (UID: \"ffd1a20f-f616-4301-8c3c-546e5e3d349d\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.954188 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-utilities\") pod \"0ac4d9cc-a76c-4061-b234-91ceaa669957\" (UID: \"0ac4d9cc-a76c-4061-b234-91ceaa669957\") " Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.955406 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-utilities" (OuterVolumeSpecName: "utilities") pod "cba69292-a0c6-4ab8-8fba-1144f4d1e88b" (UID: "cba69292-a0c6-4ab8-8fba-1144f4d1e88b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.967283 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "07f6c2e0-4230-40e0-ad71-2f652546cd38" (UID: "07f6c2e0-4230-40e0-ad71-2f652546cd38"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.977653 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-utilities" (OuterVolumeSpecName: "utilities") pod "50a7a114-710f-439d-8c79-58a4ba712cda" (UID: "50a7a114-710f-439d-8c79-58a4ba712cda"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.979459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-utilities" (OuterVolumeSpecName: "utilities") pod "ffd1a20f-f616-4301-8c3c-546e5e3d349d" (UID: "ffd1a20f-f616-4301-8c3c-546e5e3d349d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.980285 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd1a20f-f616-4301-8c3c-546e5e3d349d-kube-api-access-78xrd" (OuterVolumeSpecName: "kube-api-access-78xrd") pod "ffd1a20f-f616-4301-8c3c-546e5e3d349d" (UID: "ffd1a20f-f616-4301-8c3c-546e5e3d349d"). InnerVolumeSpecName "kube-api-access-78xrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.984205 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "07f6c2e0-4230-40e0-ad71-2f652546cd38" (UID: "07f6c2e0-4230-40e0-ad71-2f652546cd38"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:27:09 crc kubenswrapper[4743]: I1122 08:27:09.984359 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac4d9cc-a76c-4061-b234-91ceaa669957-kube-api-access-gvktx" (OuterVolumeSpecName: "kube-api-access-gvktx") pod "0ac4d9cc-a76c-4061-b234-91ceaa669957" (UID: "0ac4d9cc-a76c-4061-b234-91ceaa669957"). InnerVolumeSpecName "kube-api-access-gvktx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:09.992620 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-utilities" (OuterVolumeSpecName: "utilities") pod "0ac4d9cc-a76c-4061-b234-91ceaa669957" (UID: "0ac4d9cc-a76c-4061-b234-91ceaa669957"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.025406 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-kube-api-access-vpx4c" (OuterVolumeSpecName: "kube-api-access-vpx4c") pod "cba69292-a0c6-4ab8-8fba-1144f4d1e88b" (UID: "cba69292-a0c6-4ab8-8fba-1144f4d1e88b"). InnerVolumeSpecName "kube-api-access-vpx4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.035413 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f6c2e0-4230-40e0-ad71-2f652546cd38-kube-api-access-cnpd2" (OuterVolumeSpecName: "kube-api-access-cnpd2") pod "07f6c2e0-4230-40e0-ad71-2f652546cd38" (UID: "07f6c2e0-4230-40e0-ad71-2f652546cd38"). InnerVolumeSpecName "kube-api-access-cnpd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.035704 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a7a114-710f-439d-8c79-58a4ba712cda-kube-api-access-nhnq9" (OuterVolumeSpecName: "kube-api-access-nhnq9") pod "50a7a114-710f-439d-8c79-58a4ba712cda" (UID: "50a7a114-710f-439d-8c79-58a4ba712cda"). InnerVolumeSpecName "kube-api-access-nhnq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.037661 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "50a7a114-710f-439d-8c79-58a4ba712cda" (UID: "50a7a114-710f-439d-8c79-58a4ba712cda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.065168 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78xrd\" (UniqueName: \"kubernetes.io/projected/ffd1a20f-f616-4301-8c3c-546e5e3d349d-kube-api-access-78xrd\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.065896 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.066000 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhnq9\" (UniqueName: \"kubernetes.io/projected/50a7a114-710f-439d-8c79-58a4ba712cda-kube-api-access-nhnq9\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.066191 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvktx\" (UniqueName: \"kubernetes.io/projected/0ac4d9cc-a76c-4061-b234-91ceaa669957-kube-api-access-gvktx\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.066324 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpx4c\" (UniqueName: \"kubernetes.io/projected/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-kube-api-access-vpx4c\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.066445 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnpd2\" (UniqueName: \"kubernetes.io/projected/07f6c2e0-4230-40e0-ad71-2f652546cd38-kube-api-access-cnpd2\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.066596 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.066720 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.066817 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.066911 4743 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/50a7a114-710f-439d-8c79-58a4ba712cda-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.066988 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.067070 4743 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/07f6c2e0-4230-40e0-ad71-2f652546cd38-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.075903 4743 generic.go:334] "Generic (PLEG): container finished" podID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerID="c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3" exitCode=0 Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.076152 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvccz" event={"ID":"0ac4d9cc-a76c-4061-b234-91ceaa669957","Type":"ContainerDied","Data":"c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.076273 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dvccz" event={"ID":"0ac4d9cc-a76c-4061-b234-91ceaa669957","Type":"ContainerDied","Data":"4e8ac2ab254185494eb31c963fac4df31558d86da7007f788477a6c029f985be"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.076420 4743 scope.go:117] "RemoveContainer" containerID="c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.076657 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dvccz" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.084491 4743 generic.go:334] "Generic (PLEG): container finished" podID="50a7a114-710f-439d-8c79-58a4ba712cda" containerID="1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845" exitCode=0 Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.084675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqk4" event={"ID":"50a7a114-710f-439d-8c79-58a4ba712cda","Type":"ContainerDied","Data":"1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.084738 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wgqk4" event={"ID":"50a7a114-710f-439d-8c79-58a4ba712cda","Type":"ContainerDied","Data":"5ed74c1f6980f84ee651657f07a21f4f75dbe6f4397f326d113b9abf7dedc841"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.084699 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wgqk4" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.085015 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffd1a20f-f616-4301-8c3c-546e5e3d349d" (UID: "ffd1a20f-f616-4301-8c3c-546e5e3d349d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.089889 4743 generic.go:334] "Generic (PLEG): container finished" podID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerID="c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c" exitCode=0 Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.090043 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhns" event={"ID":"cba69292-a0c6-4ab8-8fba-1144f4d1e88b","Type":"ContainerDied","Data":"c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.090131 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dlhns" event={"ID":"cba69292-a0c6-4ab8-8fba-1144f4d1e88b","Type":"ContainerDied","Data":"17673f772362d64fe98f44c37b0213c5e59896d03a5282f5e1fe7b8ddaabf0e1"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.090270 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dlhns" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.101501 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" event={"ID":"87ba7bcd-5643-4c3a-a351-554d57e3c8a0","Type":"ContainerStarted","Data":"82c4db49dc2c1b16cb90d30d48a68ed30ae157b22c9a3e1036b5d6165def6c93"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.101541 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" event={"ID":"87ba7bcd-5643-4c3a-a351-554d57e3c8a0","Type":"ContainerStarted","Data":"b94c8905973d782d0998d1b1b28bff3967dbeb9ba01e1e8a34c79ff6d23aaaed"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.102003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.103461 4743 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5zpxd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" start-of-body= Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.103498 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" podUID="87ba7bcd-5643-4c3a-a351-554d57e3c8a0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.55:8080/healthz\": dial tcp 10.217.0.55:8080: connect: connection refused" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.106192 4743 scope.go:117] "RemoveContainer" containerID="fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.108497 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ac4d9cc-a76c-4061-b234-91ceaa669957" (UID: "0ac4d9cc-a76c-4061-b234-91ceaa669957"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.111511 4743 generic.go:334] "Generic (PLEG): container finished" podID="07f6c2e0-4230-40e0-ad71-2f652546cd38" containerID="5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4" exitCode=0 Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.111634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" event={"ID":"07f6c2e0-4230-40e0-ad71-2f652546cd38","Type":"ContainerDied","Data":"5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.111655 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.111666 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dm8fj" event={"ID":"07f6c2e0-4230-40e0-ad71-2f652546cd38","Type":"ContainerDied","Data":"80de222ade7a1064bb751e5a3896645a34501e9fefd15d6925d827424e705fef"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.123871 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" podStartSLOduration=2.123851105 podStartE2EDuration="2.123851105s" podCreationTimestamp="2025-11-22 08:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:27:10.118018462 +0000 UTC m=+303.824379504" watchObservedRunningTime="2025-11-22 08:27:10.123851105 +0000 UTC m=+303.830212147" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.132252 4743 generic.go:334] "Generic (PLEG): container finished" podID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerID="dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255" exitCode=0 Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.132302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krx5n" event={"ID":"ffd1a20f-f616-4301-8c3c-546e5e3d349d","Type":"ContainerDied","Data":"dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.132332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krx5n" event={"ID":"ffd1a20f-f616-4301-8c3c-546e5e3d349d","Type":"ContainerDied","Data":"781e7e7569dbd368989d36674a1dd96a06d8a8e941a2edbbfae68c286998af7b"} Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.132406 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-krx5n" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.135202 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqk4"] Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.137732 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wgqk4"] Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.151100 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cba69292-a0c6-4ab8-8fba-1144f4d1e88b" (UID: "cba69292-a0c6-4ab8-8fba-1144f4d1e88b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.157776 4743 scope.go:117] "RemoveContainer" containerID="8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.158862 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dm8fj"] Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.164329 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dm8fj"] Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.171899 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffd1a20f-f616-4301-8c3c-546e5e3d349d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.171960 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba69292-a0c6-4ab8-8fba-1144f4d1e88b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.171973 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ac4d9cc-a76c-4061-b234-91ceaa669957-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.172757 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krx5n"] Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.177808 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-krx5n"] Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.180346 4743 scope.go:117] "RemoveContainer" containerID="c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.181041 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3\": container with ID starting with c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3 not found: ID does not exist" containerID="c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.181128 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3"} err="failed to get container status \"c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3\": 
rpc error: code = NotFound desc = could not find container \"c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3\": container with ID starting with c28244585833554a940fe90d98cad3464df8aeda6b052f1df2b1e75799bb2da3 not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.181448 4743 scope.go:117] "RemoveContainer" containerID="fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.182029 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5\": container with ID starting with fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5 not found: ID does not exist" containerID="fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.182108 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5"} err="failed to get container status \"fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5\": rpc error: code = NotFound desc = could not find container \"fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5\": container with ID starting with fca096fae83f6d409842b63249d9cfa0f02dcee61b537a098bb47521a663a0b5 not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.182148 4743 scope.go:117] "RemoveContainer" containerID="8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.182481 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc\": container with ID starting with 8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc not found: ID does not exist" containerID="8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.182543 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc"} err="failed to get container status \"8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc\": rpc error: code = NotFound desc = could not find container \"8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc\": container with ID starting with 8a29f3ac9120b1a3fc030ef2e6156915987e92ad846dbaf4e012abb916d999fc not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.182564 4743 scope.go:117] "RemoveContainer" containerID="1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.196762 4743 scope.go:117] "RemoveContainer" containerID="0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.211777 4743 scope.go:117] "RemoveContainer" containerID="d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.225425 4743 scope.go:117] "RemoveContainer" containerID="1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.225976 4743 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845\": container with ID starting with 1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845 not found: ID does not exist" containerID="1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.226037 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845"} err="failed to get container status \"1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845\": rpc error: code = NotFound desc = could not find container \"1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845\": container with ID starting with 1fe0c95cbeaa8b807d1233678dc24438042a21d79150562454c29a86ee2c0845 not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.226070 4743 scope.go:117] "RemoveContainer" containerID="0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.226485 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24\": container with ID starting with 0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24 not found: ID does not exist" containerID="0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.226524 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24"} err="failed to get container status \"0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24\": rpc error: code = NotFound desc = could not find container \"0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24\": container with ID starting with 0ab6bcb7ec77178bd557f72cdf8ed042d8834ce85cc50285b735949a7ab11d24 not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.226552 4743 scope.go:117] "RemoveContainer" containerID="d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.226896 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd\": container with ID starting with d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd not found: ID does not exist" containerID="d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.226927 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd"} err="failed to get container status \"d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd\": rpc error: code = NotFound desc = could not find container \"d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd\": container with ID starting with d9cb6bd310320069ca57f611e4c751e603e5683915d2613c52a0e0a2b09ff9cd not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.226962 4743 scope.go:117] "RemoveContainer" 
containerID="c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.238952 4743 scope.go:117] "RemoveContainer" containerID="e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.268087 4743 scope.go:117] "RemoveContainer" containerID="4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.281237 4743 scope.go:117] "RemoveContainer" containerID="c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.281680 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c\": container with ID starting with c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c not found: ID does not exist" containerID="c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.281713 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c"} err="failed to get container status \"c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c\": rpc error: code = NotFound desc = could not find container \"c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c\": container with ID starting with c9c73aaf5d1bf75afe74148b7ad6d725e8ce09e212a4f2bba1d69aed225c3b0c not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.281736 4743 scope.go:117] "RemoveContainer" containerID="e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.282115 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2\": container with ID starting with e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2 not found: ID does not exist" containerID="e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.282156 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2"} err="failed to get container status \"e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2\": rpc error: code = NotFound desc = could not find container \"e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2\": container with ID starting with e35e5097276a544e819001d3f692f8c7ce137b311fdb10fc8c0863e7a62c40b2 not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.282173 4743 scope.go:117] "RemoveContainer" containerID="4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.282487 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f\": container with ID starting with 4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f not found: ID does not exist" containerID="4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f" 
Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.282506 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f"} err="failed to get container status \"4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f\": rpc error: code = NotFound desc = could not find container \"4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f\": container with ID starting with 4dab127e7717d54fd4c79f81c779f608ff15bdf5412f724c39f0d1304a113e7f not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.282517 4743 scope.go:117] "RemoveContainer" containerID="5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.295975 4743 scope.go:117] "RemoveContainer" containerID="5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.296566 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4\": container with ID starting with 5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4 not found: ID does not exist" containerID="5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.296628 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4"} err="failed to get container status \"5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4\": rpc error: code = NotFound desc = could not find container \"5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4\": container with ID starting with 5016d3855a009e523b582d4fb774dc8d5636f44189779e230f2523a159942da4 not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.296647 4743 scope.go:117] "RemoveContainer" containerID="dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.310372 4743 scope.go:117] "RemoveContainer" containerID="7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.324404 4743 scope.go:117] "RemoveContainer" containerID="1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.335374 4743 scope.go:117] "RemoveContainer" containerID="dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.335874 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255\": container with ID starting with dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255 not found: ID does not exist" containerID="dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.335920 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255"} err="failed to get container status \"dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255\": rpc error: code = NotFound 
desc = could not find container \"dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255\": container with ID starting with dfb1da3926727bbf24b619e6f34c4ac5bfc7cf645f8279e030b46ee5bf055255 not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.335977 4743 scope.go:117] "RemoveContainer" containerID="7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.336317 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb\": container with ID starting with 7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb not found: ID does not exist" containerID="7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.336372 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb"} err="failed to get container status \"7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb\": rpc error: code = NotFound desc = could not find container \"7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb\": container with ID starting with 7090d8ab8d7c17c8da2733be2037e7bfa0866d7b9f5c2ac26a1421b31aff6bfb not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.336404 4743 scope.go:117] "RemoveContainer" containerID="1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48" Nov 22 08:27:10 crc kubenswrapper[4743]: E1122 08:27:10.336901 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48\": container with ID starting with 1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48 not found: ID does not exist" containerID="1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.336948 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48"} err="failed to get container status \"1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48\": rpc error: code = NotFound desc = could not find container \"1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48\": container with ID starting with 1157deb82a9cffd7c9a79dbae7ce28644d9fb911a6247e1c74dee802111c6e48 not found: ID does not exist" Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.418211 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dvccz"] Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.423785 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dvccz"] Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.431442 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dlhns"] Nov 22 08:27:10 crc kubenswrapper[4743]: I1122 08:27:10.437495 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dlhns"] Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.085904 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-jbq5n"] Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086081 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerName="extract-utilities" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086092 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerName="extract-utilities" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086101 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerName="extract-utilities" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086106 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerName="extract-utilities" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086112 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086118 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086127 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerName="extract-content" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086133 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerName="extract-content" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086141 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerName="extract-content" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086149 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerName="extract-content" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086158 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086164 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086178 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f6c2e0-4230-40e0-ad71-2f652546cd38" containerName="marketplace-operator" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086183 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f6c2e0-4230-40e0-ad71-2f652546cd38" containerName="marketplace-operator" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086190 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086196 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086202 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a7a114-710f-439d-8c79-58a4ba712cda" containerName="extract-content" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086208 4743 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="50a7a114-710f-439d-8c79-58a4ba712cda" containerName="extract-content" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086215 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerName="extract-utilities" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086221 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerName="extract-utilities" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086228 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerName="extract-content" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086233 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerName="extract-content" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086241 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a7a114-710f-439d-8c79-58a4ba712cda" containerName="extract-utilities" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086246 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a7a114-710f-439d-8c79-58a4ba712cda" containerName="extract-utilities" Nov 22 08:27:11 crc kubenswrapper[4743]: E1122 08:27:11.086255 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a7a114-710f-439d-8c79-58a4ba712cda" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086260 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a7a114-710f-439d-8c79-58a4ba712cda" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086333 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f6c2e0-4230-40e0-ad71-2f652546cd38" containerName="marketplace-operator" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086344 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086351 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086361 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.086370 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a7a114-710f-439d-8c79-58a4ba712cda" containerName="registry-server" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.087016 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jbq5n" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.088912 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.095659 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jbq5n"] Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.142943 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5zpxd" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.162050 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f6c2e0-4230-40e0-ad71-2f652546cd38" path="/var/lib/kubelet/pods/07f6c2e0-4230-40e0-ad71-2f652546cd38/volumes" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.162636 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ac4d9cc-a76c-4061-b234-91ceaa669957" path="/var/lib/kubelet/pods/0ac4d9cc-a76c-4061-b234-91ceaa669957/volumes" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.163191 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a7a114-710f-439d-8c79-58a4ba712cda" path="/var/lib/kubelet/pods/50a7a114-710f-439d-8c79-58a4ba712cda/volumes" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.164233 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba69292-a0c6-4ab8-8fba-1144f4d1e88b" path="/var/lib/kubelet/pods/cba69292-a0c6-4ab8-8fba-1144f4d1e88b/volumes" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.164927 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd1a20f-f616-4301-8c3c-546e5e3d349d" path="/var/lib/kubelet/pods/ffd1a20f-f616-4301-8c3c-546e5e3d349d/volumes" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.183763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc02866-fa76-46c8-9213-6c879aad1284-utilities\") pod \"certified-operators-jbq5n\" (UID: \"0fc02866-fa76-46c8-9213-6c879aad1284\") " pod="openshift-marketplace/certified-operators-jbq5n" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.183816 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc02866-fa76-46c8-9213-6c879aad1284-catalog-content\") pod \"certified-operators-jbq5n\" (UID: \"0fc02866-fa76-46c8-9213-6c879aad1284\") " pod="openshift-marketplace/certified-operators-jbq5n" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.183847 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjs9r\" (UniqueName: \"kubernetes.io/projected/0fc02866-fa76-46c8-9213-6c879aad1284-kube-api-access-gjs9r\") pod \"certified-operators-jbq5n\" (UID: \"0fc02866-fa76-46c8-9213-6c879aad1284\") " pod="openshift-marketplace/certified-operators-jbq5n" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.288260 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc02866-fa76-46c8-9213-6c879aad1284-utilities\") pod \"certified-operators-jbq5n\" (UID: \"0fc02866-fa76-46c8-9213-6c879aad1284\") " pod="openshift-marketplace/certified-operators-jbq5n" Nov 22 08:27:11 crc 
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.288318 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc02866-fa76-46c8-9213-6c879aad1284-catalog-content\") pod \"certified-operators-jbq5n\" (UID: \"0fc02866-fa76-46c8-9213-6c879aad1284\") " pod="openshift-marketplace/certified-operators-jbq5n"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.288349 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjs9r\" (UniqueName: \"kubernetes.io/projected/0fc02866-fa76-46c8-9213-6c879aad1284-kube-api-access-gjs9r\") pod \"certified-operators-jbq5n\" (UID: \"0fc02866-fa76-46c8-9213-6c879aad1284\") " pod="openshift-marketplace/certified-operators-jbq5n"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.288697 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mpsmp"]
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.289753 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mpsmp"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.295829 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpsmp"]
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.295984 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.296298 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fc02866-fa76-46c8-9213-6c879aad1284-utilities\") pod \"certified-operators-jbq5n\" (UID: \"0fc02866-fa76-46c8-9213-6c879aad1284\") " pod="openshift-marketplace/certified-operators-jbq5n"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.296343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fc02866-fa76-46c8-9213-6c879aad1284-catalog-content\") pod \"certified-operators-jbq5n\" (UID: \"0fc02866-fa76-46c8-9213-6c879aad1284\") " pod="openshift-marketplace/certified-operators-jbq5n"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.308514 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjs9r\" (UniqueName: \"kubernetes.io/projected/0fc02866-fa76-46c8-9213-6c879aad1284-kube-api-access-gjs9r\") pod \"certified-operators-jbq5n\" (UID: \"0fc02866-fa76-46c8-9213-6c879aad1284\") " pod="openshift-marketplace/certified-operators-jbq5n"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.389629 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l8xm\" (UniqueName: \"kubernetes.io/projected/00fed59e-401b-4b13-b307-44e90ae76dce-kube-api-access-6l8xm\") pod \"redhat-marketplace-mpsmp\" (UID: \"00fed59e-401b-4b13-b307-44e90ae76dce\") " pod="openshift-marketplace/redhat-marketplace-mpsmp"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.389696 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00fed59e-401b-4b13-b307-44e90ae76dce-catalog-content\") pod \"redhat-marketplace-mpsmp\" (UID: \"00fed59e-401b-4b13-b307-44e90ae76dce\") " pod="openshift-marketplace/redhat-marketplace-mpsmp"
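Each volume above walks the same three-step reconcile: operationExecutor.VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded. A rough Go sketch of that ordering follows, under assumed names; it is not the kubelet's actual operationExecutor API, just the phase sequence these records show.

    package main

    import "fmt"

    // phases in the order the reconciler logs them for each volume.
    var phases = []string{
        "operationExecutor.VerifyControllerAttachedVolume started",
        "operationExecutor.MountVolume started",
        "MountVolume.SetUp succeeded",
    }

    // reconcileVolumes drives every volume of a pod through the phases;
    // the volume names below are taken from the certified-operators-jbq5n pod.
    func reconcileVolumes(pod string, volumes []string) {
        for _, phase := range phases {
            for _, v := range volumes {
                fmt.Printf("%s for volume %q pod=%q\n", phase, v, pod)
            }
        }
    }

    func main() {
        reconcileVolumes("openshift-marketplace/certified-operators-jbq5n",
            []string{"utilities", "catalog-content", "kube-api-access-gjs9r"})
    }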
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.389734 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00fed59e-401b-4b13-b307-44e90ae76dce-utilities\") pod \"redhat-marketplace-mpsmp\" (UID: \"00fed59e-401b-4b13-b307-44e90ae76dce\") " pod="openshift-marketplace/redhat-marketplace-mpsmp"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.413761 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbq5n"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.490507 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00fed59e-401b-4b13-b307-44e90ae76dce-utilities\") pod \"redhat-marketplace-mpsmp\" (UID: \"00fed59e-401b-4b13-b307-44e90ae76dce\") " pod="openshift-marketplace/redhat-marketplace-mpsmp"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.490849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l8xm\" (UniqueName: \"kubernetes.io/projected/00fed59e-401b-4b13-b307-44e90ae76dce-kube-api-access-6l8xm\") pod \"redhat-marketplace-mpsmp\" (UID: \"00fed59e-401b-4b13-b307-44e90ae76dce\") " pod="openshift-marketplace/redhat-marketplace-mpsmp"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.490872 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00fed59e-401b-4b13-b307-44e90ae76dce-catalog-content\") pod \"redhat-marketplace-mpsmp\" (UID: \"00fed59e-401b-4b13-b307-44e90ae76dce\") " pod="openshift-marketplace/redhat-marketplace-mpsmp"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.491264 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00fed59e-401b-4b13-b307-44e90ae76dce-catalog-content\") pod \"redhat-marketplace-mpsmp\" (UID: \"00fed59e-401b-4b13-b307-44e90ae76dce\") " pod="openshift-marketplace/redhat-marketplace-mpsmp"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.491573 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00fed59e-401b-4b13-b307-44e90ae76dce-utilities\") pod \"redhat-marketplace-mpsmp\" (UID: \"00fed59e-401b-4b13-b307-44e90ae76dce\") " pod="openshift-marketplace/redhat-marketplace-mpsmp"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.508671 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l8xm\" (UniqueName: \"kubernetes.io/projected/00fed59e-401b-4b13-b307-44e90ae76dce-kube-api-access-6l8xm\") pod \"redhat-marketplace-mpsmp\" (UID: \"00fed59e-401b-4b13-b307-44e90ae76dce\") " pod="openshift-marketplace/redhat-marketplace-mpsmp"
Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.610916 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mpsmp" Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.621941 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jbq5n"] Nov 22 08:27:11 crc kubenswrapper[4743]: W1122 08:27:11.636322 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fc02866_fa76_46c8_9213_6c879aad1284.slice/crio-c513c3a6f57dae63734a9d867b091d39f47ac2bb477113d23bf5e44029ea6deb WatchSource:0}: Error finding container c513c3a6f57dae63734a9d867b091d39f47ac2bb477113d23bf5e44029ea6deb: Status 404 returned error can't find the container with id c513c3a6f57dae63734a9d867b091d39f47ac2bb477113d23bf5e44029ea6deb Nov 22 08:27:11 crc kubenswrapper[4743]: I1122 08:27:11.827350 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mpsmp"] Nov 22 08:27:11 crc kubenswrapper[4743]: W1122 08:27:11.912343 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00fed59e_401b_4b13_b307_44e90ae76dce.slice/crio-f61cb73bc74c71fa46c554abae9f2e61c8b06b14e4d523e7c928bc28ad737d63 WatchSource:0}: Error finding container f61cb73bc74c71fa46c554abae9f2e61c8b06b14e4d523e7c928bc28ad737d63: Status 404 returned error can't find the container with id f61cb73bc74c71fa46c554abae9f2e61c8b06b14e4d523e7c928bc28ad737d63 Nov 22 08:27:12 crc kubenswrapper[4743]: I1122 08:27:12.150050 4743 generic.go:334] "Generic (PLEG): container finished" podID="0fc02866-fa76-46c8-9213-6c879aad1284" containerID="06b9e188bfb47154f23a4329ae6b40472b8e2029dd6865c9ea583a8d38ddeada" exitCode=0 Nov 22 08:27:12 crc kubenswrapper[4743]: I1122 08:27:12.150515 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbq5n" event={"ID":"0fc02866-fa76-46c8-9213-6c879aad1284","Type":"ContainerDied","Data":"06b9e188bfb47154f23a4329ae6b40472b8e2029dd6865c9ea583a8d38ddeada"} Nov 22 08:27:12 crc kubenswrapper[4743]: I1122 08:27:12.150634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbq5n" event={"ID":"0fc02866-fa76-46c8-9213-6c879aad1284","Type":"ContainerStarted","Data":"c513c3a6f57dae63734a9d867b091d39f47ac2bb477113d23bf5e44029ea6deb"} Nov 22 08:27:12 crc kubenswrapper[4743]: I1122 08:27:12.158819 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpsmp" event={"ID":"00fed59e-401b-4b13-b307-44e90ae76dce","Type":"ContainerStarted","Data":"dc1ad4c81473c36bbddc1d8acb76ab182ea1119633d309e0f9d7c2f27ea3e1ab"} Nov 22 08:27:12 crc kubenswrapper[4743]: I1122 08:27:12.158861 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpsmp" event={"ID":"00fed59e-401b-4b13-b307-44e90ae76dce","Type":"ContainerStarted","Data":"f61cb73bc74c71fa46c554abae9f2e61c8b06b14e4d523e7c928bc28ad737d63"} Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.165089 4743 generic.go:334] "Generic (PLEG): container finished" podID="00fed59e-401b-4b13-b307-44e90ae76dce" containerID="dc1ad4c81473c36bbddc1d8acb76ab182ea1119633d309e0f9d7c2f27ea3e1ab" exitCode=0 Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.165213 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpsmp" 
event={"ID":"00fed59e-401b-4b13-b307-44e90ae76dce","Type":"ContainerDied","Data":"dc1ad4c81473c36bbddc1d8acb76ab182ea1119633d309e0f9d7c2f27ea3e1ab"} Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.486672 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hbk4w"] Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.487774 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.489553 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.497486 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hbk4w"] Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.616180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e2b63b-9379-47e8-92c8-991b9599c53c-catalog-content\") pod \"redhat-operators-hbk4w\" (UID: \"69e2b63b-9379-47e8-92c8-991b9599c53c\") " pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.616281 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e2b63b-9379-47e8-92c8-991b9599c53c-utilities\") pod \"redhat-operators-hbk4w\" (UID: \"69e2b63b-9379-47e8-92c8-991b9599c53c\") " pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.616673 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n6z8\" (UniqueName: \"kubernetes.io/projected/69e2b63b-9379-47e8-92c8-991b9599c53c-kube-api-access-2n6z8\") pod \"redhat-operators-hbk4w\" (UID: \"69e2b63b-9379-47e8-92c8-991b9599c53c\") " pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.686132 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m5ksl"] Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.688985 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.692272 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.701677 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m5ksl"] Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.718001 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n6z8\" (UniqueName: \"kubernetes.io/projected/69e2b63b-9379-47e8-92c8-991b9599c53c-kube-api-access-2n6z8\") pod \"redhat-operators-hbk4w\" (UID: \"69e2b63b-9379-47e8-92c8-991b9599c53c\") " pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.718096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e2b63b-9379-47e8-92c8-991b9599c53c-catalog-content\") pod \"redhat-operators-hbk4w\" (UID: \"69e2b63b-9379-47e8-92c8-991b9599c53c\") " pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.718321 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e2b63b-9379-47e8-92c8-991b9599c53c-utilities\") pod \"redhat-operators-hbk4w\" (UID: \"69e2b63b-9379-47e8-92c8-991b9599c53c\") " pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.718485 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69e2b63b-9379-47e8-92c8-991b9599c53c-catalog-content\") pod \"redhat-operators-hbk4w\" (UID: \"69e2b63b-9379-47e8-92c8-991b9599c53c\") " pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.718655 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69e2b63b-9379-47e8-92c8-991b9599c53c-utilities\") pod \"redhat-operators-hbk4w\" (UID: \"69e2b63b-9379-47e8-92c8-991b9599c53c\") " pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.741769 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n6z8\" (UniqueName: \"kubernetes.io/projected/69e2b63b-9379-47e8-92c8-991b9599c53c-kube-api-access-2n6z8\") pod \"redhat-operators-hbk4w\" (UID: \"69e2b63b-9379-47e8-92c8-991b9599c53c\") " pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.810365 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.819863 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74807c56-d30f-4fbd-b0ac-44c792f32b99-utilities\") pod \"community-operators-m5ksl\" (UID: \"74807c56-d30f-4fbd-b0ac-44c792f32b99\") " pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.819953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v22t9\" (UniqueName: \"kubernetes.io/projected/74807c56-d30f-4fbd-b0ac-44c792f32b99-kube-api-access-v22t9\") pod \"community-operators-m5ksl\" (UID: \"74807c56-d30f-4fbd-b0ac-44c792f32b99\") " pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.820008 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74807c56-d30f-4fbd-b0ac-44c792f32b99-catalog-content\") pod \"community-operators-m5ksl\" (UID: \"74807c56-d30f-4fbd-b0ac-44c792f32b99\") " pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.928376 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74807c56-d30f-4fbd-b0ac-44c792f32b99-catalog-content\") pod \"community-operators-m5ksl\" (UID: \"74807c56-d30f-4fbd-b0ac-44c792f32b99\") " pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.928544 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74807c56-d30f-4fbd-b0ac-44c792f32b99-utilities\") pod \"community-operators-m5ksl\" (UID: \"74807c56-d30f-4fbd-b0ac-44c792f32b99\") " pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.928641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v22t9\" (UniqueName: \"kubernetes.io/projected/74807c56-d30f-4fbd-b0ac-44c792f32b99-kube-api-access-v22t9\") pod \"community-operators-m5ksl\" (UID: \"74807c56-d30f-4fbd-b0ac-44c792f32b99\") " pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.929604 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74807c56-d30f-4fbd-b0ac-44c792f32b99-catalog-content\") pod \"community-operators-m5ksl\" (UID: \"74807c56-d30f-4fbd-b0ac-44c792f32b99\") " pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.929725 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74807c56-d30f-4fbd-b0ac-44c792f32b99-utilities\") pod \"community-operators-m5ksl\" (UID: \"74807c56-d30f-4fbd-b0ac-44c792f32b99\") " pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:13 crc kubenswrapper[4743]: I1122 08:27:13.961720 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v22t9\" (UniqueName: \"kubernetes.io/projected/74807c56-d30f-4fbd-b0ac-44c792f32b99-kube-api-access-v22t9\") pod 
\"community-operators-m5ksl\" (UID: \"74807c56-d30f-4fbd-b0ac-44c792f32b99\") " pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:14 crc kubenswrapper[4743]: I1122 08:27:14.008032 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:14 crc kubenswrapper[4743]: I1122 08:27:14.280009 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m5ksl"] Nov 22 08:27:14 crc kubenswrapper[4743]: W1122 08:27:14.285435 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74807c56_d30f_4fbd_b0ac_44c792f32b99.slice/crio-678d99c047052012a73146ba74a06ec6bb05fa9d5df2bd0755d916afa9eb8adc WatchSource:0}: Error finding container 678d99c047052012a73146ba74a06ec6bb05fa9d5df2bd0755d916afa9eb8adc: Status 404 returned error can't find the container with id 678d99c047052012a73146ba74a06ec6bb05fa9d5df2bd0755d916afa9eb8adc Nov 22 08:27:14 crc kubenswrapper[4743]: I1122 08:27:14.528632 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hbk4w"] Nov 22 08:27:15 crc kubenswrapper[4743]: I1122 08:27:15.186889 4743 generic.go:334] "Generic (PLEG): container finished" podID="69e2b63b-9379-47e8-92c8-991b9599c53c" containerID="124796b743cffad9650d654da5cd3bbe652e00b5adc298bbc1cdbd7727528cc5" exitCode=0 Nov 22 08:27:15 crc kubenswrapper[4743]: I1122 08:27:15.187006 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbk4w" event={"ID":"69e2b63b-9379-47e8-92c8-991b9599c53c","Type":"ContainerDied","Data":"124796b743cffad9650d654da5cd3bbe652e00b5adc298bbc1cdbd7727528cc5"} Nov 22 08:27:15 crc kubenswrapper[4743]: I1122 08:27:15.187052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbk4w" event={"ID":"69e2b63b-9379-47e8-92c8-991b9599c53c","Type":"ContainerStarted","Data":"be9d992f8b74930b06b10a9a25dd60db766483c110c4b0381211bb0be6a01ec5"} Nov 22 08:27:15 crc kubenswrapper[4743]: I1122 08:27:15.189553 4743 generic.go:334] "Generic (PLEG): container finished" podID="0fc02866-fa76-46c8-9213-6c879aad1284" containerID="93c7d1cb1e9501bf8493f7ff11a9ad9a1874e8b82ce2516093e47172e6bd8e17" exitCode=0 Nov 22 08:27:15 crc kubenswrapper[4743]: I1122 08:27:15.189600 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbq5n" event={"ID":"0fc02866-fa76-46c8-9213-6c879aad1284","Type":"ContainerDied","Data":"93c7d1cb1e9501bf8493f7ff11a9ad9a1874e8b82ce2516093e47172e6bd8e17"} Nov 22 08:27:15 crc kubenswrapper[4743]: I1122 08:27:15.192403 4743 generic.go:334] "Generic (PLEG): container finished" podID="74807c56-d30f-4fbd-b0ac-44c792f32b99" containerID="1f051771860b898d8dd4f435c733ba02757dfffacd25a62aa0025069df73b26b" exitCode=0 Nov 22 08:27:15 crc kubenswrapper[4743]: I1122 08:27:15.192484 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5ksl" event={"ID":"74807c56-d30f-4fbd-b0ac-44c792f32b99","Type":"ContainerDied","Data":"1f051771860b898d8dd4f435c733ba02757dfffacd25a62aa0025069df73b26b"} Nov 22 08:27:15 crc kubenswrapper[4743]: I1122 08:27:15.192772 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5ksl" 
event={"ID":"74807c56-d30f-4fbd-b0ac-44c792f32b99","Type":"ContainerStarted","Data":"678d99c047052012a73146ba74a06ec6bb05fa9d5df2bd0755d916afa9eb8adc"} Nov 22 08:27:16 crc kubenswrapper[4743]: I1122 08:27:16.200033 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbq5n" event={"ID":"0fc02866-fa76-46c8-9213-6c879aad1284","Type":"ContainerStarted","Data":"87d702ad291c4bb91300d5f8d897aca58f3b5e5b1bfe0467cbcc243dfc73dd0e"} Nov 22 08:27:16 crc kubenswrapper[4743]: I1122 08:27:16.202837 4743 generic.go:334] "Generic (PLEG): container finished" podID="00fed59e-401b-4b13-b307-44e90ae76dce" containerID="47531e8d6a2da83aa50b39695579803890204f98430c59b24c101f37f0a1bc0d" exitCode=0 Nov 22 08:27:16 crc kubenswrapper[4743]: I1122 08:27:16.202877 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpsmp" event={"ID":"00fed59e-401b-4b13-b307-44e90ae76dce","Type":"ContainerDied","Data":"47531e8d6a2da83aa50b39695579803890204f98430c59b24c101f37f0a1bc0d"} Nov 22 08:27:16 crc kubenswrapper[4743]: I1122 08:27:16.219676 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jbq5n" podStartSLOduration=1.481338736 podStartE2EDuration="5.219434478s" podCreationTimestamp="2025-11-22 08:27:11 +0000 UTC" firstStartedPulling="2025-11-22 08:27:12.152856208 +0000 UTC m=+305.859217260" lastFinishedPulling="2025-11-22 08:27:15.89095195 +0000 UTC m=+309.597313002" observedRunningTime="2025-11-22 08:27:16.217014056 +0000 UTC m=+309.923375108" watchObservedRunningTime="2025-11-22 08:27:16.219434478 +0000 UTC m=+309.925795530" Nov 22 08:27:17 crc kubenswrapper[4743]: I1122 08:27:17.209524 4743 generic.go:334] "Generic (PLEG): container finished" podID="74807c56-d30f-4fbd-b0ac-44c792f32b99" containerID="347ff6e866a4d090262917eeba2a040d26ed8c1db56ba472fa0c0a2bb2e8b494" exitCode=0 Nov 22 08:27:17 crc kubenswrapper[4743]: I1122 08:27:17.209634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5ksl" event={"ID":"74807c56-d30f-4fbd-b0ac-44c792f32b99","Type":"ContainerDied","Data":"347ff6e866a4d090262917eeba2a040d26ed8c1db56ba472fa0c0a2bb2e8b494"} Nov 22 08:27:17 crc kubenswrapper[4743]: I1122 08:27:17.212284 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbk4w" event={"ID":"69e2b63b-9379-47e8-92c8-991b9599c53c","Type":"ContainerStarted","Data":"a66ab9e5944d74caf45eb94130c51023763357f36c1990e2c3d52b1cf695315e"} Nov 22 08:27:18 crc kubenswrapper[4743]: I1122 08:27:18.219761 4743 generic.go:334] "Generic (PLEG): container finished" podID="69e2b63b-9379-47e8-92c8-991b9599c53c" containerID="a66ab9e5944d74caf45eb94130c51023763357f36c1990e2c3d52b1cf695315e" exitCode=0 Nov 22 08:27:18 crc kubenswrapper[4743]: I1122 08:27:18.219856 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbk4w" event={"ID":"69e2b63b-9379-47e8-92c8-991b9599c53c","Type":"ContainerDied","Data":"a66ab9e5944d74caf45eb94130c51023763357f36c1990e2c3d52b1cf695315e"} Nov 22 08:27:19 crc kubenswrapper[4743]: I1122 08:27:19.226140 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m5ksl" event={"ID":"74807c56-d30f-4fbd-b0ac-44c792f32b99","Type":"ContainerStarted","Data":"97a470a2b872bd9d434b45fc7fd72fd97e67bb60c37a3f0cfd151abd5e544ceb"} Nov 22 08:27:19 crc kubenswrapper[4743]: I1122 08:27:19.229178 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mpsmp" event={"ID":"00fed59e-401b-4b13-b307-44e90ae76dce","Type":"ContainerStarted","Data":"24888c79a13c2c03682e1f96aad0af31429b5f9870240ac44964779096838710"} Nov 22 08:27:19 crc kubenswrapper[4743]: I1122 08:27:19.244178 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m5ksl" podStartSLOduration=2.8551200359999998 podStartE2EDuration="6.244164051s" podCreationTimestamp="2025-11-22 08:27:13 +0000 UTC" firstStartedPulling="2025-11-22 08:27:15.193560564 +0000 UTC m=+308.899921616" lastFinishedPulling="2025-11-22 08:27:18.582604579 +0000 UTC m=+312.288965631" observedRunningTime="2025-11-22 08:27:19.243426069 +0000 UTC m=+312.949787121" watchObservedRunningTime="2025-11-22 08:27:19.244164051 +0000 UTC m=+312.950525103" Nov 22 08:27:19 crc kubenswrapper[4743]: I1122 08:27:19.263643 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mpsmp" podStartSLOduration=4.338947584 podStartE2EDuration="8.263621728s" podCreationTimestamp="2025-11-22 08:27:11 +0000 UTC" firstStartedPulling="2025-11-22 08:27:13.422049265 +0000 UTC m=+307.128410317" lastFinishedPulling="2025-11-22 08:27:17.346723409 +0000 UTC m=+311.053084461" observedRunningTime="2025-11-22 08:27:19.263093042 +0000 UTC m=+312.969454094" watchObservedRunningTime="2025-11-22 08:27:19.263621728 +0000 UTC m=+312.969982780" Nov 22 08:27:20 crc kubenswrapper[4743]: I1122 08:27:20.237553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hbk4w" event={"ID":"69e2b63b-9379-47e8-92c8-991b9599c53c","Type":"ContainerStarted","Data":"8cfca97634bbcbf913d62fd52aa55f8801faef78b78704c787246451eecc4131"} Nov 22 08:27:20 crc kubenswrapper[4743]: I1122 08:27:20.257413 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hbk4w" podStartSLOduration=2.504400469 podStartE2EDuration="7.25739512s" podCreationTimestamp="2025-11-22 08:27:13 +0000 UTC" firstStartedPulling="2025-11-22 08:27:15.188875935 +0000 UTC m=+308.895236987" lastFinishedPulling="2025-11-22 08:27:19.941870586 +0000 UTC m=+313.648231638" observedRunningTime="2025-11-22 08:27:20.255782902 +0000 UTC m=+313.962143964" watchObservedRunningTime="2025-11-22 08:27:20.25739512 +0000 UTC m=+313.963756172" Nov 22 08:27:21 crc kubenswrapper[4743]: I1122 08:27:21.415360 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jbq5n" Nov 22 08:27:21 crc kubenswrapper[4743]: I1122 08:27:21.415414 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jbq5n" Nov 22 08:27:21 crc kubenswrapper[4743]: I1122 08:27:21.451473 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jbq5n" Nov 22 08:27:21 crc kubenswrapper[4743]: I1122 08:27:21.612044 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mpsmp" Nov 22 08:27:21 crc kubenswrapper[4743]: I1122 08:27:21.612272 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mpsmp" Nov 22 08:27:21 crc kubenswrapper[4743]: I1122 08:27:21.669817 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-mpsmp" Nov 22 08:27:22 crc kubenswrapper[4743]: I1122 08:27:22.285331 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jbq5n" Nov 22 08:27:23 crc kubenswrapper[4743]: I1122 08:27:23.288242 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mpsmp" Nov 22 08:27:23 crc kubenswrapper[4743]: I1122 08:27:23.810683 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:23 crc kubenswrapper[4743]: I1122 08:27:23.811018 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:24 crc kubenswrapper[4743]: I1122 08:27:24.009311 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:24 crc kubenswrapper[4743]: I1122 08:27:24.009375 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:24 crc kubenswrapper[4743]: I1122 08:27:24.049907 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:24 crc kubenswrapper[4743]: I1122 08:27:24.291406 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m5ksl" Nov 22 08:27:24 crc kubenswrapper[4743]: I1122 08:27:24.847743 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hbk4w" podUID="69e2b63b-9379-47e8-92c8-991b9599c53c" containerName="registry-server" probeResult="failure" output=< Nov 22 08:27:24 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 08:27:24 crc kubenswrapper[4743]: > Nov 22 08:27:33 crc kubenswrapper[4743]: I1122 08:27:33.858865 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:27:33 crc kubenswrapper[4743]: I1122 08:27:33.898256 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hbk4w" Nov 22 08:28:01 crc kubenswrapper[4743]: I1122 08:28:01.241240 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:28:01 crc kubenswrapper[4743]: I1122 08:28:01.242251 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:28:31 crc kubenswrapper[4743]: I1122 08:28:31.240865 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:28:31 crc kubenswrapper[4743]: I1122 08:28:31.242801 4743 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:29:01 crc kubenswrapper[4743]: I1122 08:29:01.241489 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:29:01 crc kubenswrapper[4743]: I1122 08:29:01.242186 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:29:01 crc kubenswrapper[4743]: I1122 08:29:01.242248 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:29:01 crc kubenswrapper[4743]: I1122 08:29:01.243037 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b533a206a803330d2ceef7d65a61f0f3c6cea04ce828dfcfa9e6b7b17b4b0e77"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 08:29:01 crc kubenswrapper[4743]: I1122 08:29:01.243113 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://b533a206a803330d2ceef7d65a61f0f3c6cea04ce828dfcfa9e6b7b17b4b0e77" gracePeriod=600 Nov 22 08:29:01 crc kubenswrapper[4743]: I1122 08:29:01.773762 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="b533a206a803330d2ceef7d65a61f0f3c6cea04ce828dfcfa9e6b7b17b4b0e77" exitCode=0 Nov 22 08:29:01 crc kubenswrapper[4743]: I1122 08:29:01.773812 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"b533a206a803330d2ceef7d65a61f0f3c6cea04ce828dfcfa9e6b7b17b4b0e77"} Nov 22 08:29:01 crc kubenswrapper[4743]: I1122 08:29:01.773850 4743 scope.go:117] "RemoveContainer" containerID="cf7cb3f7e0c5ffe2ef861ac8b4bd130a67f9335e42af1690c4d4821f101db202" Nov 22 08:29:02 crc kubenswrapper[4743]: I1122 08:29:02.781946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"100169dfb49bd3feeeb68539e2f7fbcfba3bc0cdede84e267e0c32d1e1bb126a"} Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.434655 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-65clg"] Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.436535 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.445438 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-65clg"] Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.565861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.565918 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d39cca67-066d-44a8-99c0-3dda564f9788-bound-sa-token\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.565964 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d39cca67-066d-44a8-99c0-3dda564f9788-trusted-ca\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.565986 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d39cca67-066d-44a8-99c0-3dda564f9788-registry-tls\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.566007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d39cca67-066d-44a8-99c0-3dda564f9788-installation-pull-secrets\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.566023 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlvkq\" (UniqueName: \"kubernetes.io/projected/d39cca67-066d-44a8-99c0-3dda564f9788-kube-api-access-mlvkq\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.566049 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d39cca67-066d-44a8-99c0-3dda564f9788-ca-trust-extracted\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.566071 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/d39cca67-066d-44a8-99c0-3dda564f9788-registry-certificates\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.593987 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.667097 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d39cca67-066d-44a8-99c0-3dda564f9788-trusted-ca\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.667152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d39cca67-066d-44a8-99c0-3dda564f9788-registry-tls\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.667183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d39cca67-066d-44a8-99c0-3dda564f9788-installation-pull-secrets\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.667209 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlvkq\" (UniqueName: \"kubernetes.io/projected/d39cca67-066d-44a8-99c0-3dda564f9788-kube-api-access-mlvkq\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.667242 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d39cca67-066d-44a8-99c0-3dda564f9788-ca-trust-extracted\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.667271 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d39cca67-066d-44a8-99c0-3dda564f9788-registry-certificates\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.667315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d39cca67-066d-44a8-99c0-3dda564f9788-bound-sa-token\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.668050 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d39cca67-066d-44a8-99c0-3dda564f9788-ca-trust-extracted\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.669058 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d39cca67-066d-44a8-99c0-3dda564f9788-registry-certificates\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.669418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d39cca67-066d-44a8-99c0-3dda564f9788-trusted-ca\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.674238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d39cca67-066d-44a8-99c0-3dda564f9788-installation-pull-secrets\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.676106 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d39cca67-066d-44a8-99c0-3dda564f9788-registry-tls\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.684427 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d39cca67-066d-44a8-99c0-3dda564f9788-bound-sa-token\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.692354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlvkq\" (UniqueName: \"kubernetes.io/projected/d39cca67-066d-44a8-99c0-3dda564f9788-kube-api-access-mlvkq\") pod \"image-registry-66df7c8f76-65clg\" (UID: \"d39cca67-066d-44a8-99c0-3dda564f9788\") " pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:29:59 crc kubenswrapper[4743]: I1122 08:29:59.820024 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.006055 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-65clg"] Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.088079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-65clg" event={"ID":"d39cca67-066d-44a8-99c0-3dda564f9788","Type":"ContainerStarted","Data":"6e820b1e005e53c82a617f148e79e2985e52db54f0e1bb02fd1286ecd2a5dcbd"} Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.125429 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn"] Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.127027 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.128609 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.129478 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.133368 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn"] Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.274754 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d19b2aa7-4ff4-470f-a036-2202acdf6490-secret-volume\") pod \"collect-profiles-29396670-j4fcn\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.274928 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqdp\" (UniqueName: \"kubernetes.io/projected/d19b2aa7-4ff4-470f-a036-2202acdf6490-kube-api-access-ckqdp\") pod \"collect-profiles-29396670-j4fcn\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.274956 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d19b2aa7-4ff4-470f-a036-2202acdf6490-config-volume\") pod \"collect-profiles-29396670-j4fcn\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.377399 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d19b2aa7-4ff4-470f-a036-2202acdf6490-secret-volume\") pod \"collect-profiles-29396670-j4fcn\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.377634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqdp\" (UniqueName: 
\"kubernetes.io/projected/d19b2aa7-4ff4-470f-a036-2202acdf6490-kube-api-access-ckqdp\") pod \"collect-profiles-29396670-j4fcn\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.377676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d19b2aa7-4ff4-470f-a036-2202acdf6490-config-volume\") pod \"collect-profiles-29396670-j4fcn\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.380173 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d19b2aa7-4ff4-470f-a036-2202acdf6490-config-volume\") pod \"collect-profiles-29396670-j4fcn\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.384914 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d19b2aa7-4ff4-470f-a036-2202acdf6490-secret-volume\") pod \"collect-profiles-29396670-j4fcn\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.399879 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqdp\" (UniqueName: \"kubernetes.io/projected/d19b2aa7-4ff4-470f-a036-2202acdf6490-kube-api-access-ckqdp\") pod \"collect-profiles-29396670-j4fcn\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.441233 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:00 crc kubenswrapper[4743]: I1122 08:30:00.662000 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn"] Nov 22 08:30:01 crc kubenswrapper[4743]: I1122 08:30:01.095506 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-65clg" event={"ID":"d39cca67-066d-44a8-99c0-3dda564f9788","Type":"ContainerStarted","Data":"771eb132eed3c527deb8ccc2fa93dc5c743ccf50f8e1949973b0fe0fa0524127"} Nov 22 08:30:01 crc kubenswrapper[4743]: I1122 08:30:01.095809 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:30:01 crc kubenswrapper[4743]: I1122 08:30:01.096897 4743 generic.go:334] "Generic (PLEG): container finished" podID="d19b2aa7-4ff4-470f-a036-2202acdf6490" containerID="8b539228e577d032f81498c065bc69bdcd50279e13b9d83ee584daec8475b55a" exitCode=0 Nov 22 08:30:01 crc kubenswrapper[4743]: I1122 08:30:01.096924 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" event={"ID":"d19b2aa7-4ff4-470f-a036-2202acdf6490","Type":"ContainerDied","Data":"8b539228e577d032f81498c065bc69bdcd50279e13b9d83ee584daec8475b55a"} Nov 22 08:30:01 crc kubenswrapper[4743]: I1122 08:30:01.096941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" event={"ID":"d19b2aa7-4ff4-470f-a036-2202acdf6490","Type":"ContainerStarted","Data":"7d6b025d6de308445b5833e2d1eb4d880aea71dc41399dddc2def5b6f6a886f8"} Nov 22 08:30:01 crc kubenswrapper[4743]: I1122 08:30:01.121380 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-65clg" podStartSLOduration=2.121359243 podStartE2EDuration="2.121359243s" podCreationTimestamp="2025-11-22 08:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:30:01.118919943 +0000 UTC m=+474.825281015" watchObservedRunningTime="2025-11-22 08:30:01.121359243 +0000 UTC m=+474.827720305" Nov 22 08:30:02 crc kubenswrapper[4743]: I1122 08:30:02.308807 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:02 crc kubenswrapper[4743]: I1122 08:30:02.405782 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d19b2aa7-4ff4-470f-a036-2202acdf6490-config-volume\") pod \"d19b2aa7-4ff4-470f-a036-2202acdf6490\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " Nov 22 08:30:02 crc kubenswrapper[4743]: I1122 08:30:02.405878 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckqdp\" (UniqueName: \"kubernetes.io/projected/d19b2aa7-4ff4-470f-a036-2202acdf6490-kube-api-access-ckqdp\") pod \"d19b2aa7-4ff4-470f-a036-2202acdf6490\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " Nov 22 08:30:02 crc kubenswrapper[4743]: I1122 08:30:02.405924 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d19b2aa7-4ff4-470f-a036-2202acdf6490-secret-volume\") pod \"d19b2aa7-4ff4-470f-a036-2202acdf6490\" (UID: \"d19b2aa7-4ff4-470f-a036-2202acdf6490\") " Nov 22 08:30:02 crc kubenswrapper[4743]: I1122 08:30:02.406637 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d19b2aa7-4ff4-470f-a036-2202acdf6490-config-volume" (OuterVolumeSpecName: "config-volume") pod "d19b2aa7-4ff4-470f-a036-2202acdf6490" (UID: "d19b2aa7-4ff4-470f-a036-2202acdf6490"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:30:02 crc kubenswrapper[4743]: I1122 08:30:02.410925 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19b2aa7-4ff4-470f-a036-2202acdf6490-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d19b2aa7-4ff4-470f-a036-2202acdf6490" (UID: "d19b2aa7-4ff4-470f-a036-2202acdf6490"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:30:02 crc kubenswrapper[4743]: I1122 08:30:02.411015 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19b2aa7-4ff4-470f-a036-2202acdf6490-kube-api-access-ckqdp" (OuterVolumeSpecName: "kube-api-access-ckqdp") pod "d19b2aa7-4ff4-470f-a036-2202acdf6490" (UID: "d19b2aa7-4ff4-470f-a036-2202acdf6490"). InnerVolumeSpecName "kube-api-access-ckqdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:30:02 crc kubenswrapper[4743]: I1122 08:30:02.506610 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d19b2aa7-4ff4-470f-a036-2202acdf6490-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 08:30:02 crc kubenswrapper[4743]: I1122 08:30:02.506639 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckqdp\" (UniqueName: \"kubernetes.io/projected/d19b2aa7-4ff4-470f-a036-2202acdf6490-kube-api-access-ckqdp\") on node \"crc\" DevicePath \"\"" Nov 22 08:30:02 crc kubenswrapper[4743]: I1122 08:30:02.506650 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d19b2aa7-4ff4-470f-a036-2202acdf6490-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 08:30:03 crc kubenswrapper[4743]: I1122 08:30:03.108961 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" event={"ID":"d19b2aa7-4ff4-470f-a036-2202acdf6490","Type":"ContainerDied","Data":"7d6b025d6de308445b5833e2d1eb4d880aea71dc41399dddc2def5b6f6a886f8"} Nov 22 08:30:03 crc kubenswrapper[4743]: I1122 08:30:03.109335 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6b025d6de308445b5833e2d1eb4d880aea71dc41399dddc2def5b6f6a886f8" Nov 22 08:30:03 crc kubenswrapper[4743]: I1122 08:30:03.109039 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn" Nov 22 08:30:19 crc kubenswrapper[4743]: I1122 08:30:19.824890 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-65clg" Nov 22 08:30:19 crc kubenswrapper[4743]: I1122 08:30:19.863040 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-592fz"] Nov 22 08:30:44 crc kubenswrapper[4743]: I1122 08:30:44.902506 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" podUID="63019b95-c8f5-4782-85ba-def26be394f0" containerName="registry" containerID="cri-o://393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807" gracePeriod=30 Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.240287 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.341970 4743 generic.go:334] "Generic (PLEG): container finished" podID="63019b95-c8f5-4782-85ba-def26be394f0" containerID="393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807" exitCode=0 Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.342079 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.342069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" event={"ID":"63019b95-c8f5-4782-85ba-def26be394f0","Type":"ContainerDied","Data":"393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807"} Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.342148 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-592fz" event={"ID":"63019b95-c8f5-4782-85ba-def26be394f0","Type":"ContainerDied","Data":"d11eb4e4e74c9184f354a4d245e061236d7cc26ac1bf6fc64a44b4ddcfe5643d"} Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.342181 4743 scope.go:117] "RemoveContainer" containerID="393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.358406 4743 scope.go:117] "RemoveContainer" containerID="393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807" Nov 22 08:30:45 crc kubenswrapper[4743]: E1122 08:30:45.358936 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807\": container with ID starting with 393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807 not found: ID does not exist" containerID="393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.358968 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807"} err="failed to get container status \"393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807\": rpc error: code = NotFound desc = could not find container \"393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807\": container with ID starting with 393ede644d1cbabd114ce628a3850b0104b1164d445c21c05c65cf344ba6d807 not found: ID does not exist" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.420878 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwh82\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-kube-api-access-kwh82\") pod \"63019b95-c8f5-4782-85ba-def26be394f0\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.421096 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"63019b95-c8f5-4782-85ba-def26be394f0\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.421178 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63019b95-c8f5-4782-85ba-def26be394f0-ca-trust-extracted\") pod \"63019b95-c8f5-4782-85ba-def26be394f0\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.421219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-registry-tls\") pod 
\"63019b95-c8f5-4782-85ba-def26be394f0\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.421270 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-registry-certificates\") pod \"63019b95-c8f5-4782-85ba-def26be394f0\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.421305 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-trusted-ca\") pod \"63019b95-c8f5-4782-85ba-def26be394f0\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.421328 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63019b95-c8f5-4782-85ba-def26be394f0-installation-pull-secrets\") pod \"63019b95-c8f5-4782-85ba-def26be394f0\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.421385 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-bound-sa-token\") pod \"63019b95-c8f5-4782-85ba-def26be394f0\" (UID: \"63019b95-c8f5-4782-85ba-def26be394f0\") " Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.422841 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "63019b95-c8f5-4782-85ba-def26be394f0" (UID: "63019b95-c8f5-4782-85ba-def26be394f0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.423004 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "63019b95-c8f5-4782-85ba-def26be394f0" (UID: "63019b95-c8f5-4782-85ba-def26be394f0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.428051 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "63019b95-c8f5-4782-85ba-def26be394f0" (UID: "63019b95-c8f5-4782-85ba-def26be394f0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.429206 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63019b95-c8f5-4782-85ba-def26be394f0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "63019b95-c8f5-4782-85ba-def26be394f0" (UID: "63019b95-c8f5-4782-85ba-def26be394f0"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.430105 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "63019b95-c8f5-4782-85ba-def26be394f0" (UID: "63019b95-c8f5-4782-85ba-def26be394f0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.437952 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "63019b95-c8f5-4782-85ba-def26be394f0" (UID: "63019b95-c8f5-4782-85ba-def26be394f0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.438759 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-kube-api-access-kwh82" (OuterVolumeSpecName: "kube-api-access-kwh82") pod "63019b95-c8f5-4782-85ba-def26be394f0" (UID: "63019b95-c8f5-4782-85ba-def26be394f0"). InnerVolumeSpecName "kube-api-access-kwh82". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.438928 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63019b95-c8f5-4782-85ba-def26be394f0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "63019b95-c8f5-4782-85ba-def26be394f0" (UID: "63019b95-c8f5-4782-85ba-def26be394f0"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.522639 4743 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/63019b95-c8f5-4782-85ba-def26be394f0-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.522710 4743 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-registry-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.522720 4743 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-registry-certificates\") on node \"crc\" DevicePath \"\"" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.522730 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/63019b95-c8f5-4782-85ba-def26be394f0-trusted-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.522739 4743 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/63019b95-c8f5-4782-85ba-def26be394f0-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.522748 4743 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-bound-sa-token\") on node \"crc\" DevicePath \"\"" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.522757 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwh82\" (UniqueName: \"kubernetes.io/projected/63019b95-c8f5-4782-85ba-def26be394f0-kube-api-access-kwh82\") on node \"crc\" DevicePath \"\"" Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.670238 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-592fz"] Nov 22 08:30:45 crc kubenswrapper[4743]: I1122 08:30:45.673688 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-592fz"] Nov 22 08:30:47 crc kubenswrapper[4743]: I1122 08:30:47.158217 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63019b95-c8f5-4782-85ba-def26be394f0" path="/var/lib/kubelet/pods/63019b95-c8f5-4782-85ba-def26be394f0/volumes" Nov 22 08:31:31 crc kubenswrapper[4743]: I1122 08:31:31.241607 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:31:31 crc kubenswrapper[4743]: I1122 08:31:31.242239 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:32:01 crc kubenswrapper[4743]: I1122 08:32:01.241445 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:32:01 crc kubenswrapper[4743]: I1122 08:32:01.242298 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:32:31 crc kubenswrapper[4743]: I1122 08:32:31.241705 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:32:31 crc kubenswrapper[4743]: I1122 08:32:31.242339 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:32:31 crc kubenswrapper[4743]: I1122 08:32:31.242392 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:32:31 crc kubenswrapper[4743]: I1122 08:32:31.243072 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"100169dfb49bd3feeeb68539e2f7fbcfba3bc0cdede84e267e0c32d1e1bb126a"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 08:32:31 crc kubenswrapper[4743]: I1122 08:32:31.243158 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://100169dfb49bd3feeeb68539e2f7fbcfba3bc0cdede84e267e0c32d1e1bb126a" gracePeriod=600 Nov 22 08:32:31 crc kubenswrapper[4743]: I1122 08:32:31.930218 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="100169dfb49bd3feeeb68539e2f7fbcfba3bc0cdede84e267e0c32d1e1bb126a" exitCode=0 Nov 22 08:32:31 crc kubenswrapper[4743]: I1122 08:32:31.930272 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"100169dfb49bd3feeeb68539e2f7fbcfba3bc0cdede84e267e0c32d1e1bb126a"} Nov 22 08:32:31 crc kubenswrapper[4743]: I1122 08:32:31.930307 4743 scope.go:117] "RemoveContainer" containerID="b533a206a803330d2ceef7d65a61f0f3c6cea04ce828dfcfa9e6b7b17b4b0e77" Nov 22 08:32:32 crc kubenswrapper[4743]: I1122 08:32:32.939310 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"37c97b2e81f6751d68ee6b6779e8d74c99c6cc572fe2c42aebcabd8215411f9d"} Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.417321 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-p8glw"] Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.418522 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovn-acl-logging" containerID="cri-o://36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809" gracePeriod=30 Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.418525 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="nbdb" containerID="cri-o://432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f" gracePeriod=30 Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.418627 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovn-controller" containerID="cri-o://db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d" gracePeriod=30 Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.418589 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="northd" containerID="cri-o://8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12" gracePeriod=30 Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.418712 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="sbdb" containerID="cri-o://bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366" gracePeriod=30 Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.418677 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="kube-rbac-proxy-node" containerID="cri-o://ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382" gracePeriod=30 Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.418875 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce" gracePeriod=30 Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.454044 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" containerID="cri-o://2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472" gracePeriod=30 Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.813770 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/3.log" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.817339 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovn-acl-logging/0.log" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.817936 4743 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovn-controller/0.log" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.818667 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.878539 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-np2wf"] Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.878983 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="northd" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879001 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="northd" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879012 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879020 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879027 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19b2aa7-4ff4-470f-a036-2202acdf6490" containerName="collect-profiles" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879035 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19b2aa7-4ff4-470f-a036-2202acdf6490" containerName="collect-profiles" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879049 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="kubecfg-setup" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879055 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="kubecfg-setup" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879065 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovn-acl-logging" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879072 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovn-acl-logging" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879084 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879090 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879097 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="kube-rbac-proxy-node" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879105 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="kube-rbac-proxy-node" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879121 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879127 
4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879142 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovn-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879149 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovn-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879159 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="nbdb" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879166 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="nbdb" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879175 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="sbdb" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879183 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="sbdb" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879194 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879201 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879211 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63019b95-c8f5-4782-85ba-def26be394f0" containerName="registry" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879217 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="63019b95-c8f5-4782-85ba-def26be394f0" containerName="registry" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879337 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovn-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879351 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879360 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="northd" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879368 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="63019b95-c8f5-4782-85ba-def26be394f0" containerName="registry" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879377 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879384 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19b2aa7-4ff4-470f-a036-2202acdf6490" containerName="collect-profiles" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879393 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879402 4743 
memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="kube-rbac-proxy-ovn-metrics" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879412 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="nbdb" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879419 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="sbdb" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879426 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovn-acl-logging" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879434 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879443 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="kube-rbac-proxy-node" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879547 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879555 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879712 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: E1122 08:33:57.879885 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.879897 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerName="ovnkube-controller" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.883558 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963209 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-slash\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963284 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-systemd\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-config\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963339 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-slash" (OuterVolumeSpecName: "host-slash") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963365 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-netd\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963438 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-env-overrides\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963497 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35d29494-f9cd-46b7-be04-d7a848a72fee-ovn-node-metrics-cert\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-log-socket\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-bin\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963437 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-netd" 
(OuterVolumeSpecName: "host-cni-netd") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963614 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-systemd-units\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963636 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-ovn-kubernetes\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963656 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-etc-openvswitch\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963675 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-netns\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963697 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-var-lib-openvswitch\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963729 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-script-lib\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963663 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-log-socket" (OuterVolumeSpecName: "log-socket") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963709 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963741 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963763 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-ovn\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963789 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963807 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-var-lib-cni-networks-ovn-kubernetes\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963820 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963830 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-kubelet\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963858 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-openvswitch\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963900 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkdp6\" (UniqueName: \"kubernetes.io/projected/35d29494-f9cd-46b7-be04-d7a848a72fee-kube-api-access-vkdp6\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963933 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-node-log\") pod \"35d29494-f9cd-46b7-be04-d7a848a72fee\" (UID: \"35d29494-f9cd-46b7-be04-d7a848a72fee\") " Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963946 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.963986 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964006 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964024 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964027 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964046 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964113 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-node-log" (OuterVolumeSpecName: "node-log") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964192 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964754 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964774 4743 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-netd\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964783 4743 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-env-overrides\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964794 4743 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-log-socket\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964804 4743 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-cni-bin\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964827 4743 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-systemd-units\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964837 4743 reconciler_common.go:293] "Volume detached for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964848 4743 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964856 4743 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-run-netns\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964865 4743 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964874 4743 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/35d29494-f9cd-46b7-be04-d7a848a72fee-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964883 4743 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964893 4743 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964901 4743 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-kubelet\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964910 4743 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-openvswitch\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964918 4743 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-node-log\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.964927 4743 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-host-slash\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.970325 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35d29494-f9cd-46b7-be04-d7a848a72fee-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.979335 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35d29494-f9cd-46b7-be04-d7a848a72fee-kube-api-access-vkdp6" (OuterVolumeSpecName: "kube-api-access-vkdp6") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "kube-api-access-vkdp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:33:57 crc kubenswrapper[4743]: I1122 08:33:57.982396 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "35d29494-f9cd-46b7-be04-d7a848a72fee" (UID: "35d29494-f9cd-46b7-be04-d7a848a72fee"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.065968 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-run-systemd\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066078 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-run-netns\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-cni-netd\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e103be83-3221-46ab-bd64-1ef6f8bc7950-ovnkube-config\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-cni-bin\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-systemd-units\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066305 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4lbq\" (UniqueName: 
\"kubernetes.io/projected/e103be83-3221-46ab-bd64-1ef6f8bc7950-kube-api-access-z4lbq\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-slash\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-run-ovn\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066530 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-var-lib-openvswitch\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066635 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-log-socket\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066667 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-run-ovn-kubernetes\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066701 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e103be83-3221-46ab-bd64-1ef6f8bc7950-ovnkube-script-lib\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066746 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e103be83-3221-46ab-bd64-1ef6f8bc7950-env-overrides\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066829 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-etc-openvswitch\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066872 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e103be83-3221-46ab-bd64-1ef6f8bc7950-ovn-node-metrics-cert\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066907 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-run-openvswitch\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066952 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.066993 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-kubelet\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.067027 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-node-log\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.067101 4743 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/35d29494-f9cd-46b7-be04-d7a848a72fee-run-systemd\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.067124 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/35d29494-f9cd-46b7-be04-d7a848a72fee-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.067146 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkdp6\" (UniqueName: \"kubernetes.io/projected/35d29494-f9cd-46b7-be04-d7a848a72fee-kube-api-access-vkdp6\") on node \"crc\" DevicePath \"\"" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.168899 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-etc-openvswitch\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.168982 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e103be83-3221-46ab-bd64-1ef6f8bc7950-ovn-node-metrics-cert\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169051 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-etc-openvswitch\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-run-openvswitch\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169119 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-run-openvswitch\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-kubelet\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169171 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169192 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-node-log\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169235 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-run-systemd\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169266 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-cni-netd\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169284 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-run-netns\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc 
kubenswrapper[4743]: I1122 08:33:58.169313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e103be83-3221-46ab-bd64-1ef6f8bc7950-ovnkube-config\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169329 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-cni-bin\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-systemd-units\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169375 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4lbq\" (UniqueName: \"kubernetes.io/projected/e103be83-3221-46ab-bd64-1ef6f8bc7950-kube-api-access-z4lbq\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169393 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-run-ovn\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169395 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-cni-netd\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-slash\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-slash\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-run-netns\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169543 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-var-lib-openvswitch\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-log-socket\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169636 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-run-ovn-kubernetes\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169681 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e103be83-3221-46ab-bd64-1ef6f8bc7950-ovnkube-script-lib\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e103be83-3221-46ab-bd64-1ef6f8bc7950-env-overrides\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.169901 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-log-socket\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170210 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-kubelet\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170242 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170279 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-run-systemd\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170297 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/e103be83-3221-46ab-bd64-1ef6f8bc7950-env-overrides\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170333 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-run-ovn-kubernetes\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170337 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-host-cni-bin\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170332 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-var-lib-openvswitch\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-node-log\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170374 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-run-ovn\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170384 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e103be83-3221-46ab-bd64-1ef6f8bc7950-systemd-units\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.170971 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e103be83-3221-46ab-bd64-1ef6f8bc7950-ovnkube-script-lib\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.171485 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e103be83-3221-46ab-bd64-1ef6f8bc7950-ovnkube-config\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.174845 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e103be83-3221-46ab-bd64-1ef6f8bc7950-ovn-node-metrics-cert\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.189447 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4lbq\" (UniqueName: \"kubernetes.io/projected/e103be83-3221-46ab-bd64-1ef6f8bc7950-kube-api-access-z4lbq\") pod \"ovnkube-node-np2wf\" (UID: \"e103be83-3221-46ab-bd64-1ef6f8bc7950\") " pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.200018 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.414274 4743 generic.go:334] "Generic (PLEG): container finished" podID="e103be83-3221-46ab-bd64-1ef6f8bc7950" containerID="24e30a0c049ec81b1d8ab434b3708786c7e47e5a00ae4f2ceadda8da55e13027" exitCode=0 Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.414365 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" event={"ID":"e103be83-3221-46ab-bd64-1ef6f8bc7950","Type":"ContainerDied","Data":"24e30a0c049ec81b1d8ab434b3708786c7e47e5a00ae4f2ceadda8da55e13027"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.414946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" event={"ID":"e103be83-3221-46ab-bd64-1ef6f8bc7950","Type":"ContainerStarted","Data":"0c9f7d95432924d2d2ab8ac28eed2abb8ab0d2051f26afc9c39cd0ba0fce95ce"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.423894 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cbpnf_a1de4b47-eed0-431f-a7a9-a944ce8791bd/kube-multus/2.log" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.427089 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cbpnf_a1de4b47-eed0-431f-a7a9-a944ce8791bd/kube-multus/1.log" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.427200 4743 generic.go:334] "Generic (PLEG): container finished" podID="a1de4b47-eed0-431f-a7a9-a944ce8791bd" containerID="0902f0de82c42e6e1f407e388c2f9fa1998f6da031b9008f5ddf06d7a8fda6ee" exitCode=2 Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.427236 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cbpnf" event={"ID":"a1de4b47-eed0-431f-a7a9-a944ce8791bd","Type":"ContainerDied","Data":"0902f0de82c42e6e1f407e388c2f9fa1998f6da031b9008f5ddf06d7a8fda6ee"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.427306 4743 scope.go:117] "RemoveContainer" containerID="d69946320b9db9ab4b35189efa616676972d055242413aa37ff4b1ae5b7af00d" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.428488 4743 scope.go:117] "RemoveContainer" containerID="0902f0de82c42e6e1f407e388c2f9fa1998f6da031b9008f5ddf06d7a8fda6ee" Nov 22 08:33:58 crc kubenswrapper[4743]: E1122 08:33:58.428834 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-cbpnf_openshift-multus(a1de4b47-eed0-431f-a7a9-a944ce8791bd)\"" pod="openshift-multus/multus-cbpnf" podUID="a1de4b47-eed0-431f-a7a9-a944ce8791bd" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.435306 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovnkube-controller/3.log" Nov 22 08:33:58 crc kubenswrapper[4743]: 
I1122 08:33:58.442983 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovn-acl-logging/0.log" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.445398 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p8glw_35d29494-f9cd-46b7-be04-d7a848a72fee/ovn-controller/0.log" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447621 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472" exitCode=0 Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447692 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366" exitCode=0 Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447709 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f" exitCode=0 Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447749 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12" exitCode=0 Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447763 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce" exitCode=0 Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447779 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382" exitCode=0 Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447792 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809" exitCode=143 Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447832 4743 generic.go:334] "Generic (PLEG): container finished" podID="35d29494-f9cd-46b7-be04-d7a848a72fee" containerID="db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d" exitCode=143 Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447876 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447927 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.447967 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448017 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448040 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448087 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448111 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448166 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448187 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448198 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448207 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448217 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448255 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448266 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448280 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448290 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448301 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448359 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448371 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448381 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448416 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448429 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448439 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448450 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448460 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448470 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448506 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448540 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448600 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448615 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448627 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448638 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448649 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448660 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448699 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448710 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448722 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448740 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p8glw" event={"ID":"35d29494-f9cd-46b7-be04-d7a848a72fee","Type":"ContainerDied","Data":"1eafb21df4fb93347557916ab6b8edc7a938e0ad4966be779b032072746d9792"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448783 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448796 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 
08:33:58.448806 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448816 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448827 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448865 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448876 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448886 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448914 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.448925 4743 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4"} Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.475340 4743 scope.go:117] "RemoveContainer" containerID="2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.502464 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p8glw"] Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.505595 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-p8glw"] Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.506083 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.525136 4743 scope.go:117] "RemoveContainer" containerID="bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.543756 4743 scope.go:117] "RemoveContainer" containerID="432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.559359 4743 scope.go:117] "RemoveContainer" containerID="8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.593393 4743 scope.go:117] "RemoveContainer" containerID="52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.624881 4743 scope.go:117] "RemoveContainer" containerID="ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382" Nov 22 08:33:58 
crc kubenswrapper[4743]: I1122 08:33:58.655132 4743 scope.go:117] "RemoveContainer" containerID="36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.672477 4743 scope.go:117] "RemoveContainer" containerID="db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.698065 4743 scope.go:117] "RemoveContainer" containerID="0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.711208 4743 scope.go:117] "RemoveContainer" containerID="2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472" Nov 22 08:33:58 crc kubenswrapper[4743]: E1122 08:33:58.711915 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472\": container with ID starting with 2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472 not found: ID does not exist" containerID="2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.711959 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472"} err="failed to get container status \"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472\": rpc error: code = NotFound desc = could not find container \"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472\": container with ID starting with 2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.711989 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:33:58 crc kubenswrapper[4743]: E1122 08:33:58.712403 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e\": container with ID starting with 836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e not found: ID does not exist" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.712444 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e"} err="failed to get container status \"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e\": rpc error: code = NotFound desc = could not find container \"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e\": container with ID starting with 836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.712488 4743 scope.go:117] "RemoveContainer" containerID="bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366" Nov 22 08:33:58 crc kubenswrapper[4743]: E1122 08:33:58.713044 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\": container with ID starting with bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366 not found: ID does not 
exist" containerID="bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.713080 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366"} err="failed to get container status \"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\": rpc error: code = NotFound desc = could not find container \"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\": container with ID starting with bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.713097 4743 scope.go:117] "RemoveContainer" containerID="432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f" Nov 22 08:33:58 crc kubenswrapper[4743]: E1122 08:33:58.713393 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\": container with ID starting with 432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f not found: ID does not exist" containerID="432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.713425 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f"} err="failed to get container status \"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\": rpc error: code = NotFound desc = could not find container \"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\": container with ID starting with 432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.713448 4743 scope.go:117] "RemoveContainer" containerID="8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12" Nov 22 08:33:58 crc kubenswrapper[4743]: E1122 08:33:58.713685 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\": container with ID starting with 8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12 not found: ID does not exist" containerID="8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.713718 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12"} err="failed to get container status \"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\": rpc error: code = NotFound desc = could not find container \"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\": container with ID starting with 8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.713739 4743 scope.go:117] "RemoveContainer" containerID="52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce" Nov 22 08:33:58 crc kubenswrapper[4743]: E1122 08:33:58.714235 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\": container with ID starting with 52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce not found: ID does not exist" containerID="52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.714257 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce"} err="failed to get container status \"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\": rpc error: code = NotFound desc = could not find container \"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\": container with ID starting with 52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.714271 4743 scope.go:117] "RemoveContainer" containerID="ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382" Nov 22 08:33:58 crc kubenswrapper[4743]: E1122 08:33:58.714815 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\": container with ID starting with ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382 not found: ID does not exist" containerID="ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.714838 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382"} err="failed to get container status \"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\": rpc error: code = NotFound desc = could not find container \"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\": container with ID starting with ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.714851 4743 scope.go:117] "RemoveContainer" containerID="36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809" Nov 22 08:33:58 crc kubenswrapper[4743]: E1122 08:33:58.715048 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\": container with ID starting with 36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809 not found: ID does not exist" containerID="36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.715065 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809"} err="failed to get container status \"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\": rpc error: code = NotFound desc = could not find container \"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\": container with ID starting with 36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.715078 4743 scope.go:117] "RemoveContainer" containerID="db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d" Nov 22 08:33:58 crc 
kubenswrapper[4743]: E1122 08:33:58.715326 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\": container with ID starting with db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d not found: ID does not exist" containerID="db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.715350 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d"} err="failed to get container status \"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\": rpc error: code = NotFound desc = could not find container \"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\": container with ID starting with db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.715366 4743 scope.go:117] "RemoveContainer" containerID="0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4" Nov 22 08:33:58 crc kubenswrapper[4743]: E1122 08:33:58.715616 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\": container with ID starting with 0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4 not found: ID does not exist" containerID="0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.715644 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4"} err="failed to get container status \"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\": rpc error: code = NotFound desc = could not find container \"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\": container with ID starting with 0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.715665 4743 scope.go:117] "RemoveContainer" containerID="2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.715919 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472"} err="failed to get container status \"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472\": rpc error: code = NotFound desc = could not find container \"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472\": container with ID starting with 2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.715945 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.716334 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e"} err="failed to get container status 
\"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e\": rpc error: code = NotFound desc = could not find container \"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e\": container with ID starting with 836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.716365 4743 scope.go:117] "RemoveContainer" containerID="bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.716635 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366"} err="failed to get container status \"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\": rpc error: code = NotFound desc = could not find container \"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\": container with ID starting with bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.716654 4743 scope.go:117] "RemoveContainer" containerID="432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.717295 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f"} err="failed to get container status \"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\": rpc error: code = NotFound desc = could not find container \"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\": container with ID starting with 432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.717315 4743 scope.go:117] "RemoveContainer" containerID="8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.717549 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12"} err="failed to get container status \"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\": rpc error: code = NotFound desc = could not find container \"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\": container with ID starting with 8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.717612 4743 scope.go:117] "RemoveContainer" containerID="52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.717884 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce"} err="failed to get container status \"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\": rpc error: code = NotFound desc = could not find container \"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\": container with ID starting with 52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.717908 4743 scope.go:117] "RemoveContainer" 
containerID="ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.718162 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382"} err="failed to get container status \"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\": rpc error: code = NotFound desc = could not find container \"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\": container with ID starting with ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.718185 4743 scope.go:117] "RemoveContainer" containerID="36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.718498 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809"} err="failed to get container status \"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\": rpc error: code = NotFound desc = could not find container \"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\": container with ID starting with 36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.718519 4743 scope.go:117] "RemoveContainer" containerID="db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.718834 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d"} err="failed to get container status \"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\": rpc error: code = NotFound desc = could not find container \"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\": container with ID starting with db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.718853 4743 scope.go:117] "RemoveContainer" containerID="0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.719133 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4"} err="failed to get container status \"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\": rpc error: code = NotFound desc = could not find container \"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\": container with ID starting with 0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.719155 4743 scope.go:117] "RemoveContainer" containerID="2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.719460 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472"} err="failed to get container status \"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472\": rpc error: code = NotFound desc = could not find 
container \"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472\": container with ID starting with 2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.719483 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.719707 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e"} err="failed to get container status \"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e\": rpc error: code = NotFound desc = could not find container \"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e\": container with ID starting with 836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.719727 4743 scope.go:117] "RemoveContainer" containerID="bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.719945 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366"} err="failed to get container status \"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\": rpc error: code = NotFound desc = could not find container \"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\": container with ID starting with bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.719961 4743 scope.go:117] "RemoveContainer" containerID="432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.720191 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f"} err="failed to get container status \"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\": rpc error: code = NotFound desc = could not find container \"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\": container with ID starting with 432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.720213 4743 scope.go:117] "RemoveContainer" containerID="8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.720456 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12"} err="failed to get container status \"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\": rpc error: code = NotFound desc = could not find container \"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\": container with ID starting with 8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.720477 4743 scope.go:117] "RemoveContainer" containerID="52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.720714 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce"} err="failed to get container status \"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\": rpc error: code = NotFound desc = could not find container \"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\": container with ID starting with 52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.720738 4743 scope.go:117] "RemoveContainer" containerID="ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.721061 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382"} err="failed to get container status \"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\": rpc error: code = NotFound desc = could not find container \"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\": container with ID starting with ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.721082 4743 scope.go:117] "RemoveContainer" containerID="36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.721298 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809"} err="failed to get container status \"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\": rpc error: code = NotFound desc = could not find container \"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\": container with ID starting with 36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.721319 4743 scope.go:117] "RemoveContainer" containerID="db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.721559 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d"} err="failed to get container status \"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\": rpc error: code = NotFound desc = could not find container \"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\": container with ID starting with db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.721592 4743 scope.go:117] "RemoveContainer" containerID="0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.721792 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4"} err="failed to get container status \"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\": rpc error: code = NotFound desc = could not find container \"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\": container with ID starting with 
0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.721811 4743 scope.go:117] "RemoveContainer" containerID="2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.722013 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472"} err="failed to get container status \"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472\": rpc error: code = NotFound desc = could not find container \"2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472\": container with ID starting with 2bfb6ec52532c283dbcc171ee951ec42f6d0e2a9cefc34f2660c1d56994c8472 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.722043 4743 scope.go:117] "RemoveContainer" containerID="836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.722234 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e"} err="failed to get container status \"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e\": rpc error: code = NotFound desc = could not find container \"836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e\": container with ID starting with 836ebea3b2bc5ff03ad7ec1cdac334a7793d438f0c2d442a69ab82d066c6ec9e not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.722257 4743 scope.go:117] "RemoveContainer" containerID="bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.722466 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366"} err="failed to get container status \"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\": rpc error: code = NotFound desc = could not find container \"bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366\": container with ID starting with bf70fe8aa6ad23c01dc185cc6df6aeffb2cb2f9b94b33f958be493e65bf1d366 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.722490 4743 scope.go:117] "RemoveContainer" containerID="432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.722751 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f"} err="failed to get container status \"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\": rpc error: code = NotFound desc = could not find container \"432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f\": container with ID starting with 432459597c9e9eca35c907f03e076694b17cb91c9ea3c535edbf6147491fac8f not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.722773 4743 scope.go:117] "RemoveContainer" containerID="8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.722993 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12"} err="failed to get container status \"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\": rpc error: code = NotFound desc = could not find container \"8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12\": container with ID starting with 8b382d0dc700e61158fff0affd5dad65cb9e2a19ae0b1b5e39377f997ec4fe12 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.723019 4743 scope.go:117] "RemoveContainer" containerID="52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.723229 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce"} err="failed to get container status \"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\": rpc error: code = NotFound desc = could not find container \"52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce\": container with ID starting with 52b713ec41f56309c850b67f54163236e78437e953280363f9167338528d0cce not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.723251 4743 scope.go:117] "RemoveContainer" containerID="ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.723509 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382"} err="failed to get container status \"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\": rpc error: code = NotFound desc = could not find container \"ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382\": container with ID starting with ded2f5b6123faaa4100536aa4c5f6c2236f779913da985754349d6d6e5daa382 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.723537 4743 scope.go:117] "RemoveContainer" containerID="36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.723788 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809"} err="failed to get container status \"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\": rpc error: code = NotFound desc = could not find container \"36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809\": container with ID starting with 36b3e3e779cc512beafa6c1fe3b08001793b5d7d3437650be3c56dac335b2809 not found: ID does not exist" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.723809 4743 scope.go:117] "RemoveContainer" containerID="db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.724019 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d"} err="failed to get container status \"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\": rpc error: code = NotFound desc = could not find container \"db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d\": container with ID starting with db66af5b9ee77d38480b61ffba80a7b0b690912ec5c33341bdf36ceaef22ed9d not found: ID does not exist" Nov 
22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.724041 4743 scope.go:117] "RemoveContainer" containerID="0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4" Nov 22 08:33:58 crc kubenswrapper[4743]: I1122 08:33:58.724260 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4"} err="failed to get container status \"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\": rpc error: code = NotFound desc = could not find container \"0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4\": container with ID starting with 0e4a36e1b6fe9ff849a5614813da8b13727bbe4f4f4ec782d4092756f31549f4 not found: ID does not exist" Nov 22 08:33:59 crc kubenswrapper[4743]: I1122 08:33:59.158880 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35d29494-f9cd-46b7-be04-d7a848a72fee" path="/var/lib/kubelet/pods/35d29494-f9cd-46b7-be04-d7a848a72fee/volumes" Nov 22 08:33:59 crc kubenswrapper[4743]: I1122 08:33:59.457970 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" event={"ID":"e103be83-3221-46ab-bd64-1ef6f8bc7950","Type":"ContainerStarted","Data":"bec33f9f4dc4b61ad6cf8542611ac6a15bac7c848e14ed89bfcfdcbcf3a04bb1"} Nov 22 08:33:59 crc kubenswrapper[4743]: I1122 08:33:59.458083 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" event={"ID":"e103be83-3221-46ab-bd64-1ef6f8bc7950","Type":"ContainerStarted","Data":"b41927a0b0b71b32412e91313a559805bc47120308d0d15da40a69ab5b3d951f"} Nov 22 08:33:59 crc kubenswrapper[4743]: I1122 08:33:59.458098 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" event={"ID":"e103be83-3221-46ab-bd64-1ef6f8bc7950","Type":"ContainerStarted","Data":"efc63ffa4b6fb4225844f8f3d053065d723f3764a5610cf7ff949226bb4e93ba"} Nov 22 08:33:59 crc kubenswrapper[4743]: I1122 08:33:59.458109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" event={"ID":"e103be83-3221-46ab-bd64-1ef6f8bc7950","Type":"ContainerStarted","Data":"29bb3cfb3ad409837ffd74917ec86c0297cdb4c9c1eba5b4561e9e4083a78fc9"} Nov 22 08:33:59 crc kubenswrapper[4743]: I1122 08:33:59.458118 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" event={"ID":"e103be83-3221-46ab-bd64-1ef6f8bc7950","Type":"ContainerStarted","Data":"b109ae128a6123fbc54c991048f391c62dc5ce6e5d05e68d3cafb9093c8c1912"} Nov 22 08:33:59 crc kubenswrapper[4743]: I1122 08:33:59.458127 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" event={"ID":"e103be83-3221-46ab-bd64-1ef6f8bc7950","Type":"ContainerStarted","Data":"811937cbb3287d3e04f9a5f2999e727b1fac5050335481940e936955d8c0553b"} Nov 22 08:33:59 crc kubenswrapper[4743]: I1122 08:33:59.459315 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cbpnf_a1de4b47-eed0-431f-a7a9-a944ce8791bd/kube-multus/2.log" Nov 22 08:34:01 crc kubenswrapper[4743]: I1122 08:34:01.476455 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" event={"ID":"e103be83-3221-46ab-bd64-1ef6f8bc7950","Type":"ContainerStarted","Data":"d7fddda7d3415d0c0c43927f8f6b7e7b254c4d8700c38d2dceec277ac774d4b8"} Nov 22 08:34:04 crc kubenswrapper[4743]: I1122 08:34:04.493534 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" event={"ID":"e103be83-3221-46ab-bd64-1ef6f8bc7950","Type":"ContainerStarted","Data":"c5e57bab1dae4a2c504a03a8b230c52df32ce611c5b9fa075ac9a1328c79f59d"} Nov 22 08:34:04 crc kubenswrapper[4743]: I1122 08:34:04.494133 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:34:04 crc kubenswrapper[4743]: I1122 08:34:04.520660 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" podStartSLOduration=7.520644322 podStartE2EDuration="7.520644322s" podCreationTimestamp="2025-11-22 08:33:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:34:04.51954482 +0000 UTC m=+718.225905872" watchObservedRunningTime="2025-11-22 08:34:04.520644322 +0000 UTC m=+718.227005374" Nov 22 08:34:04 crc kubenswrapper[4743]: I1122 08:34:04.533806 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.497978 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.499172 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.531437 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.755533 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-5njjw"] Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.756541 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.758416 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.759683 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.759839 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.762023 4743 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-4ltb4" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.772677 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5njjw"] Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.859643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b57e70a1-7869-42b1-bdaf-2b7199c5e463-crc-storage\") pod \"crc-storage-crc-5njjw\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.859810 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b57e70a1-7869-42b1-bdaf-2b7199c5e463-node-mnt\") pod \"crc-storage-crc-5njjw\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.859885 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drw9h\" (UniqueName: \"kubernetes.io/projected/b57e70a1-7869-42b1-bdaf-2b7199c5e463-kube-api-access-drw9h\") pod \"crc-storage-crc-5njjw\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.961630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b57e70a1-7869-42b1-bdaf-2b7199c5e463-crc-storage\") pod \"crc-storage-crc-5njjw\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.961760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b57e70a1-7869-42b1-bdaf-2b7199c5e463-node-mnt\") pod \"crc-storage-crc-5njjw\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.961804 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drw9h\" (UniqueName: \"kubernetes.io/projected/b57e70a1-7869-42b1-bdaf-2b7199c5e463-kube-api-access-drw9h\") pod \"crc-storage-crc-5njjw\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.962156 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b57e70a1-7869-42b1-bdaf-2b7199c5e463-node-mnt\") pod \"crc-storage-crc-5njjw\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " 
pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.962512 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b57e70a1-7869-42b1-bdaf-2b7199c5e463-crc-storage\") pod \"crc-storage-crc-5njjw\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:05 crc kubenswrapper[4743]: I1122 08:34:05.986218 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drw9h\" (UniqueName: \"kubernetes.io/projected/b57e70a1-7869-42b1-bdaf-2b7199c5e463-kube-api-access-drw9h\") pod \"crc-storage-crc-5njjw\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:06 crc kubenswrapper[4743]: I1122 08:34:06.079554 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:06 crc kubenswrapper[4743]: E1122 08:34:06.106647 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(22ce7af7bc79097b58aaa758105b47526909c66ad7e74b1e5e93c6ecea32e25f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 08:34:06 crc kubenswrapper[4743]: E1122 08:34:06.106725 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(22ce7af7bc79097b58aaa758105b47526909c66ad7e74b1e5e93c6ecea32e25f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:06 crc kubenswrapper[4743]: E1122 08:34:06.106752 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(22ce7af7bc79097b58aaa758105b47526909c66ad7e74b1e5e93c6ecea32e25f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:06 crc kubenswrapper[4743]: E1122 08:34:06.106814 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-5njjw_crc-storage(b57e70a1-7869-42b1-bdaf-2b7199c5e463)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-5njjw_crc-storage(b57e70a1-7869-42b1-bdaf-2b7199c5e463)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(22ce7af7bc79097b58aaa758105b47526909c66ad7e74b1e5e93c6ecea32e25f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-5njjw" podUID="b57e70a1-7869-42b1-bdaf-2b7199c5e463" Nov 22 08:34:06 crc kubenswrapper[4743]: I1122 08:34:06.503078 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:06 crc kubenswrapper[4743]: I1122 08:34:06.503614 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:06 crc kubenswrapper[4743]: E1122 08:34:06.523013 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(f1fdbe69bbe3194728563616036845e0f7b97b47b5bcda75c03104ef44e789b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 08:34:06 crc kubenswrapper[4743]: E1122 08:34:06.523150 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(f1fdbe69bbe3194728563616036845e0f7b97b47b5bcda75c03104ef44e789b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:06 crc kubenswrapper[4743]: E1122 08:34:06.523204 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(f1fdbe69bbe3194728563616036845e0f7b97b47b5bcda75c03104ef44e789b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:06 crc kubenswrapper[4743]: E1122 08:34:06.523290 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-5njjw_crc-storage(b57e70a1-7869-42b1-bdaf-2b7199c5e463)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-5njjw_crc-storage(b57e70a1-7869-42b1-bdaf-2b7199c5e463)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(f1fdbe69bbe3194728563616036845e0f7b97b47b5bcda75c03104ef44e789b4): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-5njjw" podUID="b57e70a1-7869-42b1-bdaf-2b7199c5e463" Nov 22 08:34:10 crc kubenswrapper[4743]: I1122 08:34:10.150945 4743 scope.go:117] "RemoveContainer" containerID="0902f0de82c42e6e1f407e388c2f9fa1998f6da031b9008f5ddf06d7a8fda6ee" Nov 22 08:34:10 crc kubenswrapper[4743]: E1122 08:34:10.151125 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-cbpnf_openshift-multus(a1de4b47-eed0-431f-a7a9-a944ce8791bd)\"" pod="openshift-multus/multus-cbpnf" podUID="a1de4b47-eed0-431f-a7a9-a944ce8791bd" Nov 22 08:34:21 crc kubenswrapper[4743]: I1122 08:34:21.150504 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:21 crc kubenswrapper[4743]: I1122 08:34:21.151412 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:21 crc kubenswrapper[4743]: E1122 08:34:21.172782 4743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(5f6fa3635680142bf7a10bb0b9d63b745083f9266c2f73a8ec3dae4f4175c5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Nov 22 08:34:21 crc kubenswrapper[4743]: E1122 08:34:21.172849 4743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(5f6fa3635680142bf7a10bb0b9d63b745083f9266c2f73a8ec3dae4f4175c5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:21 crc kubenswrapper[4743]: E1122 08:34:21.172871 4743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(5f6fa3635680142bf7a10bb0b9d63b745083f9266c2f73a8ec3dae4f4175c5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:21 crc kubenswrapper[4743]: E1122 08:34:21.172913 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-5njjw_crc-storage(b57e70a1-7869-42b1-bdaf-2b7199c5e463)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-5njjw_crc-storage(b57e70a1-7869-42b1-bdaf-2b7199c5e463)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-5njjw_crc-storage_b57e70a1-7869-42b1-bdaf-2b7199c5e463_0(5f6fa3635680142bf7a10bb0b9d63b745083f9266c2f73a8ec3dae4f4175c5f3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-5njjw" podUID="b57e70a1-7869-42b1-bdaf-2b7199c5e463" Nov 22 08:34:22 crc kubenswrapper[4743]: I1122 08:34:22.152147 4743 scope.go:117] "RemoveContainer" containerID="0902f0de82c42e6e1f407e388c2f9fa1998f6da031b9008f5ddf06d7a8fda6ee" Nov 22 08:34:22 crc kubenswrapper[4743]: I1122 08:34:22.608056 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cbpnf_a1de4b47-eed0-431f-a7a9-a944ce8791bd/kube-multus/2.log" Nov 22 08:34:22 crc kubenswrapper[4743]: I1122 08:34:22.608133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cbpnf" event={"ID":"a1de4b47-eed0-431f-a7a9-a944ce8791bd","Type":"ContainerStarted","Data":"375cee9b6e4ddc375cef88c091244ef1b12db31831680c1f9af8bde1203382ec"} Nov 22 08:34:28 crc kubenswrapper[4743]: I1122 08:34:28.230161 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-np2wf" Nov 22 08:34:35 crc kubenswrapper[4743]: I1122 08:34:35.151642 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:35 crc kubenswrapper[4743]: I1122 08:34:35.153197 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:35 crc kubenswrapper[4743]: I1122 08:34:35.352251 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-5njjw"] Nov 22 08:34:35 crc kubenswrapper[4743]: I1122 08:34:35.363007 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 08:34:35 crc kubenswrapper[4743]: I1122 08:34:35.678310 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5njjw" event={"ID":"b57e70a1-7869-42b1-bdaf-2b7199c5e463","Type":"ContainerStarted","Data":"b82893e6e0c6aa887a9b42e22bc395541b7772dcf1cbe1c246c3bca4de846cf2"} Nov 22 08:34:37 crc kubenswrapper[4743]: I1122 08:34:37.691243 4743 generic.go:334] "Generic (PLEG): container finished" podID="b57e70a1-7869-42b1-bdaf-2b7199c5e463" containerID="50f7684c7231686be26e388276494b4a329a83b9d59dc01f91ca9a1d238c0115" exitCode=0 Nov 22 08:34:37 crc kubenswrapper[4743]: I1122 08:34:37.691376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5njjw" event={"ID":"b57e70a1-7869-42b1-bdaf-2b7199c5e463","Type":"ContainerDied","Data":"50f7684c7231686be26e388276494b4a329a83b9d59dc01f91ca9a1d238c0115"} Nov 22 08:34:38 crc kubenswrapper[4743]: I1122 08:34:38.969664 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.120190 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drw9h\" (UniqueName: \"kubernetes.io/projected/b57e70a1-7869-42b1-bdaf-2b7199c5e463-kube-api-access-drw9h\") pod \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.120387 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b57e70a1-7869-42b1-bdaf-2b7199c5e463-crc-storage\") pod \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.120517 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b57e70a1-7869-42b1-bdaf-2b7199c5e463-node-mnt\") pod \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\" (UID: \"b57e70a1-7869-42b1-bdaf-2b7199c5e463\") " Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.120718 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b57e70a1-7869-42b1-bdaf-2b7199c5e463-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "b57e70a1-7869-42b1-bdaf-2b7199c5e463" (UID: "b57e70a1-7869-42b1-bdaf-2b7199c5e463"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.121197 4743 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/b57e70a1-7869-42b1-bdaf-2b7199c5e463-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.128804 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b57e70a1-7869-42b1-bdaf-2b7199c5e463-kube-api-access-drw9h" (OuterVolumeSpecName: "kube-api-access-drw9h") pod "b57e70a1-7869-42b1-bdaf-2b7199c5e463" (UID: "b57e70a1-7869-42b1-bdaf-2b7199c5e463"). 
InnerVolumeSpecName "kube-api-access-drw9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.146697 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b57e70a1-7869-42b1-bdaf-2b7199c5e463-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "b57e70a1-7869-42b1-bdaf-2b7199c5e463" (UID: "b57e70a1-7869-42b1-bdaf-2b7199c5e463"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.221915 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drw9h\" (UniqueName: \"kubernetes.io/projected/b57e70a1-7869-42b1-bdaf-2b7199c5e463-kube-api-access-drw9h\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.221951 4743 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/b57e70a1-7869-42b1-bdaf-2b7199c5e463-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.719232 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-5njjw" event={"ID":"b57e70a1-7869-42b1-bdaf-2b7199c5e463","Type":"ContainerDied","Data":"b82893e6e0c6aa887a9b42e22bc395541b7772dcf1cbe1c246c3bca4de846cf2"} Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.719298 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b82893e6e0c6aa887a9b42e22bc395541b7772dcf1cbe1c246c3bca4de846cf2" Nov 22 08:34:39 crc kubenswrapper[4743]: I1122 08:34:39.719307 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-5njjw" Nov 22 08:34:45 crc kubenswrapper[4743]: I1122 08:34:45.819648 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8c5mq"] Nov 22 08:34:45 crc kubenswrapper[4743]: I1122 08:34:45.820453 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" podUID="71f63f00-6812-4f35-ba1e-d1ea01a27a19" containerName="controller-manager" containerID="cri-o://a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119" gracePeriod=30 Nov 22 08:34:45 crc kubenswrapper[4743]: I1122 08:34:45.920227 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z"] Nov 22 08:34:45 crc kubenswrapper[4743]: I1122 08:34:45.920703 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" podUID="28f586ec-7a65-4c1e-9f09-845b812246b0" containerName="route-controller-manager" containerID="cri-o://2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318" gracePeriod=30 Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.285940 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.382308 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.417040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f63f00-6812-4f35-ba1e-d1ea01a27a19-serving-cert\") pod \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.417079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-config\") pod \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.417152 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-proxy-ca-bundles\") pod \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.417213 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-client-ca\") pod \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.417256 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkxdr\" (UniqueName: \"kubernetes.io/projected/71f63f00-6812-4f35-ba1e-d1ea01a27a19-kube-api-access-rkxdr\") pod \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\" (UID: \"71f63f00-6812-4f35-ba1e-d1ea01a27a19\") " Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.417976 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "71f63f00-6812-4f35-ba1e-d1ea01a27a19" (UID: "71f63f00-6812-4f35-ba1e-d1ea01a27a19"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.418029 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-client-ca" (OuterVolumeSpecName: "client-ca") pod "71f63f00-6812-4f35-ba1e-d1ea01a27a19" (UID: "71f63f00-6812-4f35-ba1e-d1ea01a27a19"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.418053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-config" (OuterVolumeSpecName: "config") pod "71f63f00-6812-4f35-ba1e-d1ea01a27a19" (UID: "71f63f00-6812-4f35-ba1e-d1ea01a27a19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.422922 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71f63f00-6812-4f35-ba1e-d1ea01a27a19-kube-api-access-rkxdr" (OuterVolumeSpecName: "kube-api-access-rkxdr") pod "71f63f00-6812-4f35-ba1e-d1ea01a27a19" (UID: "71f63f00-6812-4f35-ba1e-d1ea01a27a19"). 
InnerVolumeSpecName "kube-api-access-rkxdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.425850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71f63f00-6812-4f35-ba1e-d1ea01a27a19-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "71f63f00-6812-4f35-ba1e-d1ea01a27a19" (UID: "71f63f00-6812-4f35-ba1e-d1ea01a27a19"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.518875 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-config\") pod \"28f586ec-7a65-4c1e-9f09-845b812246b0\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.518951 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28f586ec-7a65-4c1e-9f09-845b812246b0-serving-cert\") pod \"28f586ec-7a65-4c1e-9f09-845b812246b0\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.519042 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfqkz\" (UniqueName: \"kubernetes.io/projected/28f586ec-7a65-4c1e-9f09-845b812246b0-kube-api-access-xfqkz\") pod \"28f586ec-7a65-4c1e-9f09-845b812246b0\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.519079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-client-ca\") pod \"28f586ec-7a65-4c1e-9f09-845b812246b0\" (UID: \"28f586ec-7a65-4c1e-9f09-845b812246b0\") " Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.519458 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.519479 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkxdr\" (UniqueName: \"kubernetes.io/projected/71f63f00-6812-4f35-ba1e-d1ea01a27a19-kube-api-access-rkxdr\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.519495 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f63f00-6812-4f35-ba1e-d1ea01a27a19-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.519507 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.519521 4743 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/71f63f00-6812-4f35-ba1e-d1ea01a27a19-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.520009 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-config" (OuterVolumeSpecName: "config") pod "28f586ec-7a65-4c1e-9f09-845b812246b0" (UID: 
"28f586ec-7a65-4c1e-9f09-845b812246b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.520022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-client-ca" (OuterVolumeSpecName: "client-ca") pod "28f586ec-7a65-4c1e-9f09-845b812246b0" (UID: "28f586ec-7a65-4c1e-9f09-845b812246b0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.526125 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28f586ec-7a65-4c1e-9f09-845b812246b0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "28f586ec-7a65-4c1e-9f09-845b812246b0" (UID: "28f586ec-7a65-4c1e-9f09-845b812246b0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.526210 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f586ec-7a65-4c1e-9f09-845b812246b0-kube-api-access-xfqkz" (OuterVolumeSpecName: "kube-api-access-xfqkz") pod "28f586ec-7a65-4c1e-9f09-845b812246b0" (UID: "28f586ec-7a65-4c1e-9f09-845b812246b0"). InnerVolumeSpecName "kube-api-access-xfqkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.620165 4743 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-client-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.620206 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28f586ec-7a65-4c1e-9f09-845b812246b0-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.620217 4743 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28f586ec-7a65-4c1e-9f09-845b812246b0-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.620230 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfqkz\" (UniqueName: \"kubernetes.io/projected/28f586ec-7a65-4c1e-9f09-845b812246b0-kube-api-access-xfqkz\") on node \"crc\" DevicePath \"\"" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.758243 4743 generic.go:334] "Generic (PLEG): container finished" podID="28f586ec-7a65-4c1e-9f09-845b812246b0" containerID="2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318" exitCode=0 Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.758297 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.758318 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" event={"ID":"28f586ec-7a65-4c1e-9f09-845b812246b0","Type":"ContainerDied","Data":"2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318"} Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.758795 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z" event={"ID":"28f586ec-7a65-4c1e-9f09-845b812246b0","Type":"ContainerDied","Data":"5388f12bb92ed66906ad47991f5bdf8a688fc4df0c5293fa382bc54aed76ff95"} Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.758817 4743 scope.go:117] "RemoveContainer" containerID="2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318" Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.760206 4743 generic.go:334] "Generic (PLEG): container finished" podID="71f63f00-6812-4f35-ba1e-d1ea01a27a19" containerID="a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119" exitCode=0 Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.760240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" event={"ID":"71f63f00-6812-4f35-ba1e-d1ea01a27a19","Type":"ContainerDied","Data":"a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119"} Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.760264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq" event={"ID":"71f63f00-6812-4f35-ba1e-d1ea01a27a19","Type":"ContainerDied","Data":"6eeea9a14b0c1807fc82fb11a9836cd1ccc104d7522d9467f790d236e700029c"} Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.760305 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8c5mq"
Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.775362 4743 scope.go:117] "RemoveContainer" containerID="2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318"
Nov 22 08:34:46 crc kubenswrapper[4743]: E1122 08:34:46.775872 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318\": container with ID starting with 2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318 not found: ID does not exist" containerID="2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318"
Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.775908 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318"} err="failed to get container status \"2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318\": rpc error: code = NotFound desc = could not find container \"2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318\": container with ID starting with 2dd614f06d3e85e906074d725169453ea24597668caed23e949b0bb5b4162318 not found: ID does not exist"
Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.775932 4743 scope.go:117] "RemoveContainer" containerID="a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119"
Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.791458 4743 scope.go:117] "RemoveContainer" containerID="a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119"
Nov 22 08:34:46 crc kubenswrapper[4743]: E1122 08:34:46.791914 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119\": container with ID starting with a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119 not found: ID does not exist" containerID="a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119"
Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.791960 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119"} err="failed to get container status \"a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119\": rpc error: code = NotFound desc = could not find container \"a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119\": container with ID starting with a5a5ff46780988f1f285448a9a2d0d76ecc9999fa058c6d161990d228e652119 not found: ID does not exist"
Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.819630 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z"]
Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.822535 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-rrn7z"]
Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.837128 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8c5mq"]
Nov 22 08:34:46 crc kubenswrapper[4743]: I1122 08:34:46.840445 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8c5mq"]
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.159298 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f586ec-7a65-4c1e-9f09-845b812246b0" path="/var/lib/kubelet/pods/28f586ec-7a65-4c1e-9f09-845b812246b0/volumes"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.159909 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71f63f00-6812-4f35-ba1e-d1ea01a27a19" path="/var/lib/kubelet/pods/71f63f00-6812-4f35-ba1e-d1ea01a27a19/volumes"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.358862 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"]
Nov 22 08:34:47 crc kubenswrapper[4743]: E1122 08:34:47.359522 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f586ec-7a65-4c1e-9f09-845b812246b0" containerName="route-controller-manager"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.359618 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="28f586ec-7a65-4c1e-9f09-845b812246b0" containerName="route-controller-manager"
Nov 22 08:34:47 crc kubenswrapper[4743]: E1122 08:34:47.359686 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b57e70a1-7869-42b1-bdaf-2b7199c5e463" containerName="storage"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.359735 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b57e70a1-7869-42b1-bdaf-2b7199c5e463" containerName="storage"
Nov 22 08:34:47 crc kubenswrapper[4743]: E1122 08:34:47.359788 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f63f00-6812-4f35-ba1e-d1ea01a27a19" containerName="controller-manager"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.359890 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f63f00-6812-4f35-ba1e-d1ea01a27a19" containerName="controller-manager"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.360052 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f586ec-7a65-4c1e-9f09-845b812246b0" containerName="route-controller-manager"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.360107 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="71f63f00-6812-4f35-ba1e-d1ea01a27a19" containerName="controller-manager"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.360161 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b57e70a1-7869-42b1-bdaf-2b7199c5e463" containerName="storage"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.361193 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.363383 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.370092 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"]
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.530542 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.530594 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.530637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsw2k\" (UniqueName: \"kubernetes.io/projected/1b12b44d-49bb-4965-bec8-c49868b581c8-kube-api-access-wsw2k\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.631965 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.632012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.632044 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsw2k\" (UniqueName: \"kubernetes.io/projected/1b12b44d-49bb-4965-bec8-c49868b581c8-kube-api-access-wsw2k\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.632565 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-bundle\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.632936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-util\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.655313 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsw2k\" (UniqueName: \"kubernetes.io/projected/1b12b44d-49bb-4965-bec8-c49868b581c8-kube-api-access-wsw2k\") pod \"5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") " pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.664078 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"]
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.665279 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.669117 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.669154 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.670164 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.671745 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.671796 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.672556 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"]
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.673000 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.673663 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.675372 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.676053 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"]
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.677743 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.681986 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"]
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.682616 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.683127 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.685840 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.685839 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.686024 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.692526 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.834588 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-serving-cert\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.835016 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgp8j\" (UniqueName: \"kubernetes.io/projected/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-kube-api-access-kgp8j\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.835045 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8bfx\" (UniqueName: \"kubernetes.io/projected/4f057712-e84a-4a80-b4c9-36d850f2ef5f-kube-api-access-p8bfx\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.835068 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-proxy-ca-bundles\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.835098 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-client-ca\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.835121 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f057712-e84a-4a80-b4c9-36d850f2ef5f-serving-cert\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.835143 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-config\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.835266 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f057712-e84a-4a80-b4c9-36d850f2ef5f-config\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.835367 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f057712-e84a-4a80-b4c9-36d850f2ef5f-client-ca\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.873318 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"]
Nov 22 08:34:47 crc kubenswrapper[4743]: W1122 08:34:47.879629 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b12b44d_49bb_4965_bec8_c49868b581c8.slice/crio-6eed8e9e9448f73dcde37a67d3356ea05c954dc66fbf3e9a648ac425785b73d2 WatchSource:0}: Error finding container 6eed8e9e9448f73dcde37a67d3356ea05c954dc66fbf3e9a648ac425785b73d2: Status 404 returned error can't find the container with id 6eed8e9e9448f73dcde37a67d3356ea05c954dc66fbf3e9a648ac425785b73d2
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.936432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-serving-cert\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.936488 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgp8j\" (UniqueName: \"kubernetes.io/projected/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-kube-api-access-kgp8j\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.936520 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8bfx\" (UniqueName: \"kubernetes.io/projected/4f057712-e84a-4a80-b4c9-36d850f2ef5f-kube-api-access-p8bfx\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.936543 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-proxy-ca-bundles\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.936569 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-client-ca\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.936622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f057712-e84a-4a80-b4c9-36d850f2ef5f-serving-cert\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.936648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-config\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.936677 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f057712-e84a-4a80-b4c9-36d850f2ef5f-config\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.936697 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f057712-e84a-4a80-b4c9-36d850f2ef5f-client-ca\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.938837 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f057712-e84a-4a80-b4c9-36d850f2ef5f-client-ca\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.939059 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-client-ca\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.939357 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f057712-e84a-4a80-b4c9-36d850f2ef5f-config\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.939430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-config\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.941794 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-proxy-ca-bundles\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.943277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f057712-e84a-4a80-b4c9-36d850f2ef5f-serving-cert\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.943947 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-serving-cert\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.957332 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgp8j\" (UniqueName: \"kubernetes.io/projected/0ec91b9b-d271-49f8-a0ce-2bad917b21ab-kube-api-access-kgp8j\") pod \"controller-manager-5cfd4b74bd-2xjqb\" (UID: \"0ec91b9b-d271-49f8-a0ce-2bad917b21ab\") " pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:47 crc kubenswrapper[4743]: I1122 08:34:47.958159 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8bfx\" (UniqueName: \"kubernetes.io/projected/4f057712-e84a-4a80-b4c9-36d850f2ef5f-kube-api-access-p8bfx\") pod \"route-controller-manager-5794d489cd-znw5f\" (UID: \"4f057712-e84a-4a80-b4c9-36d850f2ef5f\") " pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.046148 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.056298 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.284712 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"]
Nov 22 08:34:48 crc kubenswrapper[4743]: W1122 08:34:48.294927 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f057712_e84a_4a80_b4c9_36d850f2ef5f.slice/crio-861a49ae003f4add4dc139ab0689397dcf878d890fa8b30aa5a1407b0816f0fd WatchSource:0}: Error finding container 861a49ae003f4add4dc139ab0689397dcf878d890fa8b30aa5a1407b0816f0fd: Status 404 returned error can't find the container with id 861a49ae003f4add4dc139ab0689397dcf878d890fa8b30aa5a1407b0816f0fd
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.449981 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"]
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.793890 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb" event={"ID":"0ec91b9b-d271-49f8-a0ce-2bad917b21ab","Type":"ContainerStarted","Data":"8df9a6bae25976e8a5fa5454605c4b5454b4f535061c71e6c4b74bdf7ce588b5"}
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.794252 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.794267 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb" event={"ID":"0ec91b9b-d271-49f8-a0ce-2bad917b21ab","Type":"ContainerStarted","Data":"b187bef3d67ab28d5bf69ad15715adaee4d41563111c1b186efb847d3f660d30"}
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.797058 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f" event={"ID":"4f057712-e84a-4a80-b4c9-36d850f2ef5f","Type":"ContainerStarted","Data":"4009a7c67e088a69a9e95fc60066d90dad6d0ca24d74637afa45335322356b61"}
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.797099 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f" event={"ID":"4f057712-e84a-4a80-b4c9-36d850f2ef5f","Type":"ContainerStarted","Data":"861a49ae003f4add4dc139ab0689397dcf878d890fa8b30aa5a1407b0816f0fd"}
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.797210 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.799753 4743 generic.go:334] "Generic (PLEG): container finished" podID="1b12b44d-49bb-4965-bec8-c49868b581c8" containerID="5ed0c6231de5f2aa19374828835ea1ebb0b275347bacdfa6a791ebf64e8bc36b" exitCode=0
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.799761 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb"
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.799798 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z" event={"ID":"1b12b44d-49bb-4965-bec8-c49868b581c8","Type":"ContainerDied","Data":"5ed0c6231de5f2aa19374828835ea1ebb0b275347bacdfa6a791ebf64e8bc36b"}
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.799812 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z" event={"ID":"1b12b44d-49bb-4965-bec8-c49868b581c8","Type":"ContainerStarted","Data":"6eed8e9e9448f73dcde37a67d3356ea05c954dc66fbf3e9a648ac425785b73d2"}
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.817149 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cfd4b74bd-2xjqb" podStartSLOduration=3.817134858 podStartE2EDuration="3.817134858s" podCreationTimestamp="2025-11-22 08:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:34:48.815340286 +0000 UTC m=+762.521701338" watchObservedRunningTime="2025-11-22 08:34:48.817134858 +0000 UTC m=+762.523495910"
Nov 22 08:34:48 crc kubenswrapper[4743]: I1122 08:34:48.886179 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f" podStartSLOduration=2.886158375 podStartE2EDuration="2.886158375s" podCreationTimestamp="2025-11-22 08:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:34:48.882925831 +0000 UTC m=+762.589286883" watchObservedRunningTime="2025-11-22 08:34:48.886158375 +0000 UTC m=+762.592519427"
Nov 22 08:34:49 crc kubenswrapper[4743]: I1122 08:34:49.326315 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5794d489cd-znw5f"
Nov 22 08:34:51 crc kubenswrapper[4743]: I1122 08:34:51.819143 4743 generic.go:334] "Generic (PLEG): container finished" podID="1b12b44d-49bb-4965-bec8-c49868b581c8" containerID="8cdacaf188199eeabad8341446990b862741c37f5fc0a16b9fe5538dc1063dcb" exitCode=0
Nov 22 08:34:51 crc kubenswrapper[4743]: I1122 08:34:51.819252 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z" event={"ID":"1b12b44d-49bb-4965-bec8-c49868b581c8","Type":"ContainerDied","Data":"8cdacaf188199eeabad8341446990b862741c37f5fc0a16b9fe5538dc1063dcb"}
Nov 22 08:34:52 crc kubenswrapper[4743]: I1122 08:34:52.827066 4743 generic.go:334] "Generic (PLEG): container finished" podID="1b12b44d-49bb-4965-bec8-c49868b581c8" containerID="16ab7e2a60132c4710459baab2a9c28cb835b46aa001469d22574999b1b4eef8" exitCode=0
Nov 22 08:34:52 crc kubenswrapper[4743]: I1122 08:34:52.827107 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z" event={"ID":"1b12b44d-49bb-4965-bec8-c49868b581c8","Type":"ContainerDied","Data":"16ab7e2a60132c4710459baab2a9c28cb835b46aa001469d22574999b1b4eef8"}
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.126803 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.212709 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsw2k\" (UniqueName: \"kubernetes.io/projected/1b12b44d-49bb-4965-bec8-c49868b581c8-kube-api-access-wsw2k\") pod \"1b12b44d-49bb-4965-bec8-c49868b581c8\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") "
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.212806 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-bundle\") pod \"1b12b44d-49bb-4965-bec8-c49868b581c8\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") "
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.212840 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-util\") pod \"1b12b44d-49bb-4965-bec8-c49868b581c8\" (UID: \"1b12b44d-49bb-4965-bec8-c49868b581c8\") "
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.213854 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-bundle" (OuterVolumeSpecName: "bundle") pod "1b12b44d-49bb-4965-bec8-c49868b581c8" (UID: "1b12b44d-49bb-4965-bec8-c49868b581c8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.219873 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b12b44d-49bb-4965-bec8-c49868b581c8-kube-api-access-wsw2k" (OuterVolumeSpecName: "kube-api-access-wsw2k") pod "1b12b44d-49bb-4965-bec8-c49868b581c8" (UID: "1b12b44d-49bb-4965-bec8-c49868b581c8"). InnerVolumeSpecName "kube-api-access-wsw2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.225549 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-util" (OuterVolumeSpecName: "util") pod "1b12b44d-49bb-4965-bec8-c49868b581c8" (UID: "1b12b44d-49bb-4965-bec8-c49868b581c8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.314150 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.314184 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b12b44d-49bb-4965-bec8-c49868b581c8-util\") on node \"crc\" DevicePath \"\""
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.314194 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsw2k\" (UniqueName: \"kubernetes.io/projected/1b12b44d-49bb-4965-bec8-c49868b581c8-kube-api-access-wsw2k\") on node \"crc\" DevicePath \"\""
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.396602 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fnhxs"]
Nov 22 08:34:54 crc kubenswrapper[4743]: E1122 08:34:54.396876 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b12b44d-49bb-4965-bec8-c49868b581c8" containerName="extract"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.396907 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b12b44d-49bb-4965-bec8-c49868b581c8" containerName="extract"
Nov 22 08:34:54 crc kubenswrapper[4743]: E1122 08:34:54.396921 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b12b44d-49bb-4965-bec8-c49868b581c8" containerName="util"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.396927 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b12b44d-49bb-4965-bec8-c49868b581c8" containerName="util"
Nov 22 08:34:54 crc kubenswrapper[4743]: E1122 08:34:54.396959 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b12b44d-49bb-4965-bec8-c49868b581c8" containerName="pull"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.396972 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b12b44d-49bb-4965-bec8-c49868b581c8" containerName="pull"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.397266 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b12b44d-49bb-4965-bec8-c49868b581c8" containerName="extract"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.398029 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.416360 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fnhxs"]
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.516775 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-catalog-content\") pod \"redhat-operators-fnhxs\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.516832 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkmpt\" (UniqueName: \"kubernetes.io/projected/aa4bbb12-c772-454d-a0cb-ad296318a20a-kube-api-access-bkmpt\") pod \"redhat-operators-fnhxs\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.516853 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-utilities\") pod \"redhat-operators-fnhxs\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.617682 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkmpt\" (UniqueName: \"kubernetes.io/projected/aa4bbb12-c772-454d-a0cb-ad296318a20a-kube-api-access-bkmpt\") pod \"redhat-operators-fnhxs\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.617728 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-utilities\") pod \"redhat-operators-fnhxs\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.617794 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-catalog-content\") pod \"redhat-operators-fnhxs\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.618264 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-catalog-content\") pod \"redhat-operators-fnhxs\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.618300 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-utilities\") pod \"redhat-operators-fnhxs\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.634492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkmpt\" (UniqueName: \"kubernetes.io/projected/aa4bbb12-c772-454d-a0cb-ad296318a20a-kube-api-access-bkmpt\") pod \"redhat-operators-fnhxs\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.729243 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.842267 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z" event={"ID":"1b12b44d-49bb-4965-bec8-c49868b581c8","Type":"ContainerDied","Data":"6eed8e9e9448f73dcde37a67d3356ea05c954dc66fbf3e9a648ac425785b73d2"}
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.842691 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eed8e9e9448f73dcde37a67d3356ea05c954dc66fbf3e9a648ac425785b73d2"
Nov 22 08:34:54 crc kubenswrapper[4743]: I1122 08:34:54.842501 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z"
Nov 22 08:34:55 crc kubenswrapper[4743]: I1122 08:34:55.221143 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fnhxs"]
Nov 22 08:34:55 crc kubenswrapper[4743]: W1122 08:34:55.233561 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa4bbb12_c772_454d_a0cb_ad296318a20a.slice/crio-31e726ac92da18023ded2d71c91722cc5a0de1c806ee7f5cb593d553ff61bc55 WatchSource:0}: Error finding container 31e726ac92da18023ded2d71c91722cc5a0de1c806ee7f5cb593d553ff61bc55: Status 404 returned error can't find the container with id 31e726ac92da18023ded2d71c91722cc5a0de1c806ee7f5cb593d553ff61bc55
Nov 22 08:34:55 crc kubenswrapper[4743]: I1122 08:34:55.850071 4743 generic.go:334] "Generic (PLEG): container finished" podID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerID="5c3651ba8ca2a7355d68949d290411c53306a818bcd9845cc0ae66275aa11040" exitCode=0
Nov 22 08:34:55 crc kubenswrapper[4743]: I1122 08:34:55.850192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fnhxs" event={"ID":"aa4bbb12-c772-454d-a0cb-ad296318a20a","Type":"ContainerDied","Data":"5c3651ba8ca2a7355d68949d290411c53306a818bcd9845cc0ae66275aa11040"}
Nov 22 08:34:55 crc kubenswrapper[4743]: I1122 08:34:55.850715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fnhxs" event={"ID":"aa4bbb12-c772-454d-a0cb-ad296318a20a","Type":"ContainerStarted","Data":"31e726ac92da18023ded2d71c91722cc5a0de1c806ee7f5cb593d553ff61bc55"}
Nov 22 08:34:56 crc kubenswrapper[4743]: I1122 08:34:56.860770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fnhxs" event={"ID":"aa4bbb12-c772-454d-a0cb-ad296318a20a","Type":"ContainerStarted","Data":"27d3b561d6d93f060e640b5e3a9f673d94a4e88d64d31c072c9b1525f61cc533"}
Nov 22 08:34:57 crc kubenswrapper[4743]: I1122 08:34:57.869563 4743 generic.go:334] "Generic (PLEG): container finished" podID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerID="27d3b561d6d93f060e640b5e3a9f673d94a4e88d64d31c072c9b1525f61cc533" exitCode=0
Nov 22 08:34:57 crc kubenswrapper[4743]: I1122 08:34:57.869633 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fnhxs" event={"ID":"aa4bbb12-c772-454d-a0cb-ad296318a20a","Type":"ContainerDied","Data":"27d3b561d6d93f060e640b5e3a9f673d94a4e88d64d31c072c9b1525f61cc533"}
Nov 22 08:34:57 crc kubenswrapper[4743]: I1122 08:34:57.976085 4743 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.145770 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-rzpwt"]
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.146445 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-rzpwt"
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.147968 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-xj7m6"
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.148289 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.149532 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.165944 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-rzpwt"]
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.271077 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwcxj\" (UniqueName: \"kubernetes.io/projected/f8fe518d-0109-44b8-84a8-7f8d285abb8d-kube-api-access-lwcxj\") pod \"nmstate-operator-557fdffb88-rzpwt\" (UID: \"f8fe518d-0109-44b8-84a8-7f8d285abb8d\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-rzpwt"
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.372828 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwcxj\" (UniqueName: \"kubernetes.io/projected/f8fe518d-0109-44b8-84a8-7f8d285abb8d-kube-api-access-lwcxj\") pod \"nmstate-operator-557fdffb88-rzpwt\" (UID: \"f8fe518d-0109-44b8-84a8-7f8d285abb8d\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-rzpwt"
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.392659 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwcxj\" (UniqueName: \"kubernetes.io/projected/f8fe518d-0109-44b8-84a8-7f8d285abb8d-kube-api-access-lwcxj\") pod \"nmstate-operator-557fdffb88-rzpwt\" (UID: \"f8fe518d-0109-44b8-84a8-7f8d285abb8d\") " pod="openshift-nmstate/nmstate-operator-557fdffb88-rzpwt"
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.460992 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-557fdffb88-rzpwt"
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.878496 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fnhxs" event={"ID":"aa4bbb12-c772-454d-a0cb-ad296318a20a","Type":"ContainerStarted","Data":"5e7e01af61eed9122463e405689f7d0f155924fc794735dcfec3988ec7bee137"}
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.897653 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fnhxs" podStartSLOduration=2.462767081 podStartE2EDuration="4.897631659s" podCreationTimestamp="2025-11-22 08:34:54 +0000 UTC" firstStartedPulling="2025-11-22 08:34:55.852403726 +0000 UTC m=+769.558764778" lastFinishedPulling="2025-11-22 08:34:58.287268304 +0000 UTC m=+771.993629356" observedRunningTime="2025-11-22 08:34:58.894834928 +0000 UTC m=+772.601195980" watchObservedRunningTime="2025-11-22 08:34:58.897631659 +0000 UTC m=+772.603992711"
Nov 22 08:34:58 crc kubenswrapper[4743]: I1122 08:34:58.914238 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-557fdffb88-rzpwt"]
Nov 22 08:34:58 crc kubenswrapper[4743]: W1122 08:34:58.920151 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8fe518d_0109_44b8_84a8_7f8d285abb8d.slice/crio-47fc3ae2c83d1d943c9056368370b612e9f59ecf84c90307d180f30f110bc31d WatchSource:0}: Error finding container 47fc3ae2c83d1d943c9056368370b612e9f59ecf84c90307d180f30f110bc31d: Status 404 returned error can't find the container with id 47fc3ae2c83d1d943c9056368370b612e9f59ecf84c90307d180f30f110bc31d
Nov 22 08:34:59 crc kubenswrapper[4743]: I1122 08:34:59.885024 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-rzpwt" event={"ID":"f8fe518d-0109-44b8-84a8-7f8d285abb8d","Type":"ContainerStarted","Data":"47fc3ae2c83d1d943c9056368370b612e9f59ecf84c90307d180f30f110bc31d"}
Nov 22 08:35:01 crc kubenswrapper[4743]: I1122 08:35:01.242054 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 08:35:01 crc kubenswrapper[4743]: I1122 08:35:01.242688 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 08:35:01 crc kubenswrapper[4743]: I1122 08:35:01.921305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-557fdffb88-rzpwt" event={"ID":"f8fe518d-0109-44b8-84a8-7f8d285abb8d","Type":"ContainerStarted","Data":"44c2bd8243a9ff9cebe8db9b03e5b66a9d5db46ac405fb46d578539ff7b42eb0"}
Nov 22 08:35:01 crc kubenswrapper[4743]: I1122 08:35:01.941870 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-557fdffb88-rzpwt" podStartSLOduration=1.776795329 podStartE2EDuration="3.941846984s" podCreationTimestamp="2025-11-22 08:34:58 +0000 UTC" firstStartedPulling="2025-11-22 08:34:58.923976931 +0000 UTC m=+772.630337983" lastFinishedPulling="2025-11-22 08:35:01.089028586 +0000 UTC m=+774.795389638" observedRunningTime="2025-11-22 08:35:01.941168774 +0000 UTC m=+775.647529826" watchObservedRunningTime="2025-11-22 08:35:01.941846984 +0000 UTC m=+775.648208036"
Nov 22 08:35:04 crc kubenswrapper[4743]: I1122 08:35:04.730381 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:35:04 crc kubenswrapper[4743]: I1122 08:35:04.730767 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:35:04 crc kubenswrapper[4743]: I1122 08:35:04.768425 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:35:04 crc kubenswrapper[4743]: I1122 08:35:04.993959 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fnhxs"
Nov 22 08:35:05 crc kubenswrapper[4743]: I1122 08:35:05.787232 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fnhxs"]
Nov 22 08:35:06 crc kubenswrapper[4743]: I1122 08:35:06.950645 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fnhxs" podUID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerName="registry-server" containerID="cri-o://5e7e01af61eed9122463e405689f7d0f155924fc794735dcfec3988ec7bee137" gracePeriod=2
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.585392 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm"]
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.586741 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.588102 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-t22mg"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.599028 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"]
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.599937 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.602811 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.603416 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm"]
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.609889 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"]
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.616823 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-nhd9z"]
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.617492 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.706894 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vmf6\" (UniqueName: \"kubernetes.io/projected/b5589372-866d-4842-ad30-fdb503b25d3a-kube-api-access-9vmf6\") pod \"nmstate-metrics-5dcf9c57c5-ntztm\" (UID: \"b5589372-866d-4842-ad30-fdb503b25d3a\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.706945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx7kc\" (UniqueName: \"kubernetes.io/projected/48065f06-9619-4a08-a9a5-c50269da8fbe-kube-api-access-mx7kc\") pod \"nmstate-webhook-6b89b748d8-bj7bk\" (UID: \"48065f06-9619-4a08-a9a5-c50269da8fbe\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.706964 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/48065f06-9619-4a08-a9a5-c50269da8fbe-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-bj7bk\" (UID: \"48065f06-9619-4a08-a9a5-c50269da8fbe\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.754056 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt"]
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.756888 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.759892 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.762338 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mtgc2"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.762567 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.768250 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt"]
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.808478 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9jkr\" (UniqueName: \"kubernetes.io/projected/9919df2d-511a-481f-9506-039359ecbfb1-kube-api-access-j9jkr\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.809800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vmf6\" (UniqueName: \"kubernetes.io/projected/b5589372-866d-4842-ad30-fdb503b25d3a-kube-api-access-9vmf6\") pod \"nmstate-metrics-5dcf9c57c5-ntztm\" (UID: \"b5589372-866d-4842-ad30-fdb503b25d3a\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.810526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx7kc\" (UniqueName: \"kubernetes.io/projected/48065f06-9619-4a08-a9a5-c50269da8fbe-kube-api-access-mx7kc\") pod \"nmstate-webhook-6b89b748d8-bj7bk\" (UID: \"48065f06-9619-4a08-a9a5-c50269da8fbe\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.810868 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/48065f06-9619-4a08-a9a5-c50269da8fbe-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-bj7bk\" (UID: \"48065f06-9619-4a08-a9a5-c50269da8fbe\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.811540 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9919df2d-511a-481f-9506-039359ecbfb1-nmstate-lock\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.811614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9919df2d-511a-481f-9506-039359ecbfb1-ovs-socket\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.811786 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9919df2d-511a-481f-9506-039359ecbfb1-dbus-socket\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.825968 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/48065f06-9619-4a08-a9a5-c50269da8fbe-tls-key-pair\") pod \"nmstate-webhook-6b89b748d8-bj7bk\" (UID: \"48065f06-9619-4a08-a9a5-c50269da8fbe\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.834566 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx7kc\" (UniqueName: \"kubernetes.io/projected/48065f06-9619-4a08-a9a5-c50269da8fbe-kube-api-access-mx7kc\") pod \"nmstate-webhook-6b89b748d8-bj7bk\" (UID: \"48065f06-9619-4a08-a9a5-c50269da8fbe\") " pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.841983 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vmf6\" (UniqueName: \"kubernetes.io/projected/b5589372-866d-4842-ad30-fdb503b25d3a-kube-api-access-9vmf6\") pod \"nmstate-metrics-5dcf9c57c5-ntztm\" (UID: \"b5589372-866d-4842-ad30-fdb503b25d3a\") " pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.906374 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.910713 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-78b875f686-lp8d2"]
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.911330 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913625 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-trusted-ca-bundle\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913667 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82tx7\" (UniqueName: \"kubernetes.io/projected/ff2c5769-5a22-4b13-872a-c3f2bbce162a-kube-api-access-82tx7\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsvhg\" (UniqueName: \"kubernetes.io/projected/79b6d2ec-a4d8-4c91-8f86-aed66745f48b-kube-api-access-lsvhg\") pod \"nmstate-console-plugin-5874bd7bc5-ggvvt\" (UID: \"79b6d2ec-a4d8-4c91-8f86-aed66745f48b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-oauth-serving-cert\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913811 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9919df2d-511a-481f-9506-039359ecbfb1-nmstate-lock\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913837 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9919df2d-511a-481f-9506-039359ecbfb1-ovs-socket\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913866 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/79b6d2ec-a4d8-4c91-8f86-aed66745f48b-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-ggvvt\" (UID: \"79b6d2ec-a4d8-4c91-8f86-aed66745f48b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913891 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9919df2d-511a-481f-9506-039359ecbfb1-dbus-socket\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913922 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff2c5769-5a22-4b13-872a-c3f2bbce162a-console-oauth-config\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913956 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-console-config\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.913985 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/79b6d2ec-a4d8-4c91-8f86-aed66745f48b-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-ggvvt\" (UID: \"79b6d2ec-a4d8-4c91-8f86-aed66745f48b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.914043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9jkr\" (UniqueName: \"kubernetes.io/projected/9919df2d-511a-481f-9506-039359ecbfb1-kube-api-access-j9jkr\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.916288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff2c5769-5a22-4b13-872a-c3f2bbce162a-console-serving-cert\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.916388 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-service-ca\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.914538 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9919df2d-511a-481f-9506-039359ecbfb1-dbus-socket\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.914559 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9919df2d-511a-481f-9506-039359ecbfb1-ovs-socket\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.914058 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9919df2d-511a-481f-9506-039359ecbfb1-nmstate-lock\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.914233 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.942003 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78b875f686-lp8d2"]
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.946221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9jkr\" (UniqueName: \"kubernetes.io/projected/9919df2d-511a-481f-9506-039359ecbfb1-kube-api-access-j9jkr\") pod \"nmstate-handler-nhd9z\" (UID: \"9919df2d-511a-481f-9506-039359ecbfb1\") " pod="openshift-nmstate/nmstate-handler-nhd9z"
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.960549 4743 generic.go:334] "Generic (PLEG): container finished" podID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerID="5e7e01af61eed9122463e405689f7d0f155924fc794735dcfec3988ec7bee137" exitCode=0
Nov 22 08:35:07 crc kubenswrapper[4743]: I1122 08:35:07.960620 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fnhxs" event={"ID":"aa4bbb12-c772-454d-a0cb-ad296318a20a","Type":"ContainerDied","Data":"5e7e01af61eed9122463e405689f7d0f155924fc794735dcfec3988ec7bee137"}
Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.017551 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-oauth-serving-cert\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.017644 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/79b6d2ec-a4d8-4c91-8f86-aed66745f48b-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-ggvvt\" (UID: \"79b6d2ec-a4d8-4c91-8f86-aed66745f48b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt"
Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.017674 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff2c5769-5a22-4b13-872a-c3f2bbce162a-console-oauth-config\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.017727 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-console-config\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2"
Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.017753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/79b6d2ec-a4d8-4c91-8f86-aed66745f48b-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-ggvvt\" (UID: \"79b6d2ec-a4d8-4c91-8f86-aed66745f48b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt"
Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.017806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff2c5769-5a22-4b13-872a-c3f2bbce162a-console-serving-cert\") pod \"console-78b875f686-lp8d2\" (UID:
\"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.017828 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-service-ca\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.017861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-trusted-ca-bundle\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.017882 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82tx7\" (UniqueName: \"kubernetes.io/projected/ff2c5769-5a22-4b13-872a-c3f2bbce162a-kube-api-access-82tx7\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.018264 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsvhg\" (UniqueName: \"kubernetes.io/projected/79b6d2ec-a4d8-4c91-8f86-aed66745f48b-kube-api-access-lsvhg\") pod \"nmstate-console-plugin-5874bd7bc5-ggvvt\" (UID: \"79b6d2ec-a4d8-4c91-8f86-aed66745f48b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.018815 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-oauth-serving-cert\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.019016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/79b6d2ec-a4d8-4c91-8f86-aed66745f48b-nginx-conf\") pod \"nmstate-console-plugin-5874bd7bc5-ggvvt\" (UID: \"79b6d2ec-a4d8-4c91-8f86-aed66745f48b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.019546 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-console-config\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.020247 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-trusted-ca-bundle\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.020831 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff2c5769-5a22-4b13-872a-c3f2bbce162a-service-ca\") pod \"console-78b875f686-lp8d2\" (UID: 
\"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.024497 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/79b6d2ec-a4d8-4c91-8f86-aed66745f48b-plugin-serving-cert\") pod \"nmstate-console-plugin-5874bd7bc5-ggvvt\" (UID: \"79b6d2ec-a4d8-4c91-8f86-aed66745f48b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.024552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff2c5769-5a22-4b13-872a-c3f2bbce162a-console-oauth-config\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.033465 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff2c5769-5a22-4b13-872a-c3f2bbce162a-console-serving-cert\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.035430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsvhg\" (UniqueName: \"kubernetes.io/projected/79b6d2ec-a4d8-4c91-8f86-aed66745f48b-kube-api-access-lsvhg\") pod \"nmstate-console-plugin-5874bd7bc5-ggvvt\" (UID: \"79b6d2ec-a4d8-4c91-8f86-aed66745f48b\") " pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.045492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82tx7\" (UniqueName: \"kubernetes.io/projected/ff2c5769-5a22-4b13-872a-c3f2bbce162a-kube-api-access-82tx7\") pod \"console-78b875f686-lp8d2\" (UID: \"ff2c5769-5a22-4b13-872a-c3f2bbce162a\") " pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.054965 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fnhxs" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.083133 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.119094 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkmpt\" (UniqueName: \"kubernetes.io/projected/aa4bbb12-c772-454d-a0cb-ad296318a20a-kube-api-access-bkmpt\") pod \"aa4bbb12-c772-454d-a0cb-ad296318a20a\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.119277 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-catalog-content\") pod \"aa4bbb12-c772-454d-a0cb-ad296318a20a\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.119301 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-utilities\") pod \"aa4bbb12-c772-454d-a0cb-ad296318a20a\" (UID: \"aa4bbb12-c772-454d-a0cb-ad296318a20a\") " Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.120558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-utilities" (OuterVolumeSpecName: "utilities") pod "aa4bbb12-c772-454d-a0cb-ad296318a20a" (UID: "aa4bbb12-c772-454d-a0cb-ad296318a20a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.123774 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa4bbb12-c772-454d-a0cb-ad296318a20a-kube-api-access-bkmpt" (OuterVolumeSpecName: "kube-api-access-bkmpt") pod "aa4bbb12-c772-454d-a0cb-ad296318a20a" (UID: "aa4bbb12-c772-454d-a0cb-ad296318a20a"). InnerVolumeSpecName "kube-api-access-bkmpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.220912 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.220951 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkmpt\" (UniqueName: \"kubernetes.io/projected/aa4bbb12-c772-454d-a0cb-ad296318a20a-kube-api-access-bkmpt\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.234546 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-nhd9z" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.234684 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa4bbb12-c772-454d-a0cb-ad296318a20a" (UID: "aa4bbb12-c772-454d-a0cb-ad296318a20a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:35:08 crc kubenswrapper[4743]: W1122 08:35:08.251520 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9919df2d_511a_481f_9506_039359ecbfb1.slice/crio-f0420aedc2454f9399c817d5a097c0ea235c35bb3a00c2aab46cdae844be3889 WatchSource:0}: Error finding container f0420aedc2454f9399c817d5a097c0ea235c35bb3a00c2aab46cdae844be3889: Status 404 returned error can't find the container with id f0420aedc2454f9399c817d5a097c0ea235c35bb3a00c2aab46cdae844be3889 Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.269609 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.322167 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa4bbb12-c772-454d-a0cb-ad296318a20a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.381915 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk"] Nov 22 08:35:08 crc kubenswrapper[4743]: W1122 08:35:08.386186 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48065f06_9619_4a08_a9a5_c50269da8fbe.slice/crio-74475b27b87b2dada99a360f6f63e52ebc1cd2651774c8e9a665fb48b7aece12 WatchSource:0}: Error finding container 74475b27b87b2dada99a360f6f63e52ebc1cd2651774c8e9a665fb48b7aece12: Status 404 returned error can't find the container with id 74475b27b87b2dada99a360f6f63e52ebc1cd2651774c8e9a665fb48b7aece12 Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.461524 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm"] Nov 22 08:35:08 crc kubenswrapper[4743]: W1122 08:35:08.463805 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5589372_866d_4842_ad30_fdb503b25d3a.slice/crio-d07a38dab456a1c2913ee8d42928e8deac40adc28c9a2314f6f5ed6549e75679 WatchSource:0}: Error finding container d07a38dab456a1c2913ee8d42928e8deac40adc28c9a2314f6f5ed6549e75679: Status 404 returned error can't find the container with id d07a38dab456a1c2913ee8d42928e8deac40adc28c9a2314f6f5ed6549e75679 Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.530435 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt"] Nov 22 08:35:08 crc kubenswrapper[4743]: W1122 08:35:08.534976 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b6d2ec_a4d8_4c91_8f86_aed66745f48b.slice/crio-6f7c7437ae3bea21d5dd4bd149513feddeca799f17a3dd71dd2fe448e6d2d6a4 WatchSource:0}: Error finding container 6f7c7437ae3bea21d5dd4bd149513feddeca799f17a3dd71dd2fe448e6d2d6a4: Status 404 returned error can't find the container with id 6f7c7437ae3bea21d5dd4bd149513feddeca799f17a3dd71dd2fe448e6d2d6a4 Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.666020 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78b875f686-lp8d2"] Nov 22 08:35:08 crc kubenswrapper[4743]: W1122 08:35:08.668794 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff2c5769_5a22_4b13_872a_c3f2bbce162a.slice/crio-02b6453eb7f738df48ef71ffc614507eca0c2fa14e4837be84933cf76c1c8ca1 WatchSource:0}: Error finding container 02b6453eb7f738df48ef71ffc614507eca0c2fa14e4837be84933cf76c1c8ca1: Status 404 returned error can't find the container with id 02b6453eb7f738df48ef71ffc614507eca0c2fa14e4837be84933cf76c1c8ca1 Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.967304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fnhxs" event={"ID":"aa4bbb12-c772-454d-a0cb-ad296318a20a","Type":"ContainerDied","Data":"31e726ac92da18023ded2d71c91722cc5a0de1c806ee7f5cb593d553ff61bc55"} Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.967412 4743 scope.go:117] "RemoveContainer" containerID="5e7e01af61eed9122463e405689f7d0f155924fc794735dcfec3988ec7bee137" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.967451 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fnhxs" Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.968816 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk" event={"ID":"48065f06-9619-4a08-a9a5-c50269da8fbe","Type":"ContainerStarted","Data":"74475b27b87b2dada99a360f6f63e52ebc1cd2651774c8e9a665fb48b7aece12"} Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.970078 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt" event={"ID":"79b6d2ec-a4d8-4c91-8f86-aed66745f48b","Type":"ContainerStarted","Data":"6f7c7437ae3bea21d5dd4bd149513feddeca799f17a3dd71dd2fe448e6d2d6a4"} Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.971761 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78b875f686-lp8d2" event={"ID":"ff2c5769-5a22-4b13-872a-c3f2bbce162a","Type":"ContainerStarted","Data":"02b6453eb7f738df48ef71ffc614507eca0c2fa14e4837be84933cf76c1c8ca1"} Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.973036 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm" event={"ID":"b5589372-866d-4842-ad30-fdb503b25d3a","Type":"ContainerStarted","Data":"d07a38dab456a1c2913ee8d42928e8deac40adc28c9a2314f6f5ed6549e75679"} Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.974048 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-nhd9z" event={"ID":"9919df2d-511a-481f-9506-039359ecbfb1","Type":"ContainerStarted","Data":"f0420aedc2454f9399c817d5a097c0ea235c35bb3a00c2aab46cdae844be3889"} Nov 22 08:35:08 crc kubenswrapper[4743]: I1122 08:35:08.998461 4743 scope.go:117] "RemoveContainer" containerID="27d3b561d6d93f060e640b5e3a9f673d94a4e88d64d31c072c9b1525f61cc533" Nov 22 08:35:09 crc kubenswrapper[4743]: I1122 08:35:09.013794 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fnhxs"] Nov 22 08:35:09 crc kubenswrapper[4743]: I1122 08:35:09.019091 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fnhxs"] Nov 22 08:35:09 crc kubenswrapper[4743]: I1122 08:35:09.029516 4743 scope.go:117] "RemoveContainer" containerID="5c3651ba8ca2a7355d68949d290411c53306a818bcd9845cc0ae66275aa11040" Nov 22 08:35:09 crc kubenswrapper[4743]: I1122 08:35:09.158219 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="aa4bbb12-c772-454d-a0cb-ad296318a20a" path="/var/lib/kubelet/pods/aa4bbb12-c772-454d-a0cb-ad296318a20a/volumes" Nov 22 08:35:09 crc kubenswrapper[4743]: I1122 08:35:09.992075 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78b875f686-lp8d2" event={"ID":"ff2c5769-5a22-4b13-872a-c3f2bbce162a","Type":"ContainerStarted","Data":"66f9d4160f6ada80b82c67e51191bdfb5a3df304f1d5740c94779e7cde6c4093"} Nov 22 08:35:10 crc kubenswrapper[4743]: I1122 08:35:10.014662 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78b875f686-lp8d2" podStartSLOduration=3.014642693 podStartE2EDuration="3.014642693s" podCreationTimestamp="2025-11-22 08:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:35:10.012414049 +0000 UTC m=+783.718775101" watchObservedRunningTime="2025-11-22 08:35:10.014642693 +0000 UTC m=+783.721003745" Nov 22 08:35:12 crc kubenswrapper[4743]: I1122 08:35:12.018202 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk" event={"ID":"48065f06-9619-4a08-a9a5-c50269da8fbe","Type":"ContainerStarted","Data":"b34db6bdcda7985ae26a8662991426fd9bb0cf51821827fb598ab389ff862728"} Nov 22 08:35:12 crc kubenswrapper[4743]: I1122 08:35:12.018816 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk" Nov 22 08:35:12 crc kubenswrapper[4743]: I1122 08:35:12.022645 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt" event={"ID":"79b6d2ec-a4d8-4c91-8f86-aed66745f48b","Type":"ContainerStarted","Data":"5a11afd026ffb0ceec5c022ffc322dec428adf34f46972acb20c5a8434b9143d"} Nov 22 08:35:12 crc kubenswrapper[4743]: I1122 08:35:12.024742 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm" event={"ID":"b5589372-866d-4842-ad30-fdb503b25d3a","Type":"ContainerStarted","Data":"91d70272c01fe398350d0c1f29bf5ad256ce1e287a11b2405f29394141dd87e2"} Nov 22 08:35:12 crc kubenswrapper[4743]: I1122 08:35:12.026003 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-nhd9z" event={"ID":"9919df2d-511a-481f-9506-039359ecbfb1","Type":"ContainerStarted","Data":"7338d1db24be2ab1c285ed4e6ab7be84813bb56aafeffd407a2c3b142df68d4c"} Nov 22 08:35:12 crc kubenswrapper[4743]: I1122 08:35:12.026206 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-nhd9z" Nov 22 08:35:12 crc kubenswrapper[4743]: I1122 08:35:12.042665 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk" podStartSLOduration=2.254394181 podStartE2EDuration="5.042637643s" podCreationTimestamp="2025-11-22 08:35:07 +0000 UTC" firstStartedPulling="2025-11-22 08:35:08.388408903 +0000 UTC m=+782.094769945" lastFinishedPulling="2025-11-22 08:35:11.176652355 +0000 UTC m=+784.883013407" observedRunningTime="2025-11-22 08:35:12.038167054 +0000 UTC m=+785.744528126" watchObservedRunningTime="2025-11-22 08:35:12.042637643 +0000 UTC m=+785.748998705" Nov 22 08:35:12 crc kubenswrapper[4743]: I1122 08:35:12.058850 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-nhd9z" podStartSLOduration=2.145178182 
podStartE2EDuration="5.058825061s" podCreationTimestamp="2025-11-22 08:35:07 +0000 UTC" firstStartedPulling="2025-11-22 08:35:08.253561523 +0000 UTC m=+781.959922565" lastFinishedPulling="2025-11-22 08:35:11.167208392 +0000 UTC m=+784.873569444" observedRunningTime="2025-11-22 08:35:12.058242824 +0000 UTC m=+785.764603886" watchObservedRunningTime="2025-11-22 08:35:12.058825061 +0000 UTC m=+785.765186113" Nov 22 08:35:12 crc kubenswrapper[4743]: I1122 08:35:12.080988 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5874bd7bc5-ggvvt" podStartSLOduration=2.451099022 podStartE2EDuration="5.080963482s" podCreationTimestamp="2025-11-22 08:35:07 +0000 UTC" firstStartedPulling="2025-11-22 08:35:08.537325001 +0000 UTC m=+782.243686053" lastFinishedPulling="2025-11-22 08:35:11.167189461 +0000 UTC m=+784.873550513" observedRunningTime="2025-11-22 08:35:12.076488892 +0000 UTC m=+785.782849934" watchObservedRunningTime="2025-11-22 08:35:12.080963482 +0000 UTC m=+785.787324524" Nov 22 08:35:14 crc kubenswrapper[4743]: I1122 08:35:14.051455 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm" event={"ID":"b5589372-866d-4842-ad30-fdb503b25d3a","Type":"ContainerStarted","Data":"b788491eb0ce8350b8662d99b6aff8648187cdc9b4bc5299fa33a5635e41e7e7"} Nov 22 08:35:14 crc kubenswrapper[4743]: I1122 08:35:14.068798 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-5dcf9c57c5-ntztm" podStartSLOduration=1.935716954 podStartE2EDuration="7.0687816s" podCreationTimestamp="2025-11-22 08:35:07 +0000 UTC" firstStartedPulling="2025-11-22 08:35:08.465707599 +0000 UTC m=+782.172068651" lastFinishedPulling="2025-11-22 08:35:13.598772245 +0000 UTC m=+787.305133297" observedRunningTime="2025-11-22 08:35:14.066884616 +0000 UTC m=+787.773245668" watchObservedRunningTime="2025-11-22 08:35:14.0687816 +0000 UTC m=+787.775142662" Nov 22 08:35:18 crc kubenswrapper[4743]: I1122 08:35:18.256981 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-nhd9z" Nov 22 08:35:18 crc kubenswrapper[4743]: I1122 08:35:18.270821 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:18 crc kubenswrapper[4743]: I1122 08:35:18.270910 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:18 crc kubenswrapper[4743]: I1122 08:35:18.279651 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:19 crc kubenswrapper[4743]: I1122 08:35:19.090164 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78b875f686-lp8d2" Nov 22 08:35:19 crc kubenswrapper[4743]: I1122 08:35:19.143548 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hhpxp"] Nov 22 08:35:27 crc kubenswrapper[4743]: I1122 08:35:27.921769 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6b89b748d8-bj7bk" Nov 22 08:35:31 crc kubenswrapper[4743]: I1122 08:35:31.241617 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:35:31 crc kubenswrapper[4743]: I1122 08:35:31.242132 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.388350 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5"] Nov 22 08:35:40 crc kubenswrapper[4743]: E1122 08:35:40.389156 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerName="registry-server" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.389170 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerName="registry-server" Nov 22 08:35:40 crc kubenswrapper[4743]: E1122 08:35:40.389192 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerName="extract-content" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.389200 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerName="extract-content" Nov 22 08:35:40 crc kubenswrapper[4743]: E1122 08:35:40.389219 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerName="extract-utilities" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.389227 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerName="extract-utilities" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.389337 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4bbb12-c772-454d-a0cb-ad296318a20a" containerName="registry-server" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.390154 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.394253 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.399540 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5"] Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.474727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.474832 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4d7c\" (UniqueName: \"kubernetes.io/projected/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-kube-api-access-n4d7c\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.475368 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.576050 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4d7c\" (UniqueName: \"kubernetes.io/projected/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-kube-api-access-n4d7c\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.576141 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.576178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.576669 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-bundle\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.576667 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-util\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.600236 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4d7c\" (UniqueName: \"kubernetes.io/projected/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-kube-api-access-n4d7c\") pod \"e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:40 crc kubenswrapper[4743]: I1122 08:35:40.707090 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:41 crc kubenswrapper[4743]: I1122 08:35:41.100521 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5"] Nov 22 08:35:41 crc kubenswrapper[4743]: I1122 08:35:41.207997 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" event={"ID":"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb","Type":"ContainerStarted","Data":"22561cf07721c9f8adc9a0d7cd5e4f6b830eaa6cd7c8fa4a70d1e86686ba2441"} Nov 22 08:35:42 crc kubenswrapper[4743]: I1122 08:35:42.213040 4743 generic.go:334] "Generic (PLEG): container finished" podID="866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" containerID="17b9409b4c2f096fa7eda48b0bdd38bcfe89920711213a1e40128def4fbe0ee2" exitCode=0 Nov 22 08:35:42 crc kubenswrapper[4743]: I1122 08:35:42.213079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" event={"ID":"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb","Type":"ContainerDied","Data":"17b9409b4c2f096fa7eda48b0bdd38bcfe89920711213a1e40128def4fbe0ee2"} Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.187523 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hhpxp" podUID="bead015e-e8e8-44f2-8dae-41047cd66706" containerName="console" containerID="cri-o://3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7" gracePeriod=15 Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.233613 4743 generic.go:334] "Generic (PLEG): container finished" podID="866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" containerID="0174bcd79c473ae65768ed1534d5fe05adb1151257be12761f5661affc3404d3" exitCode=0 Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.233738 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" 
event={"ID":"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb","Type":"ContainerDied","Data":"0174bcd79c473ae65768ed1534d5fe05adb1151257be12761f5661affc3404d3"} Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.317208 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-54fvd"] Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.321388 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.328902 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54fvd"] Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.441366 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-catalog-content\") pod \"community-operators-54fvd\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.441440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q659d\" (UniqueName: \"kubernetes.io/projected/7d0ebb2c-716a-432d-b027-7cbf3d936a74-kube-api-access-q659d\") pod \"community-operators-54fvd\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.441528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-utilities\") pod \"community-operators-54fvd\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.542540 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-utilities\") pod \"community-operators-54fvd\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.542602 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-catalog-content\") pod \"community-operators-54fvd\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.542630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q659d\" (UniqueName: \"kubernetes.io/projected/7d0ebb2c-716a-432d-b027-7cbf3d936a74-kube-api-access-q659d\") pod \"community-operators-54fvd\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.543118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-utilities\") pod \"community-operators-54fvd\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 
08:35:44.550071 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-catalog-content\") pod \"community-operators-54fvd\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.569383 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q659d\" (UniqueName: \"kubernetes.io/projected/7d0ebb2c-716a-432d-b027-7cbf3d936a74-kube-api-access-q659d\") pod \"community-operators-54fvd\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.626590 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hhpxp_bead015e-e8e8-44f2-8dae-41047cd66706/console/0.log" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.626668 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.693827 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.745153 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-oauth-serving-cert\") pod \"bead015e-e8e8-44f2-8dae-41047cd66706\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.745213 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-oauth-config\") pod \"bead015e-e8e8-44f2-8dae-41047cd66706\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.745248 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-console-config\") pod \"bead015e-e8e8-44f2-8dae-41047cd66706\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.745265 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-service-ca\") pod \"bead015e-e8e8-44f2-8dae-41047cd66706\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.745285 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-serving-cert\") pod \"bead015e-e8e8-44f2-8dae-41047cd66706\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.745311 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4962z\" (UniqueName: \"kubernetes.io/projected/bead015e-e8e8-44f2-8dae-41047cd66706-kube-api-access-4962z\") pod \"bead015e-e8e8-44f2-8dae-41047cd66706\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 
08:35:44.745332 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-trusted-ca-bundle\") pod \"bead015e-e8e8-44f2-8dae-41047cd66706\" (UID: \"bead015e-e8e8-44f2-8dae-41047cd66706\") " Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.746001 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bead015e-e8e8-44f2-8dae-41047cd66706" (UID: "bead015e-e8e8-44f2-8dae-41047cd66706"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.746009 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-service-ca" (OuterVolumeSpecName: "service-ca") pod "bead015e-e8e8-44f2-8dae-41047cd66706" (UID: "bead015e-e8e8-44f2-8dae-41047cd66706"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.746115 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-console-config" (OuterVolumeSpecName: "console-config") pod "bead015e-e8e8-44f2-8dae-41047cd66706" (UID: "bead015e-e8e8-44f2-8dae-41047cd66706"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.746125 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bead015e-e8e8-44f2-8dae-41047cd66706" (UID: "bead015e-e8e8-44f2-8dae-41047cd66706"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.755268 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bead015e-e8e8-44f2-8dae-41047cd66706" (UID: "bead015e-e8e8-44f2-8dae-41047cd66706"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.757065 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bead015e-e8e8-44f2-8dae-41047cd66706-kube-api-access-4962z" (OuterVolumeSpecName: "kube-api-access-4962z") pod "bead015e-e8e8-44f2-8dae-41047cd66706" (UID: "bead015e-e8e8-44f2-8dae-41047cd66706"). InnerVolumeSpecName "kube-api-access-4962z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.757176 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bead015e-e8e8-44f2-8dae-41047cd66706" (UID: "bead015e-e8e8-44f2-8dae-41047cd66706"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.846334 4743 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.846363 4743 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-oauth-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.846373 4743 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-console-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.846382 4743 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-service-ca\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.846390 4743 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bead015e-e8e8-44f2-8dae-41047cd66706-console-serving-cert\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.846398 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4962z\" (UniqueName: \"kubernetes.io/projected/bead015e-e8e8-44f2-8dae-41047cd66706-kube-api-access-4962z\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:44 crc kubenswrapper[4743]: I1122 08:35:44.846409 4743 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bead015e-e8e8-44f2-8dae-41047cd66706-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.130451 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54fvd"] Nov 22 08:35:45 crc kubenswrapper[4743]: W1122 08:35:45.143774 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d0ebb2c_716a_432d_b027_7cbf3d936a74.slice/crio-dde84e5345800cfdc2262a68b3df907e4522101dcdcec762256716e784460cae WatchSource:0}: Error finding container dde84e5345800cfdc2262a68b3df907e4522101dcdcec762256716e784460cae: Status 404 returned error can't find the container with id dde84e5345800cfdc2262a68b3df907e4522101dcdcec762256716e784460cae Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.240863 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hhpxp_bead015e-e8e8-44f2-8dae-41047cd66706/console/0.log" Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.240925 4743 generic.go:334] "Generic (PLEG): container finished" podID="bead015e-e8e8-44f2-8dae-41047cd66706" containerID="3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7" exitCode=2 Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.240997 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hhpxp" event={"ID":"bead015e-e8e8-44f2-8dae-41047cd66706","Type":"ContainerDied","Data":"3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7"} Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.241030 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hhpxp" event={"ID":"bead015e-e8e8-44f2-8dae-41047cd66706","Type":"ContainerDied","Data":"1febf60a2946ceb6e5b87daa709f1e3e29f258386c8dd565b74601d96a835da7"} Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.241036 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hhpxp" Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.241063 4743 scope.go:117] "RemoveContainer" containerID="3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7" Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.242430 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54fvd" event={"ID":"7d0ebb2c-716a-432d-b027-7cbf3d936a74","Type":"ContainerStarted","Data":"dde84e5345800cfdc2262a68b3df907e4522101dcdcec762256716e784460cae"} Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.245201 4743 generic.go:334] "Generic (PLEG): container finished" podID="866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" containerID="14551a85ff0e83db96d6ea875fc49b25d9225bf070456e000dd86b17b245dd1d" exitCode=0 Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.245304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" event={"ID":"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb","Type":"ContainerDied","Data":"14551a85ff0e83db96d6ea875fc49b25d9225bf070456e000dd86b17b245dd1d"} Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.259891 4743 scope.go:117] "RemoveContainer" containerID="3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7" Nov 22 08:35:45 crc kubenswrapper[4743]: E1122 08:35:45.260392 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7\": container with ID starting with 3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7 not found: ID does not exist" containerID="3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7" Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.260434 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7"} err="failed to get container status \"3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7\": rpc error: code = NotFound desc = could not find container \"3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7\": container with ID starting with 3d01d507fe3547e9c50e8791ef1c92f0be0a3753ccfdcc314c768be54bb364b7 not found: ID does not exist" Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.276991 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hhpxp"] Nov 22 08:35:45 crc kubenswrapper[4743]: I1122 08:35:45.280126 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hhpxp"] Nov 22 08:35:46 crc kubenswrapper[4743]: I1122 08:35:46.251846 4743 generic.go:334] "Generic (PLEG): container finished" podID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerID="488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0" exitCode=0 Nov 22 08:35:46 crc kubenswrapper[4743]: I1122 08:35:46.251950 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-54fvd" event={"ID":"7d0ebb2c-716a-432d-b027-7cbf3d936a74","Type":"ContainerDied","Data":"488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0"} Nov 22 08:35:46 crc kubenswrapper[4743]: I1122 08:35:46.497948 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:46 crc kubenswrapper[4743]: I1122 08:35:46.569249 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4d7c\" (UniqueName: \"kubernetes.io/projected/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-kube-api-access-n4d7c\") pod \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " Nov 22 08:35:46 crc kubenswrapper[4743]: I1122 08:35:46.569374 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-util\") pod \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " Nov 22 08:35:46 crc kubenswrapper[4743]: I1122 08:35:46.569455 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-bundle\") pod \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\" (UID: \"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb\") " Nov 22 08:35:46 crc kubenswrapper[4743]: I1122 08:35:46.570513 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-bundle" (OuterVolumeSpecName: "bundle") pod "866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" (UID: "866ceb06-9d22-46bb-aa63-73c7f2f2e3fb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:35:46 crc kubenswrapper[4743]: I1122 08:35:46.574395 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-kube-api-access-n4d7c" (OuterVolumeSpecName: "kube-api-access-n4d7c") pod "866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" (UID: "866ceb06-9d22-46bb-aa63-73c7f2f2e3fb"). InnerVolumeSpecName "kube-api-access-n4d7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:35:46 crc kubenswrapper[4743]: I1122 08:35:46.671251 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:46 crc kubenswrapper[4743]: I1122 08:35:46.671565 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4d7c\" (UniqueName: \"kubernetes.io/projected/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-kube-api-access-n4d7c\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:47 crc kubenswrapper[4743]: I1122 08:35:47.160886 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bead015e-e8e8-44f2-8dae-41047cd66706" path="/var/lib/kubelet/pods/bead015e-e8e8-44f2-8dae-41047cd66706/volumes" Nov 22 08:35:47 crc kubenswrapper[4743]: I1122 08:35:47.263903 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" event={"ID":"866ceb06-9d22-46bb-aa63-73c7f2f2e3fb","Type":"ContainerDied","Data":"22561cf07721c9f8adc9a0d7cd5e4f6b830eaa6cd7c8fa4a70d1e86686ba2441"} Nov 22 08:35:47 crc kubenswrapper[4743]: I1122 08:35:47.263971 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22561cf07721c9f8adc9a0d7cd5e4f6b830eaa6cd7c8fa4a70d1e86686ba2441" Nov 22 08:35:47 crc kubenswrapper[4743]: I1122 08:35:47.263999 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5" Nov 22 08:35:47 crc kubenswrapper[4743]: I1122 08:35:47.658052 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-util" (OuterVolumeSpecName: "util") pod "866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" (UID: "866ceb06-9d22-46bb-aa63-73c7f2f2e3fb"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:35:47 crc kubenswrapper[4743]: I1122 08:35:47.685878 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/866ceb06-9d22-46bb-aa63-73c7f2f2e3fb-util\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:49 crc kubenswrapper[4743]: I1122 08:35:49.275723 4743 generic.go:334] "Generic (PLEG): container finished" podID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerID="5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54" exitCode=0 Nov 22 08:35:49 crc kubenswrapper[4743]: I1122 08:35:49.276074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54fvd" event={"ID":"7d0ebb2c-716a-432d-b027-7cbf3d936a74","Type":"ContainerDied","Data":"5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54"} Nov 22 08:35:50 crc kubenswrapper[4743]: I1122 08:35:50.286743 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54fvd" event={"ID":"7d0ebb2c-716a-432d-b027-7cbf3d936a74","Type":"ContainerStarted","Data":"4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de"} Nov 22 08:35:50 crc kubenswrapper[4743]: I1122 08:35:50.308276 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-54fvd" podStartSLOduration=2.77367162 podStartE2EDuration="6.308260349s" podCreationTimestamp="2025-11-22 08:35:44 +0000 UTC" firstStartedPulling="2025-11-22 08:35:46.253106053 +0000 UTC m=+819.959467105" lastFinishedPulling="2025-11-22 08:35:49.787694782 +0000 UTC m=+823.494055834" observedRunningTime="2025-11-22 08:35:50.306737655 +0000 UTC m=+824.013098717" watchObservedRunningTime="2025-11-22 08:35:50.308260349 +0000 UTC m=+824.014621401" Nov 22 08:35:54 crc kubenswrapper[4743]: I1122 08:35:54.694506 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:54 crc kubenswrapper[4743]: I1122 08:35:54.695392 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:54 crc kubenswrapper[4743]: I1122 08:35:54.754788 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:55 crc kubenswrapper[4743]: I1122 08:35:55.358099 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.674853 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j"] Nov 22 08:35:56 crc kubenswrapper[4743]: E1122 08:35:56.675106 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" containerName="util" Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.675118 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" containerName="util" Nov 22 08:35:56 crc kubenswrapper[4743]: E1122 08:35:56.675131 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" containerName="extract" Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.675137 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" containerName="extract" Nov 22 
08:35:56 crc kubenswrapper[4743]: E1122 08:35:56.675151 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" containerName="pull"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.675157 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" containerName="pull"
Nov 22 08:35:56 crc kubenswrapper[4743]: E1122 08:35:56.675164 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bead015e-e8e8-44f2-8dae-41047cd66706" containerName="console"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.675170 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bead015e-e8e8-44f2-8dae-41047cd66706" containerName="console"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.675286 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="866ceb06-9d22-46bb-aa63-73c7f2f2e3fb" containerName="extract"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.675310 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bead015e-e8e8-44f2-8dae-41047cd66706" containerName="console"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.675949 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.678424 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.678664 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.678714 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-h6qsh"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.678528 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.679006 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.692330 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j"]
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.797650 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8406cff2-721d-4c2b-90e4-343769c8ae38-apiservice-cert\") pod \"metallb-operator-controller-manager-66fbf9c95c-xnt8j\" (UID: \"8406cff2-721d-4c2b-90e4-343769c8ae38\") " pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.797695 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8406cff2-721d-4c2b-90e4-343769c8ae38-webhook-cert\") pod \"metallb-operator-controller-manager-66fbf9c95c-xnt8j\" (UID: \"8406cff2-721d-4c2b-90e4-343769c8ae38\") " pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j"
Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.797720 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-nlmld\" (UniqueName: \"kubernetes.io/projected/8406cff2-721d-4c2b-90e4-343769c8ae38-kube-api-access-nlmld\") pod \"metallb-operator-controller-manager-66fbf9c95c-xnt8j\" (UID: \"8406cff2-721d-4c2b-90e4-343769c8ae38\") " pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.898909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8406cff2-721d-4c2b-90e4-343769c8ae38-apiservice-cert\") pod \"metallb-operator-controller-manager-66fbf9c95c-xnt8j\" (UID: \"8406cff2-721d-4c2b-90e4-343769c8ae38\") " pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.898964 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8406cff2-721d-4c2b-90e4-343769c8ae38-webhook-cert\") pod \"metallb-operator-controller-manager-66fbf9c95c-xnt8j\" (UID: \"8406cff2-721d-4c2b-90e4-343769c8ae38\") " pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.898995 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlmld\" (UniqueName: \"kubernetes.io/projected/8406cff2-721d-4c2b-90e4-343769c8ae38-kube-api-access-nlmld\") pod \"metallb-operator-controller-manager-66fbf9c95c-xnt8j\" (UID: \"8406cff2-721d-4c2b-90e4-343769c8ae38\") " pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.905340 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8406cff2-721d-4c2b-90e4-343769c8ae38-webhook-cert\") pod \"metallb-operator-controller-manager-66fbf9c95c-xnt8j\" (UID: \"8406cff2-721d-4c2b-90e4-343769c8ae38\") " pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.913127 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8406cff2-721d-4c2b-90e4-343769c8ae38-apiservice-cert\") pod \"metallb-operator-controller-manager-66fbf9c95c-xnt8j\" (UID: \"8406cff2-721d-4c2b-90e4-343769c8ae38\") " pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.915950 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlmld\" (UniqueName: \"kubernetes.io/projected/8406cff2-721d-4c2b-90e4-343769c8ae38-kube-api-access-nlmld\") pod \"metallb-operator-controller-manager-66fbf9c95c-xnt8j\" (UID: \"8406cff2-721d-4c2b-90e4-343769c8ae38\") " pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" Nov 22 08:35:56 crc kubenswrapper[4743]: I1122 08:35:56.994279 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.110211 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54fvd"] Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.206339 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4"] Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.211264 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.213701 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.213937 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.214104 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-tg4v2" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.223567 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4"] Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.303420 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a874228-69b2-4dd8-b768-3b21fa2c45d8-webhook-cert\") pod \"metallb-operator-webhook-server-7ff5869b9c-b7qc4\" (UID: \"3a874228-69b2-4dd8-b768-3b21fa2c45d8\") " pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.303485 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75d8p\" (UniqueName: \"kubernetes.io/projected/3a874228-69b2-4dd8-b768-3b21fa2c45d8-kube-api-access-75d8p\") pod \"metallb-operator-webhook-server-7ff5869b9c-b7qc4\" (UID: \"3a874228-69b2-4dd8-b768-3b21fa2c45d8\") " pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.303506 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a874228-69b2-4dd8-b768-3b21fa2c45d8-apiservice-cert\") pod \"metallb-operator-webhook-server-7ff5869b9c-b7qc4\" (UID: \"3a874228-69b2-4dd8-b768-3b21fa2c45d8\") " pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.405008 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75d8p\" (UniqueName: \"kubernetes.io/projected/3a874228-69b2-4dd8-b768-3b21fa2c45d8-kube-api-access-75d8p\") pod \"metallb-operator-webhook-server-7ff5869b9c-b7qc4\" (UID: \"3a874228-69b2-4dd8-b768-3b21fa2c45d8\") " pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.405050 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a874228-69b2-4dd8-b768-3b21fa2c45d8-apiservice-cert\") pod \"metallb-operator-webhook-server-7ff5869b9c-b7qc4\" (UID: 
\"3a874228-69b2-4dd8-b768-3b21fa2c45d8\") " pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.405106 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a874228-69b2-4dd8-b768-3b21fa2c45d8-webhook-cert\") pod \"metallb-operator-webhook-server-7ff5869b9c-b7qc4\" (UID: \"3a874228-69b2-4dd8-b768-3b21fa2c45d8\") " pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.408846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3a874228-69b2-4dd8-b768-3b21fa2c45d8-apiservice-cert\") pod \"metallb-operator-webhook-server-7ff5869b9c-b7qc4\" (UID: \"3a874228-69b2-4dd8-b768-3b21fa2c45d8\") " pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.412247 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3a874228-69b2-4dd8-b768-3b21fa2c45d8-webhook-cert\") pod \"metallb-operator-webhook-server-7ff5869b9c-b7qc4\" (UID: \"3a874228-69b2-4dd8-b768-3b21fa2c45d8\") " pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.422356 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75d8p\" (UniqueName: \"kubernetes.io/projected/3a874228-69b2-4dd8-b768-3b21fa2c45d8-kube-api-access-75d8p\") pod \"metallb-operator-webhook-server-7ff5869b9c-b7qc4\" (UID: \"3a874228-69b2-4dd8-b768-3b21fa2c45d8\") " pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.506195 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j"] Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.543981 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" Nov 22 08:35:57 crc kubenswrapper[4743]: I1122 08:35:57.794674 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4"] Nov 22 08:35:57 crc kubenswrapper[4743]: W1122 08:35:57.805713 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a874228_69b2_4dd8_b768_3b21fa2c45d8.slice/crio-391b66a9b75cd585ea43ab8df3ff3ada5ce40a3ba50baa9e165567332a9a2604 WatchSource:0}: Error finding container 391b66a9b75cd585ea43ab8df3ff3ada5ce40a3ba50baa9e165567332a9a2604: Status 404 returned error can't find the container with id 391b66a9b75cd585ea43ab8df3ff3ada5ce40a3ba50baa9e165567332a9a2604 Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.324772 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" event={"ID":"3a874228-69b2-4dd8-b768-3b21fa2c45d8","Type":"ContainerStarted","Data":"391b66a9b75cd585ea43ab8df3ff3ada5ce40a3ba50baa9e165567332a9a2604"} Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.325816 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" event={"ID":"8406cff2-721d-4c2b-90e4-343769c8ae38","Type":"ContainerStarted","Data":"1837bad136f045cdca10026ff74189f2e76a9db96253e8ce3f4d3068f56db97a"} Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.325977 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-54fvd" podUID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerName="registry-server" containerID="cri-o://4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de" gracePeriod=2 Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.743353 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.821012 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q659d\" (UniqueName: \"kubernetes.io/projected/7d0ebb2c-716a-432d-b027-7cbf3d936a74-kube-api-access-q659d\") pod \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.821128 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-catalog-content\") pod \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.821167 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-utilities\") pod \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\" (UID: \"7d0ebb2c-716a-432d-b027-7cbf3d936a74\") " Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.822215 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-utilities" (OuterVolumeSpecName: "utilities") pod "7d0ebb2c-716a-432d-b027-7cbf3d936a74" (UID: "7d0ebb2c-716a-432d-b027-7cbf3d936a74"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.831181 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d0ebb2c-716a-432d-b027-7cbf3d936a74-kube-api-access-q659d" (OuterVolumeSpecName: "kube-api-access-q659d") pod "7d0ebb2c-716a-432d-b027-7cbf3d936a74" (UID: "7d0ebb2c-716a-432d-b027-7cbf3d936a74"). InnerVolumeSpecName "kube-api-access-q659d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.893383 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d0ebb2c-716a-432d-b027-7cbf3d936a74" (UID: "7d0ebb2c-716a-432d-b027-7cbf3d936a74"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.922565 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q659d\" (UniqueName: \"kubernetes.io/projected/7d0ebb2c-716a-432d-b027-7cbf3d936a74-kube-api-access-q659d\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.922610 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:58 crc kubenswrapper[4743]: I1122 08:35:58.922621 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d0ebb2c-716a-432d-b027-7cbf3d936a74-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.337886 4743 generic.go:334] "Generic (PLEG): container finished" podID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerID="4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de" exitCode=0 Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.337965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54fvd" event={"ID":"7d0ebb2c-716a-432d-b027-7cbf3d936a74","Type":"ContainerDied","Data":"4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de"} Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.338068 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54fvd" event={"ID":"7d0ebb2c-716a-432d-b027-7cbf3d936a74","Type":"ContainerDied","Data":"dde84e5345800cfdc2262a68b3df907e4522101dcdcec762256716e784460cae"} Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.338037 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-54fvd" Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.338099 4743 scope.go:117] "RemoveContainer" containerID="4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de" Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.360370 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54fvd"] Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.364398 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-54fvd"] Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.370433 4743 scope.go:117] "RemoveContainer" containerID="5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54" Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.407815 4743 scope.go:117] "RemoveContainer" containerID="488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0" Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.435441 4743 scope.go:117] "RemoveContainer" containerID="4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de" Nov 22 08:35:59 crc kubenswrapper[4743]: E1122 08:35:59.435901 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de\": container with ID starting with 4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de not found: ID does not exist" containerID="4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de" Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.435933 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de"} err="failed to get container status \"4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de\": rpc error: code = NotFound desc = could not find container \"4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de\": container with ID starting with 4d48591e4a347c8dd5b3df7ee1e7e3c86ad73055d8d9da2d4688d5ef2c2c25de not found: ID does not exist" Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.435954 4743 scope.go:117] "RemoveContainer" containerID="5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54" Nov 22 08:35:59 crc kubenswrapper[4743]: E1122 08:35:59.436414 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54\": container with ID starting with 5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54 not found: ID does not exist" containerID="5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54" Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.436435 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54"} err="failed to get container status \"5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54\": rpc error: code = NotFound desc = could not find container \"5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54\": container with ID starting with 5dc7748556b5045192ec3cbc8bba4e9de1a4f7d5c9ac99c4228aa64d52819a54 not found: ID does not exist" Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.436447 4743 scope.go:117] "RemoveContainer" 
containerID="488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0" Nov 22 08:35:59 crc kubenswrapper[4743]: E1122 08:35:59.436808 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0\": container with ID starting with 488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0 not found: ID does not exist" containerID="488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0" Nov 22 08:35:59 crc kubenswrapper[4743]: I1122 08:35:59.436831 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0"} err="failed to get container status \"488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0\": rpc error: code = NotFound desc = could not find container \"488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0\": container with ID starting with 488f81de61ee7d327135e9c924bce690884b54544e30e7a1ebbe777876bee4b0 not found: ID does not exist" Nov 22 08:36:01 crc kubenswrapper[4743]: I1122 08:36:01.164172 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" path="/var/lib/kubelet/pods/7d0ebb2c-716a-432d-b027-7cbf3d936a74/volumes" Nov 22 08:36:01 crc kubenswrapper[4743]: I1122 08:36:01.241492 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:36:01 crc kubenswrapper[4743]: I1122 08:36:01.241545 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:36:01 crc kubenswrapper[4743]: I1122 08:36:01.241611 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:36:01 crc kubenswrapper[4743]: I1122 08:36:01.242045 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37c97b2e81f6751d68ee6b6779e8d74c99c6cc572fe2c42aebcabd8215411f9d"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 08:36:01 crc kubenswrapper[4743]: I1122 08:36:01.242093 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://37c97b2e81f6751d68ee6b6779e8d74c99c6cc572fe2c42aebcabd8215411f9d" gracePeriod=600 Nov 22 08:36:01 crc kubenswrapper[4743]: I1122 08:36:01.359858 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" event={"ID":"8406cff2-721d-4c2b-90e4-343769c8ae38","Type":"ContainerStarted","Data":"9ab45d64f450d822ae8d348c7d4256eca4bc3ce7528a745e14ddc764e4073641"} Nov 22 08:36:01 crc kubenswrapper[4743]: 
I1122 08:36:01.359999 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j"
Nov 22 08:36:02 crc kubenswrapper[4743]: I1122 08:36:02.367324 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="37c97b2e81f6751d68ee6b6779e8d74c99c6cc572fe2c42aebcabd8215411f9d" exitCode=0
Nov 22 08:36:02 crc kubenswrapper[4743]: I1122 08:36:02.367390 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"37c97b2e81f6751d68ee6b6779e8d74c99c6cc572fe2c42aebcabd8215411f9d"}
Nov 22 08:36:02 crc kubenswrapper[4743]: I1122 08:36:02.367456 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"86f8f3accbf0662fa413321f99b5f1afa28e63a1e90c6983235baff64b7561bc"}
Nov 22 08:36:02 crc kubenswrapper[4743]: I1122 08:36:02.367477 4743 scope.go:117] "RemoveContainer" containerID="100169dfb49bd3feeeb68539e2f7fbcfba3bc0cdede84e267e0c32d1e1bb126a"
Nov 22 08:36:02 crc kubenswrapper[4743]: I1122 08:36:02.389512 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" podStartSLOduration=3.507318364 podStartE2EDuration="6.389489781s" podCreationTimestamp="2025-11-22 08:35:56 +0000 UTC" firstStartedPulling="2025-11-22 08:35:57.523126111 +0000 UTC m=+831.229487173" lastFinishedPulling="2025-11-22 08:36:00.405297538 +0000 UTC m=+834.111658590" observedRunningTime="2025-11-22 08:36:01.390525875 +0000 UTC m=+835.096886927" watchObservedRunningTime="2025-11-22 08:36:02.389489781 +0000 UTC m=+836.095850833"
Nov 22 08:36:03 crc kubenswrapper[4743]: I1122 08:36:03.380082 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" event={"ID":"3a874228-69b2-4dd8-b768-3b21fa2c45d8","Type":"ContainerStarted","Data":"5e8f3a40bb59de1d8f3e17fa0c151d3e877b94f13f841e4be27f5a1466cbfa6f"}
Nov 22 08:36:03 crc kubenswrapper[4743]: I1122 08:36:03.380488 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4"
Nov 22 08:36:17 crc kubenswrapper[4743]: I1122 08:36:17.549708 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4"
Nov 22 08:36:17 crc kubenswrapper[4743]: I1122 08:36:17.572263 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7ff5869b9c-b7qc4" podStartSLOduration=16.131655299 podStartE2EDuration="20.572247645s" podCreationTimestamp="2025-11-22 08:35:57 +0000 UTC" firstStartedPulling="2025-11-22 08:35:57.808464103 +0000 UTC m=+831.514825155" lastFinishedPulling="2025-11-22 08:36:02.249056449 +0000 UTC m=+835.955417501" observedRunningTime="2025-11-22 08:36:03.414281634 +0000 UTC m=+837.120642726" watchObservedRunningTime="2025-11-22 08:36:17.572247645 +0000 UTC m=+851.278608697"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.138697 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2l48n"]
Nov 22 08:36:25 crc kubenswrapper[4743]: E1122 08:36:25.139489
4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerName="extract-utilities"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.139507 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerName="extract-utilities"
Nov 22 08:36:25 crc kubenswrapper[4743]: E1122 08:36:25.139525 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerName="registry-server"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.139533 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerName="registry-server"
Nov 22 08:36:25 crc kubenswrapper[4743]: E1122 08:36:25.139553 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerName="extract-content"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.139561 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerName="extract-content"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.139693 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d0ebb2c-716a-432d-b027-7cbf3d936a74" containerName="registry-server"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.140547 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l48n"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.153993 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdvpg\" (UniqueName: \"kubernetes.io/projected/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-kube-api-access-sdvpg\") pod \"certified-operators-2l48n\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " pod="openshift-marketplace/certified-operators-2l48n"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.154073 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-catalog-content\") pod \"certified-operators-2l48n\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " pod="openshift-marketplace/certified-operators-2l48n"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.154108 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-utilities\") pod \"certified-operators-2l48n\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " pod="openshift-marketplace/certified-operators-2l48n"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.158236 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2l48n"]
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.255201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdvpg\" (UniqueName: \"kubernetes.io/projected/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-kube-api-access-sdvpg\") pod \"certified-operators-2l48n\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " pod="openshift-marketplace/certified-operators-2l48n"
Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.255635 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-catalog-content\") pod \"certified-operators-2l48n\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.255669 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-utilities\") pod \"certified-operators-2l48n\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.256203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-utilities\") pod \"certified-operators-2l48n\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.256163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-catalog-content\") pod \"certified-operators-2l48n\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.276094 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdvpg\" (UniqueName: \"kubernetes.io/projected/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-kube-api-access-sdvpg\") pod \"certified-operators-2l48n\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.496949 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:25 crc kubenswrapper[4743]: I1122 08:36:25.749420 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2l48n"] Nov 22 08:36:26 crc kubenswrapper[4743]: I1122 08:36:26.496450 4743 generic.go:334] "Generic (PLEG): container finished" podID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerID="704b53b40f6353d714994cf8e77d063e82c991e4f5ecea704b119b3eaf5af56e" exitCode=0 Nov 22 08:36:26 crc kubenswrapper[4743]: I1122 08:36:26.496793 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l48n" event={"ID":"2db074dd-6e0e-41f0-81b4-f59d2e4a9863","Type":"ContainerDied","Data":"704b53b40f6353d714994cf8e77d063e82c991e4f5ecea704b119b3eaf5af56e"} Nov 22 08:36:26 crc kubenswrapper[4743]: I1122 08:36:26.496830 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l48n" event={"ID":"2db074dd-6e0e-41f0-81b4-f59d2e4a9863","Type":"ContainerStarted","Data":"810508f94ad1b29a84d94fa03f6be15a30d43b87b65aa9b3399df6b273c4d459"} Nov 22 08:36:27 crc kubenswrapper[4743]: I1122 08:36:27.502202 4743 generic.go:334] "Generic (PLEG): container finished" podID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerID="9dfb3f7b4a804f702ec9f11f39f7349cb2c9c46f17e831969cbfba2232c484ed" exitCode=0 Nov 22 08:36:27 crc kubenswrapper[4743]: I1122 08:36:27.502262 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l48n" event={"ID":"2db074dd-6e0e-41f0-81b4-f59d2e4a9863","Type":"ContainerDied","Data":"9dfb3f7b4a804f702ec9f11f39f7349cb2c9c46f17e831969cbfba2232c484ed"} Nov 22 08:36:28 crc kubenswrapper[4743]: I1122 08:36:28.510636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l48n" event={"ID":"2db074dd-6e0e-41f0-81b4-f59d2e4a9863","Type":"ContainerStarted","Data":"737c1436f793d113512095f3d2cdde09be0826d51052d21c6647e77a15288a78"} Nov 22 08:36:28 crc kubenswrapper[4743]: I1122 08:36:28.529089 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2l48n" podStartSLOduration=2.123973095 podStartE2EDuration="3.529074168s" podCreationTimestamp="2025-11-22 08:36:25 +0000 UTC" firstStartedPulling="2025-11-22 08:36:26.499680312 +0000 UTC m=+860.206041364" lastFinishedPulling="2025-11-22 08:36:27.904781375 +0000 UTC m=+861.611142437" observedRunningTime="2025-11-22 08:36:28.524439164 +0000 UTC m=+862.230800236" watchObservedRunningTime="2025-11-22 08:36:28.529074168 +0000 UTC m=+862.235435220" Nov 22 08:36:35 crc kubenswrapper[4743]: I1122 08:36:35.497507 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:35 crc kubenswrapper[4743]: I1122 08:36:35.498132 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:35 crc kubenswrapper[4743]: I1122 08:36:35.535485 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:35 crc kubenswrapper[4743]: I1122 08:36:35.620810 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:35 crc kubenswrapper[4743]: I1122 08:36:35.759914 4743 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-2l48n"] Nov 22 08:36:36 crc kubenswrapper[4743]: I1122 08:36:36.998672 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-66fbf9c95c-xnt8j" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.585009 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2l48n" podUID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerName="registry-server" containerID="cri-o://737c1436f793d113512095f3d2cdde09be0826d51052d21c6647e77a15288a78" gracePeriod=2 Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.752167 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-g4q9l"] Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.754716 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.756965 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-lsrm2" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.757010 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.757166 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.759873 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn"] Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.760774 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.762503 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.762598 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn"] Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.836643 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-hkg92"] Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.837468 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-hkg92" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.839976 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.839993 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.840009 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.846325 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-4746v" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.890434 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6c7b4b5f48-m9fwk"] Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.891568 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.898053 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.902865 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-m9fwk"] Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.927858 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-frr-startup\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.928720 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-frr-sockets\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.928747 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-metrics\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.928876 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggzx9\" (UniqueName: \"kubernetes.io/projected/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-kube-api-access-ggzx9\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.928911 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j7dt\" (UniqueName: \"kubernetes.io/projected/8b6ebac3-81ab-499b-bfcf-89e3416072c2-kube-api-access-5j7dt\") pod \"frr-k8s-webhook-server-6998585d5-5l6gn\" (UID: \"8b6ebac3-81ab-499b-bfcf-89e3416072c2\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.930953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-metrics-certs\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.931093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-frr-conf\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 08:36:37.931122 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-reloader\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:37 crc kubenswrapper[4743]: I1122 
08:36:37.931242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b6ebac3-81ab-499b-bfcf-89e3416072c2-cert\") pod \"frr-k8s-webhook-server-6998585d5-5l6gn\" (UID: \"8b6ebac3-81ab-499b-bfcf-89e3416072c2\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn"
Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032570 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-metrics-certs\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92"
Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-frr-startup\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l"
Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032649 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-frr-sockets\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l"
Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-metrics\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l"
Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032694 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggzx9\" (UniqueName: \"kubernetes.io/projected/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-kube-api-access-ggzx9\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l"
Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032721 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j7dt\" (UniqueName: \"kubernetes.io/projected/8b6ebac3-81ab-499b-bfcf-89e3416072c2-kube-api-access-5j7dt\") pod \"frr-k8s-webhook-server-6998585d5-5l6gn\" (UID: \"8b6ebac3-81ab-499b-bfcf-89e3416072c2\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn"
Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032744 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92"
Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032760 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzrwk\" (UniqueName: \"kubernetes.io/projected/ec8608f7-e718-49d1-bdba-00dcdb9805b2-kube-api-access-tzrwk\") pod \"controller-6c7b4b5f48-m9fwk\" (UID: \"ec8608f7-e718-49d1-bdba-00dcdb9805b2\") " pod="metallb-system/controller-6c7b4b5f48-m9fwk"
Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032779 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-metrics-certs\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032801 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-frr-conf\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032817 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/683dee48-c9d8-42c2-a0d4-8776fcf48a01-metallb-excludel2\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032853 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-reloader\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032871 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec8608f7-e718-49d1-bdba-00dcdb9805b2-metrics-certs\") pod \"controller-6c7b4b5f48-m9fwk\" (UID: \"ec8608f7-e718-49d1-bdba-00dcdb9805b2\") " pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b6ebac3-81ab-499b-bfcf-89e3416072c2-cert\") pod \"frr-k8s-webhook-server-6998585d5-5l6gn\" (UID: \"8b6ebac3-81ab-499b-bfcf-89e3416072c2\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032915 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec8608f7-e718-49d1-bdba-00dcdb9805b2-cert\") pod \"controller-6c7b4b5f48-m9fwk\" (UID: \"ec8608f7-e718-49d1-bdba-00dcdb9805b2\") " pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.032937 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjx29\" (UniqueName: \"kubernetes.io/projected/683dee48-c9d8-42c2-a0d4-8776fcf48a01-kube-api-access-xjx29\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.033806 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-frr-startup\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.034040 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-frr-sockets\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " 
pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.034218 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-metrics\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: E1122 08:36:38.034603 4743 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Nov 22 08:36:38 crc kubenswrapper[4743]: E1122 08:36:38.034647 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-metrics-certs podName:c019d9ca-5ddf-4c98-b5f1-c425686a58d4 nodeName:}" failed. No retries permitted until 2025-11-22 08:36:38.534634015 +0000 UTC m=+872.240995067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-metrics-certs") pod "frr-k8s-g4q9l" (UID: "c019d9ca-5ddf-4c98-b5f1-c425686a58d4") : secret "frr-k8s-certs-secret" not found Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.034921 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-frr-conf\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.035091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-reloader\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.046961 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b6ebac3-81ab-499b-bfcf-89e3416072c2-cert\") pod \"frr-k8s-webhook-server-6998585d5-5l6gn\" (UID: \"8b6ebac3-81ab-499b-bfcf-89e3416072c2\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.050384 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j7dt\" (UniqueName: \"kubernetes.io/projected/8b6ebac3-81ab-499b-bfcf-89e3416072c2-kube-api-access-5j7dt\") pod \"frr-k8s-webhook-server-6998585d5-5l6gn\" (UID: \"8b6ebac3-81ab-499b-bfcf-89e3416072c2\") " pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.051822 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggzx9\" (UniqueName: \"kubernetes.io/projected/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-kube-api-access-ggzx9\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.079555 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.133710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.133767 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzrwk\" (UniqueName: \"kubernetes.io/projected/ec8608f7-e718-49d1-bdba-00dcdb9805b2-kube-api-access-tzrwk\") pod \"controller-6c7b4b5f48-m9fwk\" (UID: \"ec8608f7-e718-49d1-bdba-00dcdb9805b2\") " pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.133848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/683dee48-c9d8-42c2-a0d4-8776fcf48a01-metallb-excludel2\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.133882 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec8608f7-e718-49d1-bdba-00dcdb9805b2-metrics-certs\") pod \"controller-6c7b4b5f48-m9fwk\" (UID: \"ec8608f7-e718-49d1-bdba-00dcdb9805b2\") " pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.133923 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec8608f7-e718-49d1-bdba-00dcdb9805b2-cert\") pod \"controller-6c7b4b5f48-m9fwk\" (UID: \"ec8608f7-e718-49d1-bdba-00dcdb9805b2\") " pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:38 crc kubenswrapper[4743]: E1122 08:36:38.133925 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.133956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjx29\" (UniqueName: \"kubernetes.io/projected/683dee48-c9d8-42c2-a0d4-8776fcf48a01-kube-api-access-xjx29\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.133983 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-metrics-certs\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: E1122 08:36:38.134060 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist podName:683dee48-c9d8-42c2-a0d4-8776fcf48a01 nodeName:}" failed. No retries permitted until 2025-11-22 08:36:38.634038499 +0000 UTC m=+872.340399551 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist") pod "speaker-hkg92" (UID: "683dee48-c9d8-42c2-a0d4-8776fcf48a01") : secret "metallb-memberlist" not found Nov 22 08:36:38 crc kubenswrapper[4743]: E1122 08:36:38.134527 4743 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Nov 22 08:36:38 crc kubenswrapper[4743]: E1122 08:36:38.134591 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-metrics-certs podName:683dee48-c9d8-42c2-a0d4-8776fcf48a01 nodeName:}" failed. No retries permitted until 2025-11-22 08:36:38.634556204 +0000 UTC m=+872.340917256 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-metrics-certs") pod "speaker-hkg92" (UID: "683dee48-c9d8-42c2-a0d4-8776fcf48a01") : secret "speaker-certs-secret" not found Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.135226 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/683dee48-c9d8-42c2-a0d4-8776fcf48a01-metallb-excludel2\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.137668 4743 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.152352 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec8608f7-e718-49d1-bdba-00dcdb9805b2-cert\") pod \"controller-6c7b4b5f48-m9fwk\" (UID: \"ec8608f7-e718-49d1-bdba-00dcdb9805b2\") " pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.153132 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec8608f7-e718-49d1-bdba-00dcdb9805b2-metrics-certs\") pod \"controller-6c7b4b5f48-m9fwk\" (UID: \"ec8608f7-e718-49d1-bdba-00dcdb9805b2\") " pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.153433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjx29\" (UniqueName: \"kubernetes.io/projected/683dee48-c9d8-42c2-a0d4-8776fcf48a01-kube-api-access-xjx29\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.153554 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzrwk\" (UniqueName: \"kubernetes.io/projected/ec8608f7-e718-49d1-bdba-00dcdb9805b2-kube-api-access-tzrwk\") pod \"controller-6c7b4b5f48-m9fwk\" (UID: \"ec8608f7-e718-49d1-bdba-00dcdb9805b2\") " pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.168883 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hhlvt"] Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.170001 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.181262 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhlvt"] Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.227868 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.354357 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-catalog-content\") pod \"redhat-marketplace-hhlvt\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.354424 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-utilities\") pod \"redhat-marketplace-hhlvt\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.354469 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv6bt\" (UniqueName: \"kubernetes.io/projected/771cd119-8b9e-43a8-8fa5-1e91644f2436-kube-api-access-pv6bt\") pod \"redhat-marketplace-hhlvt\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.455432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-utilities\") pod \"redhat-marketplace-hhlvt\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.455492 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv6bt\" (UniqueName: \"kubernetes.io/projected/771cd119-8b9e-43a8-8fa5-1e91644f2436-kube-api-access-pv6bt\") pod \"redhat-marketplace-hhlvt\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.455612 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-catalog-content\") pod \"redhat-marketplace-hhlvt\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.455946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-utilities\") pod \"redhat-marketplace-hhlvt\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.456022 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-catalog-content\") pod \"redhat-marketplace-hhlvt\" (UID: 
\"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.474443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv6bt\" (UniqueName: \"kubernetes.io/projected/771cd119-8b9e-43a8-8fa5-1e91644f2436-kube-api-access-pv6bt\") pod \"redhat-marketplace-hhlvt\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.557062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-metrics-certs\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.559960 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c019d9ca-5ddf-4c98-b5f1-c425686a58d4-metrics-certs\") pod \"frr-k8s-g4q9l\" (UID: \"c019d9ca-5ddf-4c98-b5f1-c425686a58d4\") " pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.586105 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.594330 4743 generic.go:334] "Generic (PLEG): container finished" podID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerID="737c1436f793d113512095f3d2cdde09be0826d51052d21c6647e77a15288a78" exitCode=0 Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.594369 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l48n" event={"ID":"2db074dd-6e0e-41f0-81b4-f59d2e4a9863","Type":"ContainerDied","Data":"737c1436f793d113512095f3d2cdde09be0826d51052d21c6647e77a15288a78"} Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.636660 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn"] Nov 22 08:36:38 crc kubenswrapper[4743]: W1122 08:36:38.642980 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b6ebac3_81ab_499b_bfcf_89e3416072c2.slice/crio-90a6f57dc7f2e7fb9b9d990b70c7c97d008fbb6e1b8c7cff912ba231cf694aba WatchSource:0}: Error finding container 90a6f57dc7f2e7fb9b9d990b70c7c97d008fbb6e1b8c7cff912ba231cf694aba: Status 404 returned error can't find the container with id 90a6f57dc7f2e7fb9b9d990b70c7c97d008fbb6e1b8c7cff912ba231cf694aba Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.658614 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.659036 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-metrics-certs\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: E1122 08:36:38.659624 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret 
"metallb-memberlist" not found Nov 22 08:36:38 crc kubenswrapper[4743]: E1122 08:36:38.659702 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist podName:683dee48-c9d8-42c2-a0d4-8776fcf48a01 nodeName:}" failed. No retries permitted until 2025-11-22 08:36:39.65968234 +0000 UTC m=+873.366043452 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist") pod "speaker-hkg92" (UID: "683dee48-c9d8-42c2-a0d4-8776fcf48a01") : secret "metallb-memberlist" not found Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.662514 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-metrics-certs\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.671846 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.709770 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6c7b4b5f48-m9fwk"] Nov 22 08:36:38 crc kubenswrapper[4743]: I1122 08:36:38.797256 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhlvt"] Nov 22 08:36:39 crc kubenswrapper[4743]: I1122 08:36:39.609966 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-m9fwk" event={"ID":"ec8608f7-e718-49d1-bdba-00dcdb9805b2","Type":"ContainerStarted","Data":"2c4ace5c71eecf3d9b876aebea047cb28af782f2c04cbb46f5a8a7504ec31d43"} Nov 22 08:36:39 crc kubenswrapper[4743]: I1122 08:36:39.611157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhlvt" event={"ID":"771cd119-8b9e-43a8-8fa5-1e91644f2436","Type":"ContainerStarted","Data":"c6aaf99250afc63f727e0408ed35e5fa266a6fe9ee68166d47d1d9d99cdf2230"} Nov 22 08:36:39 crc kubenswrapper[4743]: I1122 08:36:39.612283 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" event={"ID":"8b6ebac3-81ab-499b-bfcf-89e3416072c2","Type":"ContainerStarted","Data":"90a6f57dc7f2e7fb9b9d990b70c7c97d008fbb6e1b8c7cff912ba231cf694aba"} Nov 22 08:36:39 crc kubenswrapper[4743]: I1122 08:36:39.672785 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:39 crc kubenswrapper[4743]: E1122 08:36:39.672970 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 22 08:36:39 crc kubenswrapper[4743]: E1122 08:36:39.673042 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist podName:683dee48-c9d8-42c2-a0d4-8776fcf48a01 nodeName:}" failed. No retries permitted until 2025-11-22 08:36:41.673017843 +0000 UTC m=+875.379378905 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist") pod "speaker-hkg92" (UID: "683dee48-c9d8-42c2-a0d4-8776fcf48a01") : secret "metallb-memberlist" not found Nov 22 08:36:41 crc kubenswrapper[4743]: I1122 08:36:41.701612 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:41 crc kubenswrapper[4743]: E1122 08:36:41.702057 4743 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Nov 22 08:36:41 crc kubenswrapper[4743]: E1122 08:36:41.702103 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist podName:683dee48-c9d8-42c2-a0d4-8776fcf48a01 nodeName:}" failed. No retries permitted until 2025-11-22 08:36:45.702090249 +0000 UTC m=+879.408451301 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist") pod "speaker-hkg92" (UID: "683dee48-c9d8-42c2-a0d4-8776fcf48a01") : secret "metallb-memberlist" not found Nov 22 08:36:42 crc kubenswrapper[4743]: I1122 08:36:42.628289 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4q9l" event={"ID":"c019d9ca-5ddf-4c98-b5f1-c425686a58d4","Type":"ContainerStarted","Data":"daf0262fc76d3fc9ff8402cafcf4127bc7d43e6fca3ee58dd594d1c2fed294ba"} Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.103439 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.128318 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-utilities\") pod \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.128353 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-catalog-content\") pod \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.128414 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdvpg\" (UniqueName: \"kubernetes.io/projected/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-kube-api-access-sdvpg\") pod \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\" (UID: \"2db074dd-6e0e-41f0-81b4-f59d2e4a9863\") " Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.129303 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-utilities" (OuterVolumeSpecName: "utilities") pod "2db074dd-6e0e-41f0-81b4-f59d2e4a9863" (UID: "2db074dd-6e0e-41f0-81b4-f59d2e4a9863"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.135694 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-kube-api-access-sdvpg" (OuterVolumeSpecName: "kube-api-access-sdvpg") pod "2db074dd-6e0e-41f0-81b4-f59d2e4a9863" (UID: "2db074dd-6e0e-41f0-81b4-f59d2e4a9863"). InnerVolumeSpecName "kube-api-access-sdvpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.229809 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdvpg\" (UniqueName: \"kubernetes.io/projected/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-kube-api-access-sdvpg\") on node \"crc\" DevicePath \"\"" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.229861 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.490252 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2db074dd-6e0e-41f0-81b4-f59d2e4a9863" (UID: "2db074dd-6e0e-41f0-81b4-f59d2e4a9863"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.533716 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db074dd-6e0e-41f0-81b4-f59d2e4a9863-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.643087 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-m9fwk" event={"ID":"ec8608f7-e718-49d1-bdba-00dcdb9805b2","Type":"ContainerStarted","Data":"7617e48a7c3a71290c1a8d32f3ac68c41fd93fdaecfb5c6da47fc96599180d3e"} Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.645445 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2l48n" event={"ID":"2db074dd-6e0e-41f0-81b4-f59d2e4a9863","Type":"ContainerDied","Data":"810508f94ad1b29a84d94fa03f6be15a30d43b87b65aa9b3399df6b273c4d459"} Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.645463 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2l48n" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.645530 4743 scope.go:117] "RemoveContainer" containerID="737c1436f793d113512095f3d2cdde09be0826d51052d21c6647e77a15288a78" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.647321 4743 generic.go:334] "Generic (PLEG): container finished" podID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerID="54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895" exitCode=0 Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.647357 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhlvt" event={"ID":"771cd119-8b9e-43a8-8fa5-1e91644f2436","Type":"ContainerDied","Data":"54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895"} Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.660762 4743 scope.go:117] "RemoveContainer" containerID="9dfb3f7b4a804f702ec9f11f39f7349cb2c9c46f17e831969cbfba2232c484ed" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.676176 4743 scope.go:117] "RemoveContainer" containerID="704b53b40f6353d714994cf8e77d063e82c991e4f5ecea704b119b3eaf5af56e" Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.704215 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2l48n"] Nov 22 08:36:44 crc kubenswrapper[4743]: I1122 08:36:44.708465 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2l48n"] Nov 22 08:36:45 crc kubenswrapper[4743]: I1122 08:36:45.162958 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" path="/var/lib/kubelet/pods/2db074dd-6e0e-41f0-81b4-f59d2e4a9863/volumes" Nov 22 08:36:45 crc kubenswrapper[4743]: I1122 08:36:45.655259 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6c7b4b5f48-m9fwk" event={"ID":"ec8608f7-e718-49d1-bdba-00dcdb9805b2","Type":"ContainerStarted","Data":"a9c33b4cb37cd0f60d2a1ab95c6a5db1c8923cf4a308d241cf174bbb040465bd"} Nov 22 08:36:45 crc kubenswrapper[4743]: I1122 08:36:45.655874 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:45 crc kubenswrapper[4743]: I1122 08:36:45.662396 4743 generic.go:334] "Generic (PLEG): container finished" podID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerID="c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27" exitCode=0 Nov 22 08:36:45 crc kubenswrapper[4743]: I1122 08:36:45.662468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhlvt" event={"ID":"771cd119-8b9e-43a8-8fa5-1e91644f2436","Type":"ContainerDied","Data":"c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27"} Nov 22 08:36:45 crc kubenswrapper[4743]: I1122 08:36:45.682373 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6c7b4b5f48-m9fwk" podStartSLOduration=8.682355438 podStartE2EDuration="8.682355438s" podCreationTimestamp="2025-11-22 08:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:36:45.678317261 +0000 UTC m=+879.384678323" watchObservedRunningTime="2025-11-22 08:36:45.682355438 +0000 UTC m=+879.388716490" Nov 22 08:36:45 crc kubenswrapper[4743]: I1122 08:36:45.751132 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:45 crc kubenswrapper[4743]: I1122 08:36:45.757377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/683dee48-c9d8-42c2-a0d4-8776fcf48a01-memberlist\") pod \"speaker-hkg92\" (UID: \"683dee48-c9d8-42c2-a0d4-8776fcf48a01\") " pod="metallb-system/speaker-hkg92" Nov 22 08:36:45 crc kubenswrapper[4743]: I1122 08:36:45.951845 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-hkg92" Nov 22 08:36:46 crc kubenswrapper[4743]: W1122 08:36:46.015760 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod683dee48_c9d8_42c2_a0d4_8776fcf48a01.slice/crio-19cd56fccb545ff7e1b3ad2da7bcbe19efde413727e32eb760d2f10211df5dc3 WatchSource:0}: Error finding container 19cd56fccb545ff7e1b3ad2da7bcbe19efde413727e32eb760d2f10211df5dc3: Status 404 returned error can't find the container with id 19cd56fccb545ff7e1b3ad2da7bcbe19efde413727e32eb760d2f10211df5dc3 Nov 22 08:36:46 crc kubenswrapper[4743]: I1122 08:36:46.668315 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hkg92" event={"ID":"683dee48-c9d8-42c2-a0d4-8776fcf48a01","Type":"ContainerStarted","Data":"19cd56fccb545ff7e1b3ad2da7bcbe19efde413727e32eb760d2f10211df5dc3"} Nov 22 08:36:47 crc kubenswrapper[4743]: I1122 08:36:47.674907 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hkg92" event={"ID":"683dee48-c9d8-42c2-a0d4-8776fcf48a01","Type":"ContainerStarted","Data":"3baf01cbea185f53d5afdc811f9f37c68f625c62496382a05a2b79efa3661633"} Nov 22 08:36:49 crc kubenswrapper[4743]: I1122 08:36:49.714221 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhlvt" event={"ID":"771cd119-8b9e-43a8-8fa5-1e91644f2436","Type":"ContainerStarted","Data":"c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608"} Nov 22 08:36:49 crc kubenswrapper[4743]: I1122 08:36:49.715858 4743 generic.go:334] "Generic (PLEG): container finished" podID="c019d9ca-5ddf-4c98-b5f1-c425686a58d4" containerID="3eb34d81e10e6a8f941479fcb87f13c0872162a4f2b53ae39a04aa03ed7aaf33" exitCode=0 Nov 22 08:36:49 crc kubenswrapper[4743]: I1122 08:36:49.716217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4q9l" event={"ID":"c019d9ca-5ddf-4c98-b5f1-c425686a58d4","Type":"ContainerDied","Data":"3eb34d81e10e6a8f941479fcb87f13c0872162a4f2b53ae39a04aa03ed7aaf33"} Nov 22 08:36:49 crc kubenswrapper[4743]: I1122 08:36:49.725077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" event={"ID":"8b6ebac3-81ab-499b-bfcf-89e3416072c2","Type":"ContainerStarted","Data":"218fe09fbe0f5461efe64bf0b0a23a937426d613086525772e8262e89b7d94fd"} Nov 22 08:36:49 crc kubenswrapper[4743]: I1122 08:36:49.725709 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" Nov 22 08:36:49 crc kubenswrapper[4743]: I1122 08:36:49.731368 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-hkg92" 
event={"ID":"683dee48-c9d8-42c2-a0d4-8776fcf48a01","Type":"ContainerStarted","Data":"df5e03984f700dfea4437321af078a0e744a9e9d72cf4cae8b7fadf487abc74f"} Nov 22 08:36:49 crc kubenswrapper[4743]: I1122 08:36:49.731626 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-hkg92" Nov 22 08:36:49 crc kubenswrapper[4743]: I1122 08:36:49.742603 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hhlvt" podStartSLOduration=9.644154639 podStartE2EDuration="11.742586571s" podCreationTimestamp="2025-11-22 08:36:38 +0000 UTC" firstStartedPulling="2025-11-22 08:36:44.650321964 +0000 UTC m=+878.356683016" lastFinishedPulling="2025-11-22 08:36:46.748753896 +0000 UTC m=+880.455114948" observedRunningTime="2025-11-22 08:36:49.739413039 +0000 UTC m=+883.445774101" watchObservedRunningTime="2025-11-22 08:36:49.742586571 +0000 UTC m=+883.448947623" Nov 22 08:36:49 crc kubenswrapper[4743]: I1122 08:36:49.783404 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-hkg92" podStartSLOduration=12.78338005 podStartE2EDuration="12.78338005s" podCreationTimestamp="2025-11-22 08:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:36:49.780542388 +0000 UTC m=+883.486903460" watchObservedRunningTime="2025-11-22 08:36:49.78338005 +0000 UTC m=+883.489741102" Nov 22 08:36:49 crc kubenswrapper[4743]: I1122 08:36:49.802104 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" podStartSLOduration=2.189217313 podStartE2EDuration="12.802073391s" podCreationTimestamp="2025-11-22 08:36:37 +0000 UTC" firstStartedPulling="2025-11-22 08:36:38.64516722 +0000 UTC m=+872.351528272" lastFinishedPulling="2025-11-22 08:36:49.258023288 +0000 UTC m=+882.964384350" observedRunningTime="2025-11-22 08:36:49.800223877 +0000 UTC m=+883.506584959" watchObservedRunningTime="2025-11-22 08:36:49.802073391 +0000 UTC m=+883.508434443" Nov 22 08:36:50 crc kubenswrapper[4743]: I1122 08:36:50.741132 4743 generic.go:334] "Generic (PLEG): container finished" podID="c019d9ca-5ddf-4c98-b5f1-c425686a58d4" containerID="eb54ab97fb30e8c01b879e8eab15abc33c95e6abf8e57980a107cb8b0436206c" exitCode=0 Nov 22 08:36:50 crc kubenswrapper[4743]: I1122 08:36:50.741235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4q9l" event={"ID":"c019d9ca-5ddf-4c98-b5f1-c425686a58d4","Type":"ContainerDied","Data":"eb54ab97fb30e8c01b879e8eab15abc33c95e6abf8e57980a107cb8b0436206c"} Nov 22 08:36:51 crc kubenswrapper[4743]: I1122 08:36:51.751124 4743 generic.go:334] "Generic (PLEG): container finished" podID="c019d9ca-5ddf-4c98-b5f1-c425686a58d4" containerID="5bd274dc42a08a50b4a4e353d8ed9fd13f876b33375be97165e738f7c84450e9" exitCode=0 Nov 22 08:36:51 crc kubenswrapper[4743]: I1122 08:36:51.751241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4q9l" event={"ID":"c019d9ca-5ddf-4c98-b5f1-c425686a58d4","Type":"ContainerDied","Data":"5bd274dc42a08a50b4a4e353d8ed9fd13f876b33375be97165e738f7c84450e9"} Nov 22 08:36:52 crc kubenswrapper[4743]: I1122 08:36:52.765428 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4q9l" 
event={"ID":"c019d9ca-5ddf-4c98-b5f1-c425686a58d4","Type":"ContainerStarted","Data":"9d3b144b7a1bd2be9a84d28594e5ea40cbc951a27336d89ef2f6fb8985c64ef0"} Nov 22 08:36:52 crc kubenswrapper[4743]: I1122 08:36:52.766190 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4q9l" event={"ID":"c019d9ca-5ddf-4c98-b5f1-c425686a58d4","Type":"ContainerStarted","Data":"c73fa7401598babb56394103106907c5ef2f1057a5a04dc830646ad8869ad9b1"} Nov 22 08:36:52 crc kubenswrapper[4743]: I1122 08:36:52.766202 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4q9l" event={"ID":"c019d9ca-5ddf-4c98-b5f1-c425686a58d4","Type":"ContainerStarted","Data":"e325b1aec04fc2e17912f3eceb0039878a91c9eb781f24b2b514744181c4ea21"} Nov 22 08:36:52 crc kubenswrapper[4743]: I1122 08:36:52.766211 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4q9l" event={"ID":"c019d9ca-5ddf-4c98-b5f1-c425686a58d4","Type":"ContainerStarted","Data":"df9f35977ca6842a4f39f10e4b566d19e5fa408e775338e9cf9acc560f02bf04"} Nov 22 08:36:52 crc kubenswrapper[4743]: I1122 08:36:52.766220 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4q9l" event={"ID":"c019d9ca-5ddf-4c98-b5f1-c425686a58d4","Type":"ContainerStarted","Data":"d35053613409dcec165dcb912184b48279ff8a327c91ac2472cbe460346fed26"} Nov 22 08:36:53 crc kubenswrapper[4743]: I1122 08:36:53.774817 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g4q9l" event={"ID":"c019d9ca-5ddf-4c98-b5f1-c425686a58d4","Type":"ContainerStarted","Data":"f316621d5f87936d9e84806f55090c1c401afebce833945096d606f60efd3633"} Nov 22 08:36:53 crc kubenswrapper[4743]: I1122 08:36:53.774960 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:53 crc kubenswrapper[4743]: I1122 08:36:53.798794 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-g4q9l" podStartSLOduration=9.519511057 podStartE2EDuration="16.798775386s" podCreationTimestamp="2025-11-22 08:36:37 +0000 UTC" firstStartedPulling="2025-11-22 08:36:42.000163178 +0000 UTC m=+875.706524230" lastFinishedPulling="2025-11-22 08:36:49.279427507 +0000 UTC m=+882.985788559" observedRunningTime="2025-11-22 08:36:53.79510527 +0000 UTC m=+887.501466332" watchObservedRunningTime="2025-11-22 08:36:53.798775386 +0000 UTC m=+887.505136438" Nov 22 08:36:58 crc kubenswrapper[4743]: I1122 08:36:58.232140 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6c7b4b5f48-m9fwk" Nov 22 08:36:58 crc kubenswrapper[4743]: I1122 08:36:58.586376 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:58 crc kubenswrapper[4743]: I1122 08:36:58.586633 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:58 crc kubenswrapper[4743]: I1122 08:36:58.627642 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:58 crc kubenswrapper[4743]: I1122 08:36:58.672934 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:36:58 crc kubenswrapper[4743]: I1122 08:36:58.708756 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-g4q9l" 
Nov 22 08:36:58 crc kubenswrapper[4743]: I1122 08:36:58.848886 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:36:58 crc kubenswrapper[4743]: I1122 08:36:58.889089 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhlvt"] Nov 22 08:37:00 crc kubenswrapper[4743]: I1122 08:37:00.811969 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hhlvt" podUID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerName="registry-server" containerID="cri-o://c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608" gracePeriod=2 Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.179159 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.382008 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-utilities\") pod \"771cd119-8b9e-43a8-8fa5-1e91644f2436\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.382142 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-catalog-content\") pod \"771cd119-8b9e-43a8-8fa5-1e91644f2436\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.382196 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv6bt\" (UniqueName: \"kubernetes.io/projected/771cd119-8b9e-43a8-8fa5-1e91644f2436-kube-api-access-pv6bt\") pod \"771cd119-8b9e-43a8-8fa5-1e91644f2436\" (UID: \"771cd119-8b9e-43a8-8fa5-1e91644f2436\") " Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.382931 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-utilities" (OuterVolumeSpecName: "utilities") pod "771cd119-8b9e-43a8-8fa5-1e91644f2436" (UID: "771cd119-8b9e-43a8-8fa5-1e91644f2436"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.391494 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/771cd119-8b9e-43a8-8fa5-1e91644f2436-kube-api-access-pv6bt" (OuterVolumeSpecName: "kube-api-access-pv6bt") pod "771cd119-8b9e-43a8-8fa5-1e91644f2436" (UID: "771cd119-8b9e-43a8-8fa5-1e91644f2436"). InnerVolumeSpecName "kube-api-access-pv6bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.398397 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "771cd119-8b9e-43a8-8fa5-1e91644f2436" (UID: "771cd119-8b9e-43a8-8fa5-1e91644f2436"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.484322 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.484879 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv6bt\" (UniqueName: \"kubernetes.io/projected/771cd119-8b9e-43a8-8fa5-1e91644f2436-kube-api-access-pv6bt\") on node \"crc\" DevicePath \"\"" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.484898 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/771cd119-8b9e-43a8-8fa5-1e91644f2436-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.821101 4743 generic.go:334] "Generic (PLEG): container finished" podID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerID="c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608" exitCode=0 Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.821142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhlvt" event={"ID":"771cd119-8b9e-43a8-8fa5-1e91644f2436","Type":"ContainerDied","Data":"c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608"} Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.821149 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hhlvt" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.821167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hhlvt" event={"ID":"771cd119-8b9e-43a8-8fa5-1e91644f2436","Type":"ContainerDied","Data":"c6aaf99250afc63f727e0408ed35e5fa266a6fe9ee68166d47d1d9d99cdf2230"} Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.821183 4743 scope.go:117] "RemoveContainer" containerID="c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.844623 4743 scope.go:117] "RemoveContainer" containerID="c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.854164 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhlvt"] Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.857349 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hhlvt"] Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.878952 4743 scope.go:117] "RemoveContainer" containerID="54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.908783 4743 scope.go:117] "RemoveContainer" containerID="c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608" Nov 22 08:37:01 crc kubenswrapper[4743]: E1122 08:37:01.913713 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608\": container with ID starting with c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608 not found: ID does not exist" containerID="c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.913763 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608"} err="failed to get container status \"c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608\": rpc error: code = NotFound desc = could not find container \"c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608\": container with ID starting with c0bc634baa55870a6393dc53e7b6d54abd1660fef54848091c0e522f250cd608 not found: ID does not exist" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.913792 4743 scope.go:117] "RemoveContainer" containerID="c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27" Nov 22 08:37:01 crc kubenswrapper[4743]: E1122 08:37:01.914357 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27\": container with ID starting with c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27 not found: ID does not exist" containerID="c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.914453 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27"} err="failed to get container status \"c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27\": rpc error: code = NotFound desc = could not find container \"c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27\": container with ID starting with c22f7c931a6c8b3b7496bb73971043ef79b5f39d98a3b4f8d5c7921f14872a27 not found: ID does not exist" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.914525 4743 scope.go:117] "RemoveContainer" containerID="54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895" Nov 22 08:37:01 crc kubenswrapper[4743]: E1122 08:37:01.914827 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895\": container with ID starting with 54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895 not found: ID does not exist" containerID="54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895" Nov 22 08:37:01 crc kubenswrapper[4743]: I1122 08:37:01.914924 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895"} err="failed to get container status \"54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895\": rpc error: code = NotFound desc = could not find container \"54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895\": container with ID starting with 54c53c64e274cdcc6591ae822d5be05759f50ba0d2a32aab7427949e98c4c895 not found: ID does not exist" Nov 22 08:37:03 crc kubenswrapper[4743]: I1122 08:37:03.161266 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="771cd119-8b9e-43a8-8fa5-1e91644f2436" path="/var/lib/kubelet/pods/771cd119-8b9e-43a8-8fa5-1e91644f2436/volumes" Nov 22 08:37:05 crc kubenswrapper[4743]: I1122 08:37:05.957123 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-hkg92" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.439182 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4"] Nov 22 08:37:07 crc kubenswrapper[4743]: E1122 08:37:07.440693 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerName="extract-utilities" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.440803 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerName="extract-utilities" Nov 22 08:37:07 crc kubenswrapper[4743]: E1122 08:37:07.440877 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerName="extract-content" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.440935 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerName="extract-content" Nov 22 08:37:07 crc kubenswrapper[4743]: E1122 08:37:07.440984 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerName="registry-server" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.441041 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerName="registry-server" Nov 22 08:37:07 crc kubenswrapper[4743]: E1122 08:37:07.441559 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerName="extract-content" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.441656 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerName="extract-content" Nov 22 08:37:07 crc kubenswrapper[4743]: E1122 08:37:07.441716 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerName="registry-server" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.441768 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerName="registry-server" Nov 22 08:37:07 crc kubenswrapper[4743]: E1122 08:37:07.441837 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerName="extract-utilities" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.441923 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerName="extract-utilities" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.442152 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="771cd119-8b9e-43a8-8fa5-1e91644f2436" containerName="registry-server" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.442381 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db074dd-6e0e-41f0-81b4-f59d2e4a9863" containerName="registry-server" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.443402 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.446372 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.495621 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4"] Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.576385 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qc6g\" (UniqueName: \"kubernetes.io/projected/b67f1857-71d8-47f0-bee7-d03162f14ef0-kube-api-access-9qc6g\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.576450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.576488 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.677207 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qc6g\" (UniqueName: \"kubernetes.io/projected/b67f1857-71d8-47f0-bee7-d03162f14ef0-kube-api-access-9qc6g\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.677517 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.677684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.678049 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.678220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.694648 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qc6g\" (UniqueName: \"kubernetes.io/projected/b67f1857-71d8-47f0-bee7-d03162f14ef0-kube-api-access-9qc6g\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:07 crc kubenswrapper[4743]: I1122 08:37:07.758360 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:08 crc kubenswrapper[4743]: I1122 08:37:08.086338 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-6998585d5-5l6gn" Nov 22 08:37:08 crc kubenswrapper[4743]: I1122 08:37:08.183669 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4"] Nov 22 08:37:08 crc kubenswrapper[4743]: I1122 08:37:08.674591 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-g4q9l" Nov 22 08:37:08 crc kubenswrapper[4743]: I1122 08:37:08.866432 4743 generic.go:334] "Generic (PLEG): container finished" podID="b67f1857-71d8-47f0-bee7-d03162f14ef0" containerID="a4e5b490c9be0c60d685fa58abafa6552bd9b0b62e98188019b6bcc74f5608ae" exitCode=0 Nov 22 08:37:08 crc kubenswrapper[4743]: I1122 08:37:08.866495 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" event={"ID":"b67f1857-71d8-47f0-bee7-d03162f14ef0","Type":"ContainerDied","Data":"a4e5b490c9be0c60d685fa58abafa6552bd9b0b62e98188019b6bcc74f5608ae"} Nov 22 08:37:08 crc kubenswrapper[4743]: I1122 08:37:08.866863 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" event={"ID":"b67f1857-71d8-47f0-bee7-d03162f14ef0","Type":"ContainerStarted","Data":"00ab46bb8552d449cc095467897166331f4d62804007b05d2ef0c04fa32638fa"} Nov 22 08:37:12 crc kubenswrapper[4743]: I1122 08:37:12.891545 4743 generic.go:334] "Generic (PLEG): container finished" podID="b67f1857-71d8-47f0-bee7-d03162f14ef0" containerID="73a47c3010bb5e16d3d4aa150d0ef0751a974e04c99e55e28d724600d3ab42fd" exitCode=0 Nov 22 08:37:12 crc kubenswrapper[4743]: I1122 08:37:12.891636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" 
event={"ID":"b67f1857-71d8-47f0-bee7-d03162f14ef0","Type":"ContainerDied","Data":"73a47c3010bb5e16d3d4aa150d0ef0751a974e04c99e55e28d724600d3ab42fd"} Nov 22 08:37:13 crc kubenswrapper[4743]: I1122 08:37:13.899213 4743 generic.go:334] "Generic (PLEG): container finished" podID="b67f1857-71d8-47f0-bee7-d03162f14ef0" containerID="1f907c4558e802580dfc24527d611f0fefccd8a2cb710951d4b131e332eff37b" exitCode=0 Nov 22 08:37:13 crc kubenswrapper[4743]: I1122 08:37:13.899261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" event={"ID":"b67f1857-71d8-47f0-bee7-d03162f14ef0","Type":"ContainerDied","Data":"1f907c4558e802580dfc24527d611f0fefccd8a2cb710951d4b131e332eff37b"} Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.134518 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.202941 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-util\") pod \"b67f1857-71d8-47f0-bee7-d03162f14ef0\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.203039 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qc6g\" (UniqueName: \"kubernetes.io/projected/b67f1857-71d8-47f0-bee7-d03162f14ef0-kube-api-access-9qc6g\") pod \"b67f1857-71d8-47f0-bee7-d03162f14ef0\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.209160 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b67f1857-71d8-47f0-bee7-d03162f14ef0-kube-api-access-9qc6g" (OuterVolumeSpecName: "kube-api-access-9qc6g") pod "b67f1857-71d8-47f0-bee7-d03162f14ef0" (UID: "b67f1857-71d8-47f0-bee7-d03162f14ef0"). InnerVolumeSpecName "kube-api-access-9qc6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.215098 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-util" (OuterVolumeSpecName: "util") pod "b67f1857-71d8-47f0-bee7-d03162f14ef0" (UID: "b67f1857-71d8-47f0-bee7-d03162f14ef0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.304251 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-bundle\") pod \"b67f1857-71d8-47f0-bee7-d03162f14ef0\" (UID: \"b67f1857-71d8-47f0-bee7-d03162f14ef0\") " Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.304525 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-util\") on node \"crc\" DevicePath \"\"" Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.304539 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qc6g\" (UniqueName: \"kubernetes.io/projected/b67f1857-71d8-47f0-bee7-d03162f14ef0-kube-api-access-9qc6g\") on node \"crc\" DevicePath \"\"" Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.305936 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-bundle" (OuterVolumeSpecName: "bundle") pod "b67f1857-71d8-47f0-bee7-d03162f14ef0" (UID: "b67f1857-71d8-47f0-bee7-d03162f14ef0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.407361 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b67f1857-71d8-47f0-bee7-d03162f14ef0-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.912365 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" event={"ID":"b67f1857-71d8-47f0-bee7-d03162f14ef0","Type":"ContainerDied","Data":"00ab46bb8552d449cc095467897166331f4d62804007b05d2ef0c04fa32638fa"} Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.912420 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00ab46bb8552d449cc095467897166331f4d62804007b05d2ef0c04fa32638fa" Nov 22 08:37:15 crc kubenswrapper[4743]: I1122 08:37:15.912429 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.881660 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r"] Nov 22 08:37:17 crc kubenswrapper[4743]: E1122 08:37:17.882263 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67f1857-71d8-47f0-bee7-d03162f14ef0" containerName="util" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.882279 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67f1857-71d8-47f0-bee7-d03162f14ef0" containerName="util" Nov 22 08:37:17 crc kubenswrapper[4743]: E1122 08:37:17.882290 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67f1857-71d8-47f0-bee7-d03162f14ef0" containerName="pull" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.882296 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67f1857-71d8-47f0-bee7-d03162f14ef0" containerName="pull" Nov 22 08:37:17 crc kubenswrapper[4743]: E1122 08:37:17.882313 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b67f1857-71d8-47f0-bee7-d03162f14ef0" containerName="extract" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.882320 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b67f1857-71d8-47f0-bee7-d03162f14ef0" containerName="extract" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.882425 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b67f1857-71d8-47f0-bee7-d03162f14ef0" containerName="extract" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.882894 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.885317 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.885412 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.885499 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-kt5pm" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.896933 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r"] Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.935945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/76850d94-114f-4bc4-8e5d-b86e8080557d-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-7tm7r\" (UID: \"76850d94-114f-4bc4-8e5d-b86e8080557d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" Nov 22 08:37:17 crc kubenswrapper[4743]: I1122 08:37:17.936025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrgr\" (UniqueName: \"kubernetes.io/projected/76850d94-114f-4bc4-8e5d-b86e8080557d-kube-api-access-rvrgr\") pod \"cert-manager-operator-controller-manager-64cf6dff88-7tm7r\" (UID: \"76850d94-114f-4bc4-8e5d-b86e8080557d\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" Nov 22 08:37:18 crc kubenswrapper[4743]: I1122 08:37:18.036963 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrgr\" (UniqueName: \"kubernetes.io/projected/76850d94-114f-4bc4-8e5d-b86e8080557d-kube-api-access-rvrgr\") pod \"cert-manager-operator-controller-manager-64cf6dff88-7tm7r\" (UID: \"76850d94-114f-4bc4-8e5d-b86e8080557d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" Nov 22 08:37:18 crc kubenswrapper[4743]: I1122 08:37:18.037040 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/76850d94-114f-4bc4-8e5d-b86e8080557d-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-7tm7r\" (UID: \"76850d94-114f-4bc4-8e5d-b86e8080557d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" Nov 22 08:37:18 crc kubenswrapper[4743]: I1122 08:37:18.037470 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/76850d94-114f-4bc4-8e5d-b86e8080557d-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-7tm7r\" (UID: \"76850d94-114f-4bc4-8e5d-b86e8080557d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" Nov 22 08:37:18 crc kubenswrapper[4743]: I1122 08:37:18.055007 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrgr\" (UniqueName: \"kubernetes.io/projected/76850d94-114f-4bc4-8e5d-b86e8080557d-kube-api-access-rvrgr\") pod \"cert-manager-operator-controller-manager-64cf6dff88-7tm7r\" (UID: \"76850d94-114f-4bc4-8e5d-b86e8080557d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" Nov 22 08:37:18 crc kubenswrapper[4743]: I1122 08:37:18.204590 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" Nov 22 08:37:18 crc kubenswrapper[4743]: I1122 08:37:18.654448 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r"] Nov 22 08:37:18 crc kubenswrapper[4743]: I1122 08:37:18.932136 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" event={"ID":"76850d94-114f-4bc4-8e5d-b86e8080557d","Type":"ContainerStarted","Data":"9824c22e1b8972065621a5026ca8df9d579754a15fbf3655e38f79beedb52555"} Nov 22 08:37:25 crc kubenswrapper[4743]: I1122 08:37:25.976759 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" event={"ID":"76850d94-114f-4bc4-8e5d-b86e8080557d","Type":"ContainerStarted","Data":"ccec4fa3b8ec0111cf6150ff46431ce20ef42a9f9680cda9e11d4b6768e05f02"} Nov 22 08:37:25 crc kubenswrapper[4743]: I1122 08:37:25.997630 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-7tm7r" podStartSLOduration=2.215610338 podStartE2EDuration="8.997607087s" podCreationTimestamp="2025-11-22 08:37:17 +0000 UTC" firstStartedPulling="2025-11-22 08:37:18.667264391 +0000 UTC m=+912.373625443" lastFinishedPulling="2025-11-22 08:37:25.44926114 +0000 UTC m=+919.155622192" observedRunningTime="2025-11-22 08:37:25.996450233 +0000 UTC m=+919.702811285" watchObservedRunningTime="2025-11-22 08:37:25.997607087 +0000 UTC m=+919.703968169" Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.720259 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-drxhj"] Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.721435 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.725744 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vmbk9" Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.729480 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.731593 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-drxhj"] Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.761832 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.826082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shsmc\" (UniqueName: \"kubernetes.io/projected/3da5b450-d97d-45e1-9b46-91733e107f14-kube-api-access-shsmc\") pod \"cert-manager-webhook-f4fb5df64-drxhj\" (UID: \"3da5b450-d97d-45e1-9b46-91733e107f14\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.826149 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3da5b450-d97d-45e1-9b46-91733e107f14-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-drxhj\" (UID: \"3da5b450-d97d-45e1-9b46-91733e107f14\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.927395 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shsmc\" (UniqueName: \"kubernetes.io/projected/3da5b450-d97d-45e1-9b46-91733e107f14-kube-api-access-shsmc\") pod \"cert-manager-webhook-f4fb5df64-drxhj\" (UID: \"3da5b450-d97d-45e1-9b46-91733e107f14\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.927446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3da5b450-d97d-45e1-9b46-91733e107f14-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-drxhj\" (UID: \"3da5b450-d97d-45e1-9b46-91733e107f14\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.949668 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3da5b450-d97d-45e1-9b46-91733e107f14-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-drxhj\" (UID: \"3da5b450-d97d-45e1-9b46-91733e107f14\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" Nov 22 08:37:28 crc kubenswrapper[4743]: I1122 08:37:28.949738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shsmc\" (UniqueName: \"kubernetes.io/projected/3da5b450-d97d-45e1-9b46-91733e107f14-kube-api-access-shsmc\") pod \"cert-manager-webhook-f4fb5df64-drxhj\" (UID: \"3da5b450-d97d-45e1-9b46-91733e107f14\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" Nov 22 08:37:29 crc kubenswrapper[4743]: I1122 08:37:29.070382 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" Nov 22 08:37:29 crc kubenswrapper[4743]: I1122 08:37:29.505304 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-drxhj"] Nov 22 08:37:29 crc kubenswrapper[4743]: W1122 08:37:29.510776 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3da5b450_d97d_45e1_9b46_91733e107f14.slice/crio-7d2d94dfce9a9d0a778b577cb584ed179208631536d1131cb468fc3defca6cb3 WatchSource:0}: Error finding container 7d2d94dfce9a9d0a778b577cb584ed179208631536d1131cb468fc3defca6cb3: Status 404 returned error can't find the container with id 7d2d94dfce9a9d0a778b577cb584ed179208631536d1131cb468fc3defca6cb3 Nov 22 08:37:29 crc kubenswrapper[4743]: I1122 08:37:29.997281 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" event={"ID":"3da5b450-d97d-45e1-9b46-91733e107f14","Type":"ContainerStarted","Data":"7d2d94dfce9a9d0a778b577cb584ed179208631536d1131cb468fc3defca6cb3"} Nov 22 08:37:31 crc kubenswrapper[4743]: I1122 08:37:31.863669 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn"] Nov 22 08:37:31 crc kubenswrapper[4743]: I1122 08:37:31.864960 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" Nov 22 08:37:31 crc kubenswrapper[4743]: I1122 08:37:31.866431 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn"] Nov 22 08:37:31 crc kubenswrapper[4743]: I1122 08:37:31.867259 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-q67z6" Nov 22 08:37:31 crc kubenswrapper[4743]: I1122 08:37:31.973079 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh6kb\" (UniqueName: \"kubernetes.io/projected/dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0-kube-api-access-lh6kb\") pod \"cert-manager-cainjector-855d9ccff4-m7nsn\" (UID: \"dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" Nov 22 08:37:31 crc kubenswrapper[4743]: I1122 08:37:31.973199 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-m7nsn\" (UID: \"dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" Nov 22 08:37:32 crc kubenswrapper[4743]: I1122 08:37:32.076797 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh6kb\" (UniqueName: \"kubernetes.io/projected/dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0-kube-api-access-lh6kb\") pod \"cert-manager-cainjector-855d9ccff4-m7nsn\" (UID: \"dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" Nov 22 08:37:32 crc kubenswrapper[4743]: I1122 08:37:32.076857 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-m7nsn\" (UID: \"dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0\") " 
pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" Nov 22 08:37:32 crc kubenswrapper[4743]: I1122 08:37:32.101317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-m7nsn\" (UID: \"dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" Nov 22 08:37:32 crc kubenswrapper[4743]: I1122 08:37:32.113536 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh6kb\" (UniqueName: \"kubernetes.io/projected/dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0-kube-api-access-lh6kb\") pod \"cert-manager-cainjector-855d9ccff4-m7nsn\" (UID: \"dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" Nov 22 08:37:32 crc kubenswrapper[4743]: I1122 08:37:32.180809 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" Nov 22 08:37:32 crc kubenswrapper[4743]: I1122 08:37:32.641415 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn"] Nov 22 08:37:33 crc kubenswrapper[4743]: I1122 08:37:33.022849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" event={"ID":"dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0","Type":"ContainerStarted","Data":"25b22caeae0a0e2f22e63d9c757f7b4c943847d0b3cda5b16d3bf58e924f0437"} Nov 22 08:37:38 crc kubenswrapper[4743]: I1122 08:37:38.051446 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" event={"ID":"dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0","Type":"ContainerStarted","Data":"8429ab3404df5d418ab5b2173c049ec0d27d47bcaff803a47e88ee6abc739c4f"} Nov 22 08:37:38 crc kubenswrapper[4743]: I1122 08:37:38.053953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" event={"ID":"3da5b450-d97d-45e1-9b46-91733e107f14","Type":"ContainerStarted","Data":"284a9744468e357fc2222860b97d2a99c4d9ff6ef5fe83be8143af9ef46302a3"} Nov 22 08:37:38 crc kubenswrapper[4743]: I1122 08:37:38.054119 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" Nov 22 08:37:38 crc kubenswrapper[4743]: I1122 08:37:38.070202 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-m7nsn" podStartSLOduration=2.282171537 podStartE2EDuration="7.070184215s" podCreationTimestamp="2025-11-22 08:37:31 +0000 UTC" firstStartedPulling="2025-11-22 08:37:32.654685753 +0000 UTC m=+926.361046805" lastFinishedPulling="2025-11-22 08:37:37.442698431 +0000 UTC m=+931.149059483" observedRunningTime="2025-11-22 08:37:38.065671455 +0000 UTC m=+931.772032507" watchObservedRunningTime="2025-11-22 08:37:38.070184215 +0000 UTC m=+931.776545267" Nov 22 08:37:38 crc kubenswrapper[4743]: I1122 08:37:38.086056 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" podStartSLOduration=2.147589974 podStartE2EDuration="10.086040844s" podCreationTimestamp="2025-11-22 08:37:28 +0000 UTC" firstStartedPulling="2025-11-22 08:37:29.513290582 +0000 UTC m=+923.219651634" lastFinishedPulling="2025-11-22 08:37:37.451741452 +0000 UTC m=+931.158102504" 
observedRunningTime="2025-11-22 08:37:38.085481568 +0000 UTC m=+931.791842620" watchObservedRunningTime="2025-11-22 08:37:38.086040844 +0000 UTC m=+931.792401896" Nov 22 08:37:44 crc kubenswrapper[4743]: I1122 08:37:44.072470 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-drxhj" Nov 22 08:37:47 crc kubenswrapper[4743]: I1122 08:37:47.769382 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-lx25w"] Nov 22 08:37:47 crc kubenswrapper[4743]: I1122 08:37:47.770777 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-lx25w" Nov 22 08:37:47 crc kubenswrapper[4743]: I1122 08:37:47.774678 4743 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7p8hx" Nov 22 08:37:47 crc kubenswrapper[4743]: I1122 08:37:47.778994 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-lx25w"] Nov 22 08:37:47 crc kubenswrapper[4743]: I1122 08:37:47.786133 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520-bound-sa-token\") pod \"cert-manager-86cb77c54b-lx25w\" (UID: \"c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520\") " pod="cert-manager/cert-manager-86cb77c54b-lx25w" Nov 22 08:37:47 crc kubenswrapper[4743]: I1122 08:37:47.786209 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t2wk\" (UniqueName: \"kubernetes.io/projected/c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520-kube-api-access-4t2wk\") pod \"cert-manager-86cb77c54b-lx25w\" (UID: \"c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520\") " pod="cert-manager/cert-manager-86cb77c54b-lx25w" Nov 22 08:37:47 crc kubenswrapper[4743]: I1122 08:37:47.886862 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520-bound-sa-token\") pod \"cert-manager-86cb77c54b-lx25w\" (UID: \"c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520\") " pod="cert-manager/cert-manager-86cb77c54b-lx25w" Nov 22 08:37:47 crc kubenswrapper[4743]: I1122 08:37:47.886931 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t2wk\" (UniqueName: \"kubernetes.io/projected/c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520-kube-api-access-4t2wk\") pod \"cert-manager-86cb77c54b-lx25w\" (UID: \"c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520\") " pod="cert-manager/cert-manager-86cb77c54b-lx25w" Nov 22 08:37:47 crc kubenswrapper[4743]: I1122 08:37:47.909744 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520-bound-sa-token\") pod \"cert-manager-86cb77c54b-lx25w\" (UID: \"c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520\") " pod="cert-manager/cert-manager-86cb77c54b-lx25w" Nov 22 08:37:47 crc kubenswrapper[4743]: I1122 08:37:47.910271 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t2wk\" (UniqueName: \"kubernetes.io/projected/c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520-kube-api-access-4t2wk\") pod \"cert-manager-86cb77c54b-lx25w\" (UID: \"c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520\") " pod="cert-manager/cert-manager-86cb77c54b-lx25w" Nov 22 08:37:48 crc kubenswrapper[4743]: I1122 08:37:48.090170 4743 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-lx25w" Nov 22 08:37:48 crc kubenswrapper[4743]: I1122 08:37:48.349799 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-lx25w"] Nov 22 08:37:49 crc kubenswrapper[4743]: I1122 08:37:49.120668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-lx25w" event={"ID":"c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520","Type":"ContainerStarted","Data":"f11ec76c21c1435645865024c2347cb549dabcb062024378630946fe423d0ea3"} Nov 22 08:37:49 crc kubenswrapper[4743]: I1122 08:37:49.121022 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-lx25w" event={"ID":"c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520","Type":"ContainerStarted","Data":"21011509ced79d5957a44db5192e7424bd2c7c21720047f1634e04f6f7e97f67"} Nov 22 08:37:49 crc kubenswrapper[4743]: I1122 08:37:49.141680 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-lx25w" podStartSLOduration=2.141639914 podStartE2EDuration="2.141639914s" podCreationTimestamp="2025-11-22 08:37:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:37:49.134403535 +0000 UTC m=+942.840764607" watchObservedRunningTime="2025-11-22 08:37:49.141639914 +0000 UTC m=+942.848001016" Nov 22 08:37:57 crc kubenswrapper[4743]: I1122 08:37:57.764185 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dh596"] Nov 22 08:37:57 crc kubenswrapper[4743]: I1122 08:37:57.765374 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dh596" Nov 22 08:37:57 crc kubenswrapper[4743]: I1122 08:37:57.821550 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Nov 22 08:37:57 crc kubenswrapper[4743]: I1122 08:37:57.821838 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Nov 22 08:37:57 crc kubenswrapper[4743]: I1122 08:37:57.822118 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-hh5lr" Nov 22 08:37:57 crc kubenswrapper[4743]: I1122 08:37:57.823311 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7trf\" (UniqueName: \"kubernetes.io/projected/1387ab02-1b83-474a-88e1-9d086ced94a3-kube-api-access-k7trf\") pod \"openstack-operator-index-dh596\" (UID: \"1387ab02-1b83-474a-88e1-9d086ced94a3\") " pod="openstack-operators/openstack-operator-index-dh596" Nov 22 08:37:57 crc kubenswrapper[4743]: I1122 08:37:57.835371 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dh596"] Nov 22 08:37:57 crc kubenswrapper[4743]: I1122 08:37:57.924648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7trf\" (UniqueName: \"kubernetes.io/projected/1387ab02-1b83-474a-88e1-9d086ced94a3-kube-api-access-k7trf\") pod \"openstack-operator-index-dh596\" (UID: \"1387ab02-1b83-474a-88e1-9d086ced94a3\") " pod="openstack-operators/openstack-operator-index-dh596" Nov 22 08:37:57 crc kubenswrapper[4743]: I1122 08:37:57.958365 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7trf\" (UniqueName: \"kubernetes.io/projected/1387ab02-1b83-474a-88e1-9d086ced94a3-kube-api-access-k7trf\") pod \"openstack-operator-index-dh596\" (UID: \"1387ab02-1b83-474a-88e1-9d086ced94a3\") " pod="openstack-operators/openstack-operator-index-dh596" Nov 22 08:37:58 crc kubenswrapper[4743]: I1122 08:37:58.144554 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dh596" Nov 22 08:37:58 crc kubenswrapper[4743]: I1122 08:37:58.575980 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dh596"] Nov 22 08:37:58 crc kubenswrapper[4743]: W1122 08:37:58.584522 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1387ab02_1b83_474a_88e1_9d086ced94a3.slice/crio-cfb00e758cc4117b5722fd719bbe16b4127c561731375ed09f1883d29036a52e WatchSource:0}: Error finding container cfb00e758cc4117b5722fd719bbe16b4127c561731375ed09f1883d29036a52e: Status 404 returned error can't find the container with id cfb00e758cc4117b5722fd719bbe16b4127c561731375ed09f1883d29036a52e Nov 22 08:37:59 crc kubenswrapper[4743]: I1122 08:37:59.195352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dh596" event={"ID":"1387ab02-1b83-474a-88e1-9d086ced94a3","Type":"ContainerStarted","Data":"cfb00e758cc4117b5722fd719bbe16b4127c561731375ed09f1883d29036a52e"} Nov 22 08:38:01 crc kubenswrapper[4743]: I1122 08:38:01.161442 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dh596"] Nov 22 08:38:01 crc kubenswrapper[4743]: I1122 08:38:01.746521 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-f78gn"] Nov 22 08:38:01 crc kubenswrapper[4743]: I1122 08:38:01.748223 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f78gn" Nov 22 08:38:01 crc kubenswrapper[4743]: I1122 08:38:01.752957 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f78gn"] Nov 22 08:38:01 crc kubenswrapper[4743]: I1122 08:38:01.787163 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd4wg\" (UniqueName: \"kubernetes.io/projected/995bde5d-4b1f-4ee1-ab0e-eacb5f81c4ea-kube-api-access-pd4wg\") pod \"openstack-operator-index-f78gn\" (UID: \"995bde5d-4b1f-4ee1-ab0e-eacb5f81c4ea\") " pod="openstack-operators/openstack-operator-index-f78gn" Nov 22 08:38:01 crc kubenswrapper[4743]: I1122 08:38:01.888686 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd4wg\" (UniqueName: \"kubernetes.io/projected/995bde5d-4b1f-4ee1-ab0e-eacb5f81c4ea-kube-api-access-pd4wg\") pod \"openstack-operator-index-f78gn\" (UID: \"995bde5d-4b1f-4ee1-ab0e-eacb5f81c4ea\") " pod="openstack-operators/openstack-operator-index-f78gn" Nov 22 08:38:01 crc kubenswrapper[4743]: I1122 08:38:01.912716 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd4wg\" (UniqueName: \"kubernetes.io/projected/995bde5d-4b1f-4ee1-ab0e-eacb5f81c4ea-kube-api-access-pd4wg\") pod \"openstack-operator-index-f78gn\" (UID: \"995bde5d-4b1f-4ee1-ab0e-eacb5f81c4ea\") " pod="openstack-operators/openstack-operator-index-f78gn" Nov 22 08:38:02 crc kubenswrapper[4743]: I1122 08:38:02.067677 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-f78gn" Nov 22 08:38:02 crc kubenswrapper[4743]: I1122 08:38:02.820925 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f78gn"] Nov 22 08:38:03 crc kubenswrapper[4743]: I1122 08:38:03.228159 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f78gn" event={"ID":"995bde5d-4b1f-4ee1-ab0e-eacb5f81c4ea","Type":"ContainerStarted","Data":"e65092ba8f7942412df9cfb925514c27512d72b6d288218f79691fb3133129a8"} Nov 22 08:38:04 crc kubenswrapper[4743]: I1122 08:38:04.237525 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f78gn" event={"ID":"995bde5d-4b1f-4ee1-ab0e-eacb5f81c4ea","Type":"ContainerStarted","Data":"913cc5fe8f78168daedb6097674849120f9cc8d3bfe925cddfddc7688fc359b0"} Nov 22 08:38:04 crc kubenswrapper[4743]: I1122 08:38:04.240319 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dh596" event={"ID":"1387ab02-1b83-474a-88e1-9d086ced94a3","Type":"ContainerStarted","Data":"d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f"} Nov 22 08:38:04 crc kubenswrapper[4743]: I1122 08:38:04.240489 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dh596" podUID="1387ab02-1b83-474a-88e1-9d086ced94a3" containerName="registry-server" containerID="cri-o://d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f" gracePeriod=2 Nov 22 08:38:04 crc kubenswrapper[4743]: I1122 08:38:04.254742 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-f78gn" podStartSLOduration=2.594549839 podStartE2EDuration="3.254717139s" podCreationTimestamp="2025-11-22 08:38:01 +0000 UTC" firstStartedPulling="2025-11-22 08:38:02.849909696 +0000 UTC m=+956.556270748" lastFinishedPulling="2025-11-22 08:38:03.510076986 +0000 UTC m=+957.216438048" observedRunningTime="2025-11-22 08:38:04.253108913 +0000 UTC m=+957.959469965" watchObservedRunningTime="2025-11-22 08:38:04.254717139 +0000 UTC m=+957.961078201" Nov 22 08:38:04 crc kubenswrapper[4743]: I1122 08:38:04.269083 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dh596" podStartSLOduration=2.34663565 podStartE2EDuration="7.269058704s" podCreationTimestamp="2025-11-22 08:37:57 +0000 UTC" firstStartedPulling="2025-11-22 08:37:58.586392675 +0000 UTC m=+952.292753727" lastFinishedPulling="2025-11-22 08:38:03.508815729 +0000 UTC m=+957.215176781" observedRunningTime="2025-11-22 08:38:04.268499648 +0000 UTC m=+957.974860720" watchObservedRunningTime="2025-11-22 08:38:04.269058704 +0000 UTC m=+957.975419756" Nov 22 08:38:04 crc kubenswrapper[4743]: I1122 08:38:04.830282 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dh596" Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.025910 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7trf\" (UniqueName: \"kubernetes.io/projected/1387ab02-1b83-474a-88e1-9d086ced94a3-kube-api-access-k7trf\") pod \"1387ab02-1b83-474a-88e1-9d086ced94a3\" (UID: \"1387ab02-1b83-474a-88e1-9d086ced94a3\") " Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.032064 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1387ab02-1b83-474a-88e1-9d086ced94a3-kube-api-access-k7trf" (OuterVolumeSpecName: "kube-api-access-k7trf") pod "1387ab02-1b83-474a-88e1-9d086ced94a3" (UID: "1387ab02-1b83-474a-88e1-9d086ced94a3"). InnerVolumeSpecName "kube-api-access-k7trf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.127422 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7trf\" (UniqueName: \"kubernetes.io/projected/1387ab02-1b83-474a-88e1-9d086ced94a3-kube-api-access-k7trf\") on node \"crc\" DevicePath \"\"" Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.248438 4743 generic.go:334] "Generic (PLEG): container finished" podID="1387ab02-1b83-474a-88e1-9d086ced94a3" containerID="d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f" exitCode=0 Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.248495 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dh596" event={"ID":"1387ab02-1b83-474a-88e1-9d086ced94a3","Type":"ContainerDied","Data":"d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f"} Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.248524 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dh596" Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.248547 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dh596" event={"ID":"1387ab02-1b83-474a-88e1-9d086ced94a3","Type":"ContainerDied","Data":"cfb00e758cc4117b5722fd719bbe16b4127c561731375ed09f1883d29036a52e"} Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.248609 4743 scope.go:117] "RemoveContainer" containerID="d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f" Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.270829 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dh596"] Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.270884 4743 scope.go:117] "RemoveContainer" containerID="d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f" Nov 22 08:38:05 crc kubenswrapper[4743]: E1122 08:38:05.271374 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f\": container with ID starting with d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f not found: ID does not exist" containerID="d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f" Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.271416 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f"} err="failed to get container status \"d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f\": rpc error: code = NotFound desc = could not find container \"d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f\": container with ID starting with d92cd07656176cca03fb0b3ea8d8b0fb4f2d7b7b5a8c20bc408d2afb2df0a04f not found: ID does not exist" Nov 22 08:38:05 crc kubenswrapper[4743]: I1122 08:38:05.276972 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dh596"] Nov 22 08:38:07 crc kubenswrapper[4743]: I1122 08:38:07.158361 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1387ab02-1b83-474a-88e1-9d086ced94a3" path="/var/lib/kubelet/pods/1387ab02-1b83-474a-88e1-9d086ced94a3/volumes" Nov 22 08:38:12 crc kubenswrapper[4743]: I1122 08:38:12.067834 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-f78gn" Nov 22 08:38:12 crc kubenswrapper[4743]: I1122 08:38:12.069811 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-f78gn" Nov 22 08:38:12 crc kubenswrapper[4743]: I1122 08:38:12.094989 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-f78gn" Nov 22 08:38:12 crc kubenswrapper[4743]: I1122 08:38:12.320807 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-f78gn" Nov 22 08:38:12 crc kubenswrapper[4743]: I1122 08:38:12.987830 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn"] Nov 22 08:38:12 crc kubenswrapper[4743]: E1122 08:38:12.988699 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1387ab02-1b83-474a-88e1-9d086ced94a3" containerName="registry-server" Nov 22 08:38:12 crc kubenswrapper[4743]: I1122 08:38:12.988805 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1387ab02-1b83-474a-88e1-9d086ced94a3" containerName="registry-server" Nov 22 08:38:12 crc kubenswrapper[4743]: I1122 08:38:12.989051 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1387ab02-1b83-474a-88e1-9d086ced94a3" containerName="registry-server" Nov 22 08:38:12 crc kubenswrapper[4743]: I1122 08:38:12.990118 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:12 crc kubenswrapper[4743]: I1122 08:38:12.994458 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-86w47" Nov 22 08:38:12 crc kubenswrapper[4743]: I1122 08:38:12.998665 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn"] Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.132661 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-bundle\") pod \"424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.132708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-util\") pod \"424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.132778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfcmd\" (UniqueName: \"kubernetes.io/projected/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-kube-api-access-sfcmd\") pod \"424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.234025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfcmd\" (UniqueName: \"kubernetes.io/projected/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-kube-api-access-sfcmd\") pod \"424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.234130 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-bundle\") pod \"424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.234148 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-util\") pod \"424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.234695 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-util\") pod \"424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.234760 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-bundle\") pod \"424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.255756 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfcmd\" (UniqueName: \"kubernetes.io/projected/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-kube-api-access-sfcmd\") pod \"424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.309597 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:13 crc kubenswrapper[4743]: I1122 08:38:13.505369 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn"] Nov 22 08:38:14 crc kubenswrapper[4743]: I1122 08:38:14.312013 4743 generic.go:334] "Generic (PLEG): container finished" podID="6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" containerID="c647124cc1f74a7ca4788ea2aa30e50da44f5e4a785edd9aa67f005e184ca774" exitCode=0 Nov 22 08:38:14 crc kubenswrapper[4743]: I1122 08:38:14.312137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" event={"ID":"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f","Type":"ContainerDied","Data":"c647124cc1f74a7ca4788ea2aa30e50da44f5e4a785edd9aa67f005e184ca774"} Nov 22 08:38:14 crc kubenswrapper[4743]: I1122 08:38:14.312416 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" event={"ID":"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f","Type":"ContainerStarted","Data":"6b8946c0c2356e39407f77745ca280aaeedae328aac656c542272b180b0635e9"} Nov 22 08:38:15 crc kubenswrapper[4743]: I1122 08:38:15.321688 4743 generic.go:334] "Generic (PLEG): container finished" podID="6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" containerID="462a343176576c0c4c31f110ab7c3cfc99947b2662e63114e9710937885f1404" exitCode=0 Nov 22 08:38:15 crc kubenswrapper[4743]: I1122 08:38:15.321753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" event={"ID":"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f","Type":"ContainerDied","Data":"462a343176576c0c4c31f110ab7c3cfc99947b2662e63114e9710937885f1404"} Nov 22 08:38:16 crc kubenswrapper[4743]: I1122 08:38:16.331406 4743 generic.go:334] "Generic (PLEG): container finished" podID="6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" containerID="5c7fe8787c96265fc82db80cdc4d2da7aa45df234d10b1a0a21498c766f6c3a8" exitCode=0 Nov 22 08:38:16 crc kubenswrapper[4743]: I1122 08:38:16.331494 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" event={"ID":"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f","Type":"ContainerDied","Data":"5c7fe8787c96265fc82db80cdc4d2da7aa45df234d10b1a0a21498c766f6c3a8"} Nov 22 08:38:17 crc kubenswrapper[4743]: I1122 08:38:17.558032 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:17 crc kubenswrapper[4743]: I1122 08:38:17.615552 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-util\") pod \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " Nov 22 08:38:17 crc kubenswrapper[4743]: I1122 08:38:17.615641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-bundle\") pod \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " Nov 22 08:38:17 crc kubenswrapper[4743]: I1122 08:38:17.615674 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfcmd\" (UniqueName: \"kubernetes.io/projected/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-kube-api-access-sfcmd\") pod \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\" (UID: \"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f\") " Nov 22 08:38:17 crc kubenswrapper[4743]: I1122 08:38:17.616412 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-bundle" (OuterVolumeSpecName: "bundle") pod "6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" (UID: "6c17beb4-33b2-4e6d-9ae5-61f396f3c37f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:38:17 crc kubenswrapper[4743]: I1122 08:38:17.621914 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-kube-api-access-sfcmd" (OuterVolumeSpecName: "kube-api-access-sfcmd") pod "6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" (UID: "6c17beb4-33b2-4e6d-9ae5-61f396f3c37f"). InnerVolumeSpecName "kube-api-access-sfcmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:38:17 crc kubenswrapper[4743]: I1122 08:38:17.631728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-util" (OuterVolumeSpecName: "util") pod "6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" (UID: "6c17beb4-33b2-4e6d-9ae5-61f396f3c37f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:38:17 crc kubenswrapper[4743]: I1122 08:38:17.717342 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-util\") on node \"crc\" DevicePath \"\"" Nov 22 08:38:17 crc kubenswrapper[4743]: I1122 08:38:17.717378 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:38:17 crc kubenswrapper[4743]: I1122 08:38:17.717388 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfcmd\" (UniqueName: \"kubernetes.io/projected/6c17beb4-33b2-4e6d-9ae5-61f396f3c37f-kube-api-access-sfcmd\") on node \"crc\" DevicePath \"\"" Nov 22 08:38:18 crc kubenswrapper[4743]: I1122 08:38:18.345675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" event={"ID":"6c17beb4-33b2-4e6d-9ae5-61f396f3c37f","Type":"ContainerDied","Data":"6b8946c0c2356e39407f77745ca280aaeedae328aac656c542272b180b0635e9"} Nov 22 08:38:18 crc kubenswrapper[4743]: I1122 08:38:18.345755 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b8946c0c2356e39407f77745ca280aaeedae328aac656c542272b180b0635e9" Nov 22 08:38:18 crc kubenswrapper[4743]: I1122 08:38:18.345754 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn" Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.693809 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5"] Nov 22 08:38:20 crc kubenswrapper[4743]: E1122 08:38:20.694418 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" containerName="util" Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.694435 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" containerName="util" Nov 22 08:38:20 crc kubenswrapper[4743]: E1122 08:38:20.694453 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" containerName="extract" Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.694460 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" containerName="extract" Nov 22 08:38:20 crc kubenswrapper[4743]: E1122 08:38:20.694469 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" containerName="pull" Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.694477 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" containerName="pull" Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.694668 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c17beb4-33b2-4e6d-9ae5-61f396f3c37f" containerName="extract" Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.695436 4743 util.go:30] "No sandbox for pod can be found. 
Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.697215 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-v4gbz"
Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.718356 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5"]
Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.771795 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrnlc\" (UniqueName: \"kubernetes.io/projected/874fc1ac-e9dc-4948-ac24-da9140316fd8-kube-api-access-wrnlc\") pod \"openstack-operator-controller-operator-58d789d48c-sv4c5\" (UID: \"874fc1ac-e9dc-4948-ac24-da9140316fd8\") " pod="openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5"
Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.872834 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrnlc\" (UniqueName: \"kubernetes.io/projected/874fc1ac-e9dc-4948-ac24-da9140316fd8-kube-api-access-wrnlc\") pod \"openstack-operator-controller-operator-58d789d48c-sv4c5\" (UID: \"874fc1ac-e9dc-4948-ac24-da9140316fd8\") " pod="openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5"
Nov 22 08:38:20 crc kubenswrapper[4743]: I1122 08:38:20.895387 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrnlc\" (UniqueName: \"kubernetes.io/projected/874fc1ac-e9dc-4948-ac24-da9140316fd8-kube-api-access-wrnlc\") pod \"openstack-operator-controller-operator-58d789d48c-sv4c5\" (UID: \"874fc1ac-e9dc-4948-ac24-da9140316fd8\") " pod="openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5"
Nov 22 08:38:21 crc kubenswrapper[4743]: I1122 08:38:21.018892 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5"
Nov 22 08:38:21 crc kubenswrapper[4743]: I1122 08:38:21.439908 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5"]
Nov 22 08:38:21 crc kubenswrapper[4743]: W1122 08:38:21.450780 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod874fc1ac_e9dc_4948_ac24_da9140316fd8.slice/crio-ab224bca2d70485b8fc5378960c8d9e866ba4f447fe4273dbad605209565e017 WatchSource:0}: Error finding container ab224bca2d70485b8fc5378960c8d9e866ba4f447fe4273dbad605209565e017: Status 404 returned error can't find the container with id ab224bca2d70485b8fc5378960c8d9e866ba4f447fe4273dbad605209565e017
Nov 22 08:38:22 crc kubenswrapper[4743]: I1122 08:38:22.379500 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5" event={"ID":"874fc1ac-e9dc-4948-ac24-da9140316fd8","Type":"ContainerStarted","Data":"ab224bca2d70485b8fc5378960c8d9e866ba4f447fe4273dbad605209565e017"}
Nov 22 08:38:25 crc kubenswrapper[4743]: I1122 08:38:25.397910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5" event={"ID":"874fc1ac-e9dc-4948-ac24-da9140316fd8","Type":"ContainerStarted","Data":"b469829cb2b9ae094f59c97447eb1e9a3d851c58354b4fce22c194814b0299ab"}
Nov 22 08:38:28 crc kubenswrapper[4743]: I1122 08:38:28.416887 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5" event={"ID":"874fc1ac-e9dc-4948-ac24-da9140316fd8","Type":"ContainerStarted","Data":"07f618dfe9ce3a2545e41b673865ea1d54c9b59b195daec9b7c53fa5a65b7574"}
Nov 22 08:38:28 crc kubenswrapper[4743]: I1122 08:38:28.417706 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5"
Nov 22 08:38:28 crc kubenswrapper[4743]: I1122 08:38:28.449125 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5" podStartSLOduration=2.121187871 podStartE2EDuration="8.449109029s" podCreationTimestamp="2025-11-22 08:38:20 +0000 UTC" firstStartedPulling="2025-11-22 08:38:21.452346573 +0000 UTC m=+975.158707625" lastFinishedPulling="2025-11-22 08:38:27.780267731 +0000 UTC m=+981.486628783" observedRunningTime="2025-11-22 08:38:28.445501174 +0000 UTC m=+982.151862226" watchObservedRunningTime="2025-11-22 08:38:28.449109029 +0000 UTC m=+982.155470081"
Nov 22 08:38:31 crc kubenswrapper[4743]: I1122 08:38:31.022101 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-58d789d48c-sv4c5"
Nov 22 08:38:31 crc kubenswrapper[4743]: I1122 08:38:31.240817 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 08:38:31 crc kubenswrapper[4743]: I1122 08:38:31.241230 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
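The "Observed pod startup duration" record a few lines up encodes a small calculation worth spelling out: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that end-to-end figure minus the time spent pulling images (lastFinishedPulling minus firstStartedPulling). The numbers in the record are consistent with exactly that reading, as this check shows; the timestamps are copied from the log, and the interpretation of the fields is the stated assumption here:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		t, err := time.Parse(time.RFC3339Nano, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-11-22T08:38:20Z")               // podCreationTimestamp
	firstPull := parse("2025-11-22T08:38:21.452346573Z")   // firstStartedPulling
	lastPull := parse("2025-11-22T08:38:27.780267731Z")    // lastFinishedPulling
	observed := parse("2025-11-22T08:38:28.449109029Z")    // watchObservedRunningTime

	e2e := observed.Sub(created)         // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // startup time excluding image pulls

	fmt.Println(e2e) // 8.449109029s
	fmt.Println(slo) // 2.121187871s
}
```

Both printed values match the record (podStartE2EDuration="8.449109029s", podStartSLOduration=2.121187871); roughly six of the eight seconds of this pod's startup went to image pulls.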
podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.863359 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.864537 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.867075 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ckf6h" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.877861 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.878813 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.882754 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-mqjwq" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.886127 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.890714 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.891863 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.893206 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-rv7w5" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.903624 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.918186 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.940318 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.942008 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.945784 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wdrnz" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.949382 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.954593 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.958337 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.961214 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-htvcz" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.970641 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.974652 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.975623 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.977964 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-p4l2f" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.986598 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.987717 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.989943 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.990177 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hrs8r" Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.992648 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss"] Nov 22 08:38:47 crc kubenswrapper[4743]: I1122 08:38:47.997979 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.004979 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.006171 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.011836 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-cm5s4"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.021651 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.045997 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.047232 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.054556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpzgs\" (UniqueName: \"kubernetes.io/projected/ac624665-bb51-4c61-b213-cb07bd43eafe-kube-api-access-dpzgs\") pod \"cinder-operator-controller-manager-6498cbf48f-8vd9m\" (UID: \"ac624665-bb51-4c61-b213-cb07bd43eafe\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.054636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbwwb\" (UniqueName: \"kubernetes.io/projected/a729ba89-b0fe-4363-b4b4-ffe21f0c627c-kube-api-access-qbwwb\") pod \"glance-operator-controller-manager-7969689c84-k5rhz\" (UID: \"a729ba89-b0fe-4363-b4b4-ffe21f0c627c\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.054680 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54t2s\" (UniqueName: \"kubernetes.io/projected/81cdc04a-86d9-488d-b854-d941f3f5632e-kube-api-access-54t2s\") pod \"heat-operator-controller-manager-56f54d6746-7bcj5\" (UID: \"81cdc04a-86d9-488d-b854-d941f3f5632e\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.054702 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq5bl\" (UniqueName: \"kubernetes.io/projected/9f1b446c-1023-4682-889f-97abca903826-kube-api-access-jq5bl\") pod \"barbican-operator-controller-manager-75fb479bcc-xts6s\" (UID: \"9f1b446c-1023-4682-889f-97abca903826\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.054721 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k274z\" (UniqueName: \"kubernetes.io/projected/657d8d61-7be7-42a6-8472-2d70e55a8428-kube-api-access-k274z\") pod \"designate-operator-controller-manager-767ccfd65f-kcrpp\" (UID: \"657d8d61-7be7-42a6-8472-2d70e55a8428\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.055450 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-r2dww"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.056546 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-579sf"]
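From here on, every one of these operator pods walks its volumes through the same three reconciler phases seen above for the controller-operator pod: "VerifyControllerAttachedVolume started", then "MountVolume started", then "MountVolume.SetUp succeeded" (or "failed"). With a dozen pods starting at once the interleaving gets hard to follow; a toy stdin filter like the following, purely illustrative, can reduce a journal capture (e.g. journalctl -u kubelet, assuming that unit name on this host) to the last phase observed per volume, which makes stuck volumes stand out:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

func main() {
	// The UniqueName appears in these journal lines with escaped quotes: \"...\".
	name := regexp.MustCompile(`UniqueName: \\?"([^"\\]+)`)
	phases := []string{
		"VerifyControllerAttachedVolume started",
		"MountVolume started",
		"MountVolume.SetUp succeeded",
		"MountVolume.SetUp failed",
	}
	last := map[string]string{} // volume UniqueName -> last phase seen
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		m := name.FindStringSubmatch(line)
		if m == nil {
			continue
		}
		for _, p := range phases {
			if strings.Contains(line, p) {
				last[m[1]] = p
			}
		}
	}
	for vol, phase := range last {
		fmt.Printf("%s\t%s\n", vol, phase)
	}
}
```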
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.061092 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.063272 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rg54q"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.068202 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.069228 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.070970 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-ctnfx"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.076059 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.083176 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-579sf"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.104202 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.108593 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.110512 4743 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.113334 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-x26wc" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.140995 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.158216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqxp\" (UniqueName: \"kubernetes.io/projected/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-kube-api-access-5sqxp\") pod \"infra-operator-controller-manager-6dd8864d7c-gn9b8\" (UID: \"25fd4b24-83f2-4a02-b086-ca0f03cb42a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.158272 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpbbp\" (UniqueName: \"kubernetes.io/projected/e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb-kube-api-access-dpbbp\") pod \"keystone-operator-controller-manager-7454b96578-ktmd4\" (UID: \"e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.158304 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54t2s\" (UniqueName: \"kubernetes.io/projected/81cdc04a-86d9-488d-b854-d941f3f5632e-kube-api-access-54t2s\") pod \"heat-operator-controller-manager-56f54d6746-7bcj5\" (UID: \"81cdc04a-86d9-488d-b854-d941f3f5632e\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.158329 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq5bl\" (UniqueName: \"kubernetes.io/projected/9f1b446c-1023-4682-889f-97abca903826-kube-api-access-jq5bl\") pod \"barbican-operator-controller-manager-75fb479bcc-xts6s\" (UID: \"9f1b446c-1023-4682-889f-97abca903826\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.158353 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k274z\" (UniqueName: \"kubernetes.io/projected/657d8d61-7be7-42a6-8472-2d70e55a8428-kube-api-access-k274z\") pod \"designate-operator-controller-manager-767ccfd65f-kcrpp\" (UID: \"657d8d61-7be7-42a6-8472-2d70e55a8428\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.158380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpzgs\" (UniqueName: \"kubernetes.io/projected/ac624665-bb51-4c61-b213-cb07bd43eafe-kube-api-access-dpzgs\") pod \"cinder-operator-controller-manager-6498cbf48f-8vd9m\" (UID: \"ac624665-bb51-4c61-b213-cb07bd43eafe\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.158407 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77v26\" (UniqueName: 
\"kubernetes.io/projected/89e94ee4-4365-4e56-a5a2-3d61bbbd8876-kube-api-access-77v26\") pod \"ironic-operator-controller-manager-99b499f4-xbmln\" (UID: \"89e94ee4-4365-4e56-a5a2-3d61bbbd8876\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.158439 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-cert\") pod \"infra-operator-controller-manager-6dd8864d7c-gn9b8\" (UID: \"25fd4b24-83f2-4a02-b086-ca0f03cb42a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.158459 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhpzv\" (UniqueName: \"kubernetes.io/projected/f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f-kube-api-access-vhpzv\") pod \"horizon-operator-controller-manager-598f69df5d-6wnss\" (UID: \"f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.158482 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbwwb\" (UniqueName: \"kubernetes.io/projected/a729ba89-b0fe-4363-b4b4-ffe21f0c627c-kube-api-access-qbwwb\") pod \"glance-operator-controller-manager-7969689c84-k5rhz\" (UID: \"a729ba89-b0fe-4363-b4b4-ffe21f0c627c\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.206726 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.223156 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k274z\" (UniqueName: \"kubernetes.io/projected/657d8d61-7be7-42a6-8472-2d70e55a8428-kube-api-access-k274z\") pod \"designate-operator-controller-manager-767ccfd65f-kcrpp\" (UID: \"657d8d61-7be7-42a6-8472-2d70e55a8428\") " pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.223626 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpzgs\" (UniqueName: \"kubernetes.io/projected/ac624665-bb51-4c61-b213-cb07bd43eafe-kube-api-access-dpzgs\") pod \"cinder-operator-controller-manager-6498cbf48f-8vd9m\" (UID: \"ac624665-bb51-4c61-b213-cb07bd43eafe\") " pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.224249 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbwwb\" (UniqueName: \"kubernetes.io/projected/a729ba89-b0fe-4363-b4b4-ffe21f0c627c-kube-api-access-qbwwb\") pod \"glance-operator-controller-manager-7969689c84-k5rhz\" (UID: \"a729ba89-b0fe-4363-b4b4-ffe21f0c627c\") " pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.225232 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.225328 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.226360 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq5bl\" (UniqueName: \"kubernetes.io/projected/9f1b446c-1023-4682-889f-97abca903826-kube-api-access-jq5bl\") pod \"barbican-operator-controller-manager-75fb479bcc-xts6s\" (UID: \"9f1b446c-1023-4682-889f-97abca903826\") " pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.228048 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-4mhll" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.231001 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.233035 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.234940 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-x6h5n" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.240208 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54t2s\" (UniqueName: \"kubernetes.io/projected/81cdc04a-86d9-488d-b854-d941f3f5632e-kube-api-access-54t2s\") pod \"heat-operator-controller-manager-56f54d6746-7bcj5\" (UID: \"81cdc04a-86d9-488d-b854-d941f3f5632e\") " pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.252624 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.254126 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.257661 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-xrnb5" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.259491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77v26\" (UniqueName: \"kubernetes.io/projected/89e94ee4-4365-4e56-a5a2-3d61bbbd8876-kube-api-access-77v26\") pod \"ironic-operator-controller-manager-99b499f4-xbmln\" (UID: \"89e94ee4-4365-4e56-a5a2-3d61bbbd8876\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.259526 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqx5m\" (UniqueName: \"kubernetes.io/projected/9d349409-980f-4605-bd87-d09fe812dd65-kube-api-access-gqx5m\") pod \"mariadb-operator-controller-manager-54b5986bb8-vt47z\" (UID: \"9d349409-980f-4605-bd87-d09fe812dd65\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.259555 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-cert\") pod \"infra-operator-controller-manager-6dd8864d7c-gn9b8\" (UID: \"25fd4b24-83f2-4a02-b086-ca0f03cb42a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.259591 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhpzv\" (UniqueName: \"kubernetes.io/projected/f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f-kube-api-access-vhpzv\") pod \"horizon-operator-controller-manager-598f69df5d-6wnss\" (UID: \"f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.262505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqxp\" (UniqueName: \"kubernetes.io/projected/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-kube-api-access-5sqxp\") pod \"infra-operator-controller-manager-6dd8864d7c-gn9b8\" (UID: \"25fd4b24-83f2-4a02-b086-ca0f03cb42a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.262546 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpbbp\" (UniqueName: \"kubernetes.io/projected/e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb-kube-api-access-dpbbp\") pod \"keystone-operator-controller-manager-7454b96578-ktmd4\" (UID: \"e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.262569 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7brvw\" (UniqueName: \"kubernetes.io/projected/50b5cda8-859c-49f0-92aa-601c16eb9a2a-kube-api-access-7brvw\") pod \"manila-operator-controller-manager-58f887965d-579sf\" (UID: \"50b5cda8-859c-49f0-92aa-601c16eb9a2a\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" Nov 22 08:38:48 crc 
kubenswrapper[4743]: I1122 08:38:48.262688 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmwtr\" (UniqueName: \"kubernetes.io/projected/d8c50ae0-c8c9-4e87-9130-4c04d5b468ac-kube-api-access-xmwtr\") pod \"neutron-operator-controller-manager-78bd47f458-klfvn\" (UID: \"d8c50ae0-c8c9-4e87-9130-4c04d5b468ac\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn"
Nov 22 08:38:48 crc kubenswrapper[4743]: E1122 08:38:48.262888 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Nov 22 08:38:48 crc kubenswrapper[4743]: E1122 08:38:48.263003 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-cert podName:25fd4b24-83f2-4a02-b086-ca0f03cb42a3 nodeName:}" failed. No retries permitted until 2025-11-22 08:38:48.762974235 +0000 UTC m=+1002.469335287 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-cert") pod "infra-operator-controller-manager-6dd8864d7c-gn9b8" (UID: "25fd4b24-83f2-4a02-b086-ca0f03cb42a3") : secret "infra-operator-webhook-server-cert" not found
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.266770 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.267343 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.284279 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqxp\" (UniqueName: \"kubernetes.io/projected/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-kube-api-access-5sqxp\") pod \"infra-operator-controller-manager-6dd8864d7c-gn9b8\" (UID: \"25fd4b24-83f2-4a02-b086-ca0f03cb42a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.290210 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhpzv\" (UniqueName: \"kubernetes.io/projected/f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f-kube-api-access-vhpzv\") pod \"horizon-operator-controller-manager-598f69df5d-6wnss\" (UID: \"f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f\") " pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.290719 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.291699 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz"
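The "No retries permitted until ... (durationBeforeRetry 500ms)" error above is the kubelet's per-volume retry backoff: when a mount attempt fails (here because the infra-operator-webhook-server-cert secret has not been created yet), the next attempt is delayed, and the delay doubles on each consecutive failure; later in this log the same "cert" volume shows durationBeforeRetry 1s. The retries stop as soon as the secret exists. A minimal sketch of that doubling policy; the 2m2s cap is an assumption about kubelet's defaults, not something visible in this log:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond           // first durationBeforeRetry seen in the log
	maxDelay := 2*time.Minute + 2*time.Second // assumed upper bound on the delay
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("attempt %d: wait %v\n", attempt, delay)
		delay *= 2 // exponential backoff: 500ms, 1s, 2s, 4s, ...
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```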
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.305334 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77v26\" (UniqueName: \"kubernetes.io/projected/89e94ee4-4365-4e56-a5a2-3d61bbbd8876-kube-api-access-77v26\") pod \"ironic-operator-controller-manager-99b499f4-xbmln\" (UID: \"89e94ee4-4365-4e56-a5a2-3d61bbbd8876\") " pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.305870 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.306315 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpbbp\" (UniqueName: \"kubernetes.io/projected/e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb-kube-api-access-dpbbp\") pod \"keystone-operator-controller-manager-7454b96578-ktmd4\" (UID: \"e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb\") " pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.311232 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.312569 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.314722 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-sfstk"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.322685 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.324247 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.326378 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.328251 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-dhrhm"
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.328431 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.333797 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4"]
Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.338673 4743 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.340134 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-jczhh"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.341426 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.343303 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-zmb2r" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.354703 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-jczhh"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.365063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs9l6\" (UniqueName: \"kubernetes.io/projected/1b06dd20-2bb7-4ff2-aa77-997042af333e-kube-api-access-rs9l6\") pod \"nova-operator-controller-manager-cfbb9c588-j682k\" (UID: \"1b06dd20-2bb7-4ff2-aa77-997042af333e\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.365336 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gh4v\" (UniqueName: \"kubernetes.io/projected/0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970-kube-api-access-6gh4v\") pod \"octavia-operator-controller-manager-54cfbf4c7d-g6mzk\" (UID: \"0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.365492 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7brvw\" (UniqueName: \"kubernetes.io/projected/50b5cda8-859c-49f0-92aa-601c16eb9a2a-kube-api-access-7brvw\") pod \"manila-operator-controller-manager-58f887965d-579sf\" (UID: \"50b5cda8-859c-49f0-92aa-601c16eb9a2a\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.365683 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmwtr\" (UniqueName: \"kubernetes.io/projected/d8c50ae0-c8c9-4e87-9130-4c04d5b468ac-kube-api-access-xmwtr\") pod \"neutron-operator-controller-manager-78bd47f458-klfvn\" (UID: \"d8c50ae0-c8c9-4e87-9130-4c04d5b468ac\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.365815 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfm6t\" (UniqueName: \"kubernetes.io/projected/4c6b99d5-9791-40db-91fd-d74c80b2e3a7-kube-api-access-wfm6t\") pod \"ovn-operator-controller-manager-54fc5f65b7-cv2tj\" (UID: \"4c6b99d5-9791-40db-91fd-d74c80b2e3a7\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.365931 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqx5m\" (UniqueName: \"kubernetes.io/projected/9d349409-980f-4605-bd87-d09fe812dd65-kube-api-access-gqx5m\") pod 
\"mariadb-operator-controller-manager-54b5986bb8-vt47z\" (UID: \"9d349409-980f-4605-bd87-d09fe812dd65\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.375408 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.390621 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.391837 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.394295 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-8rmzp" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.402983 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.404623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmwtr\" (UniqueName: \"kubernetes.io/projected/d8c50ae0-c8c9-4e87-9130-4c04d5b468ac-kube-api-access-xmwtr\") pod \"neutron-operator-controller-manager-78bd47f458-klfvn\" (UID: \"d8c50ae0-c8c9-4e87-9130-4c04d5b468ac\") " pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.412909 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqx5m\" (UniqueName: \"kubernetes.io/projected/9d349409-980f-4605-bd87-d09fe812dd65-kube-api-access-gqx5m\") pod \"mariadb-operator-controller-manager-54b5986bb8-vt47z\" (UID: \"9d349409-980f-4605-bd87-d09fe812dd65\") " pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.415422 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7brvw\" (UniqueName: \"kubernetes.io/projected/50b5cda8-859c-49f0-92aa-601c16eb9a2a-kube-api-access-7brvw\") pod \"manila-operator-controller-manager-58f887965d-579sf\" (UID: \"50b5cda8-859c-49f0-92aa-601c16eb9a2a\") " pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.430261 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.452281 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-55bp6"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.453512 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.460984 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-k5pw9" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.467253 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6dbd\" (UniqueName: \"kubernetes.io/projected/3420c6da-358d-4b5c-a383-e25fbc58a2ee-kube-api-access-s6dbd\") pod \"placement-operator-controller-manager-5b797b8dff-z4flw\" (UID: \"3420c6da-358d-4b5c-a383-e25fbc58a2ee\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.467290 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzrwl\" (UniqueName: \"kubernetes.io/projected/9f7a1db7-e801-4da6-b64b-f3babcfcd9c6-kube-api-access-mzrwl\") pod \"swift-operator-controller-manager-d656998f4-jczhh\" (UID: \"9f7a1db7-e801-4da6-b64b-f3babcfcd9c6\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.467361 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvw2k\" (UniqueName: \"kubernetes.io/projected/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-kube-api-access-vvw2k\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4\" (UID: \"f5f27cf7-eaa5-4b71-84a6-94fac3920d39\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.467417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4\" (UID: \"f5f27cf7-eaa5-4b71-84a6-94fac3920d39\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.467442 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfm6t\" (UniqueName: \"kubernetes.io/projected/4c6b99d5-9791-40db-91fd-d74c80b2e3a7-kube-api-access-wfm6t\") pod \"ovn-operator-controller-manager-54fc5f65b7-cv2tj\" (UID: \"4c6b99d5-9791-40db-91fd-d74c80b2e3a7\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.467523 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs9l6\" (UniqueName: \"kubernetes.io/projected/1b06dd20-2bb7-4ff2-aa77-997042af333e-kube-api-access-rs9l6\") pod \"nova-operator-controller-manager-cfbb9c588-j682k\" (UID: \"1b06dd20-2bb7-4ff2-aa77-997042af333e\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.467544 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gh4v\" (UniqueName: \"kubernetes.io/projected/0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970-kube-api-access-6gh4v\") pod \"octavia-operator-controller-manager-54cfbf4c7d-g6mzk\" (UID: \"0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970\") " 
pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.481837 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.491056 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-55bp6"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.491531 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs9l6\" (UniqueName: \"kubernetes.io/projected/1b06dd20-2bb7-4ff2-aa77-997042af333e-kube-api-access-rs9l6\") pod \"nova-operator-controller-manager-cfbb9c588-j682k\" (UID: \"1b06dd20-2bb7-4ff2-aa77-997042af333e\") " pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.491784 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gh4v\" (UniqueName: \"kubernetes.io/projected/0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970-kube-api-access-6gh4v\") pod \"octavia-operator-controller-manager-54cfbf4c7d-g6mzk\" (UID: \"0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970\") " pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.493959 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.495426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfm6t\" (UniqueName: \"kubernetes.io/projected/4c6b99d5-9791-40db-91fd-d74c80b2e3a7-kube-api-access-wfm6t\") pod \"ovn-operator-controller-manager-54fc5f65b7-cv2tj\" (UID: \"4c6b99d5-9791-40db-91fd-d74c80b2e3a7\") " pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.510954 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.546177 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.548171 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.554197 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.558931 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-gsgfs" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.568204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvw2k\" (UniqueName: \"kubernetes.io/projected/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-kube-api-access-vvw2k\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4\" (UID: \"f5f27cf7-eaa5-4b71-84a6-94fac3920d39\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.568253 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4\" (UID: \"f5f27cf7-eaa5-4b71-84a6-94fac3920d39\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.568291 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk455\" (UniqueName: \"kubernetes.io/projected/75667626-7d6a-46d0-b0b2-f627257967f4-kube-api-access-pk455\") pod \"test-operator-controller-manager-b4c496f69-55bp6\" (UID: \"75667626-7d6a-46d0-b0b2-f627257967f4\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.568360 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6dbd\" (UniqueName: \"kubernetes.io/projected/3420c6da-358d-4b5c-a383-e25fbc58a2ee-kube-api-access-s6dbd\") pod \"placement-operator-controller-manager-5b797b8dff-z4flw\" (UID: \"3420c6da-358d-4b5c-a383-e25fbc58a2ee\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.568382 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzrwl\" (UniqueName: \"kubernetes.io/projected/9f7a1db7-e801-4da6-b64b-f3babcfcd9c6-kube-api-access-mzrwl\") pod \"swift-operator-controller-manager-d656998f4-jczhh\" (UID: \"9f7a1db7-e801-4da6-b64b-f3babcfcd9c6\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.568405 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhjss\" (UniqueName: \"kubernetes.io/projected/904fdb49-cc2c-443c-af9e-950b648018e9-kube-api-access-bhjss\") pod \"telemetry-operator-controller-manager-6d4bf84b58-rssfb\" (UID: \"904fdb49-cc2c-443c-af9e-950b648018e9\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" Nov 22 08:38:48 crc kubenswrapper[4743]: E1122 08:38:48.569142 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found 
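At this point two webhook cert secrets are missing in the openstack-operators namespace, infra-operator-webhook-server-cert and openstack-baremetal-operator-webhook-server-cert, so the kubelet keeps the affected pods pending and retries their "cert" volume mounts on the backoff schedule shown earlier. A quick way to confirm whether those secrets have appeared yet is a client-go check like the one below; the kubeconfig location and read access to the namespace are the assumptions here:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a kubeconfig at the default location (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	for _, name := range []string{
		"infra-operator-webhook-server-cert",
		"openstack-baremetal-operator-webhook-server-cert",
	} {
		_, err := cs.CoreV1().Secrets("openstack-operators").
			Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			fmt.Printf("%s: %v\n", name, err) // still missing (or not readable)
		} else {
			fmt.Printf("%s: present\n", name)
		}
	}
}
```

These secrets are typically published by the operator that owns the webhook once it is running, so a "not found" here during startup is expected to resolve on its own; the mount errors in the log are noisy but self-healing.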
Nov 22 08:38:48 crc kubenswrapper[4743]: E1122 08:38:48.569234 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-cert podName:f5f27cf7-eaa5-4b71-84a6-94fac3920d39 nodeName:}" failed. No retries permitted until 2025-11-22 08:38:49.069191574 +0000 UTC m=+1002.775552636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-cert") pod "openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" (UID: "f5f27cf7-eaa5-4b71-84a6-94fac3920d39") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.598319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzrwl\" (UniqueName: \"kubernetes.io/projected/9f7a1db7-e801-4da6-b64b-f3babcfcd9c6-kube-api-access-mzrwl\") pod \"swift-operator-controller-manager-d656998f4-jczhh\" (UID: \"9f7a1db7-e801-4da6-b64b-f3babcfcd9c6\") " pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.599142 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6dbd\" (UniqueName: \"kubernetes.io/projected/3420c6da-358d-4b5c-a383-e25fbc58a2ee-kube-api-access-s6dbd\") pod \"placement-operator-controller-manager-5b797b8dff-z4flw\" (UID: \"3420c6da-358d-4b5c-a383-e25fbc58a2ee\") " pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.613894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvw2k\" (UniqueName: \"kubernetes.io/projected/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-kube-api-access-vvw2k\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4\" (UID: \"f5f27cf7-eaa5-4b71-84a6-94fac3920d39\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.644325 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.647390 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.653387 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hxg8t" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.654291 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.669179 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhjss\" (UniqueName: \"kubernetes.io/projected/904fdb49-cc2c-443c-af9e-950b648018e9-kube-api-access-bhjss\") pod \"telemetry-operator-controller-manager-6d4bf84b58-rssfb\" (UID: \"904fdb49-cc2c-443c-af9e-950b648018e9\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.669271 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2whtm\" (UniqueName: \"kubernetes.io/projected/5c593905-2ee5-4990-9f9c-85ca81f38319-kube-api-access-2whtm\") pod \"watcher-operator-controller-manager-8c6448b9f-j6vgx\" (UID: \"5c593905-2ee5-4990-9f9c-85ca81f38319\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.669316 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk455\" (UniqueName: \"kubernetes.io/projected/75667626-7d6a-46d0-b0b2-f627257967f4-kube-api-access-pk455\") pod \"test-operator-controller-manager-b4c496f69-55bp6\" (UID: \"75667626-7d6a-46d0-b0b2-f627257967f4\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.680293 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.732062 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.735091 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.735870 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.738098 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.739376 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.739472 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk455\" (UniqueName: \"kubernetes.io/projected/75667626-7d6a-46d0-b0b2-f627257967f4-kube-api-access-pk455\") pod \"test-operator-controller-manager-b4c496f69-55bp6\" (UID: \"75667626-7d6a-46d0-b0b2-f627257967f4\") " pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.746136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhjss\" (UniqueName: \"kubernetes.io/projected/904fdb49-cc2c-443c-af9e-950b648018e9-kube-api-access-bhjss\") pod \"telemetry-operator-controller-manager-6d4bf84b58-rssfb\" (UID: \"904fdb49-cc2c-443c-af9e-950b648018e9\") " pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.761177 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.789124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2whtm\" (UniqueName: \"kubernetes.io/projected/5c593905-2ee5-4990-9f9c-85ca81f38319-kube-api-access-2whtm\") pod \"watcher-operator-controller-manager-8c6448b9f-j6vgx\" (UID: \"5c593905-2ee5-4990-9f9c-85ca81f38319\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.789243 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b247b139-5fdf-426f-8ca6-6bcb58585963-cert\") pod \"openstack-operator-controller-manager-654fc8b94c-cp8qz\" (UID: \"b247b139-5fdf-426f-8ca6-6bcb58585963\") " pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.789296 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-cert\") pod \"infra-operator-controller-manager-6dd8864d7c-gn9b8\" (UID: \"25fd4b24-83f2-4a02-b086-ca0f03cb42a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.789424 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nls58\" (UniqueName: \"kubernetes.io/projected/b247b139-5fdf-426f-8ca6-6bcb58585963-kube-api-access-nls58\") pod \"openstack-operator-controller-manager-654fc8b94c-cp8qz\" (UID: \"b247b139-5fdf-426f-8ca6-6bcb58585963\") " pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:38:48 crc kubenswrapper[4743]: E1122 08:38:48.790066 4743 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Nov 22 08:38:48 crc kubenswrapper[4743]: E1122 08:38:48.790129 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-cert podName:25fd4b24-83f2-4a02-b086-ca0f03cb42a3 nodeName:}" failed. 
No retries permitted until 2025-11-22 08:38:49.790110815 +0000 UTC m=+1003.496471867 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-cert") pod "infra-operator-controller-manager-6dd8864d7c-gn9b8" (UID: "25fd4b24-83f2-4a02-b086-ca0f03cb42a3") : secret "infra-operator-webhook-server-cert" not found Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.818655 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.819798 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.826249 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2whtm\" (UniqueName: \"kubernetes.io/projected/5c593905-2ee5-4990-9f9c-85ca81f38319-kube-api-access-2whtm\") pod \"watcher-operator-controller-manager-8c6448b9f-j6vgx\" (UID: \"5c593905-2ee5-4990-9f9c-85ca81f38319\") " pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.827348 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-bcnzr" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.831736 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.840453 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.877598 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5"] Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.890138 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b247b139-5fdf-426f-8ca6-6bcb58585963-cert\") pod \"openstack-operator-controller-manager-654fc8b94c-cp8qz\" (UID: \"b247b139-5fdf-426f-8ca6-6bcb58585963\") " pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.890206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nls58\" (UniqueName: \"kubernetes.io/projected/b247b139-5fdf-426f-8ca6-6bcb58585963-kube-api-access-nls58\") pod \"openstack-operator-controller-manager-654fc8b94c-cp8qz\" (UID: \"b247b139-5fdf-426f-8ca6-6bcb58585963\") " pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.890262 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szpxm\" (UniqueName: \"kubernetes.io/projected/36e85576-c481-4424-aa1c-21a18036d239-kube-api-access-szpxm\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5\" (UID: \"36e85576-c481-4424-aa1c-21a18036d239\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" Nov 22 08:38:48 crc kubenswrapper[4743]: E1122 08:38:48.890312 4743 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Nov 22 08:38:48 crc kubenswrapper[4743]: E1122 08:38:48.890390 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b247b139-5fdf-426f-8ca6-6bcb58585963-cert podName:b247b139-5fdf-426f-8ca6-6bcb58585963 nodeName:}" failed. No retries permitted until 2025-11-22 08:38:49.390367085 +0000 UTC m=+1003.096728197 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b247b139-5fdf-426f-8ca6-6bcb58585963-cert") pod "openstack-operator-controller-manager-654fc8b94c-cp8qz" (UID: "b247b139-5fdf-426f-8ca6-6bcb58585963") : secret "webhook-server-cert" not found Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.895949 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.920511 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nls58\" (UniqueName: \"kubernetes.io/projected/b247b139-5fdf-426f-8ca6-6bcb58585963-kube-api-access-nls58\") pod \"openstack-operator-controller-manager-654fc8b94c-cp8qz\" (UID: \"b247b139-5fdf-426f-8ca6-6bcb58585963\") " pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.945437 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.969130 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" Nov 22 08:38:48 crc kubenswrapper[4743]: I1122 08:38:48.991777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szpxm\" (UniqueName: \"kubernetes.io/projected/36e85576-c481-4424-aa1c-21a18036d239-kube-api-access-szpxm\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5\" (UID: \"36e85576-c481-4424-aa1c-21a18036d239\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.021689 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz"] Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.043455 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szpxm\" (UniqueName: \"kubernetes.io/projected/36e85576-c481-4424-aa1c-21a18036d239-kube-api-access-szpxm\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5\" (UID: \"36e85576-c481-4424-aa1c-21a18036d239\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.100532 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4\" (UID: \"f5f27cf7-eaa5-4b71-84a6-94fac3920d39\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:38:49 crc kubenswrapper[4743]: E1122 08:38:49.100743 4743 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 08:38:49 crc kubenswrapper[4743]: E1122 08:38:49.100786 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-cert podName:f5f27cf7-eaa5-4b71-84a6-94fac3920d39 nodeName:}" failed. No retries permitted until 2025-11-22 08:38:50.100772762 +0000 UTC m=+1003.807133814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-cert") pod "openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" (UID: "f5f27cf7-eaa5-4b71-84a6-94fac3920d39") : secret "openstack-baremetal-operator-webhook-server-cert" not found Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.173791 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.247647 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4"] Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.264987 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln"] Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.275489 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss"] Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.411157 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b247b139-5fdf-426f-8ca6-6bcb58585963-cert\") pod \"openstack-operator-controller-manager-654fc8b94c-cp8qz\" (UID: \"b247b139-5fdf-426f-8ca6-6bcb58585963\") " pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.423997 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn"] Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.429269 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b247b139-5fdf-426f-8ca6-6bcb58585963-cert\") pod \"openstack-operator-controller-manager-654fc8b94c-cp8qz\" (UID: \"b247b139-5fdf-426f-8ca6-6bcb58585963\") " pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.442709 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s"] Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.457416 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp"] Nov 22 08:38:49 crc kubenswrapper[4743]: W1122 08:38:49.471971 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f1b446c_1023_4682_889f_97abca903826.slice/crio-45eac625e6869acb1acd1622175837ad397994fe0c71d57e0f8e58baf0cd44be WatchSource:0}: Error finding container 45eac625e6869acb1acd1622175837ad397994fe0c71d57e0f8e58baf0cd44be: Status 404 returned error can't find the container with id 45eac625e6869acb1acd1622175837ad397994fe0c71d57e0f8e58baf0cd44be Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.591031 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp" event={"ID":"657d8d61-7be7-42a6-8472-2d70e55a8428","Type":"ContainerStarted","Data":"af5590e7890057bf94d037f688214c4c9f3ebf22c8eadbc1d1d13838f41b535a"} Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.596610 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln" event={"ID":"89e94ee4-4365-4e56-a5a2-3d61bbbd8876","Type":"ContainerStarted","Data":"7a081298e31edbdbd01f100b50e25d39f0ef81d9aeddffcf8e48fd4165fc4209"} Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.602746 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz" event={"ID":"a729ba89-b0fe-4363-b4b4-ffe21f0c627c","Type":"ContainerStarted","Data":"110ada090b873ba5960e20b8308093af998ae528db42ebbf048376988e80f5b6"} Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.605065 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn" event={"ID":"d8c50ae0-c8c9-4e87-9130-4c04d5b468ac","Type":"ContainerStarted","Data":"bb35893817ca2f984b702c8b24b483a8d709c4420473c95dca9abb56e0267622"} Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.608476 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s" event={"ID":"9f1b446c-1023-4682-889f-97abca903826","Type":"ContainerStarted","Data":"45eac625e6869acb1acd1622175837ad397994fe0c71d57e0f8e58baf0cd44be"} Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.611061 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4" event={"ID":"e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb","Type":"ContainerStarted","Data":"2c186f9a85a07c01c4c5aebc19a3dc34fab711098fbf2a9a54d5883ba42c0d25"} Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.612743 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss" event={"ID":"f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f","Type":"ContainerStarted","Data":"2d0315e44c3a598bfd2d552ad86834a9aac87d6ebed969125cd18f2e579a81cf"} Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.614460 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5" event={"ID":"81cdc04a-86d9-488d-b854-d941f3f5632e","Type":"ContainerStarted","Data":"1b5b406cde26fd4fcf539adfc02f214cd5232acf31c604aab81e3904ed22d0a5"} Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.638154 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.644356 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k"] Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.686263 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m"] Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.817331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-cert\") pod \"infra-operator-controller-manager-6dd8864d7c-gn9b8\" (UID: \"25fd4b24-83f2-4a02-b086-ca0f03cb42a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.823292 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/25fd4b24-83f2-4a02-b086-ca0f03cb42a3-cert\") pod \"infra-operator-controller-manager-6dd8864d7c-gn9b8\" (UID: \"25fd4b24-83f2-4a02-b086-ca0f03cb42a3\") " pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:38:49 crc kubenswrapper[4743]: I1122 08:38:49.827568 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.068816 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z"] Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.080040 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-58f887965d-579sf"] Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.094043 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj"] Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.115502 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-d656998f4-jczhh"] Nov 22 08:38:50 crc kubenswrapper[4743]: W1122 08:38:50.115870 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50b5cda8_859c_49f0_92aa_601c16eb9a2a.slice/crio-64ec4fe5c0b14c7612b633856f89285512702c45b128824bd029c2d0d154365e WatchSource:0}: Error finding container 64ec4fe5c0b14c7612b633856f89285512702c45b128824bd029c2d0d154365e: Status 404 returned error can't find the container with id 64ec4fe5c0b14c7612b633856f89285512702c45b128824bd029c2d0d154365e Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.122247 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4\" (UID: \"f5f27cf7-eaa5-4b71-84a6-94fac3920d39\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.134140 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk"] Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.137711 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-b4c496f69-55bp6"] Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.140689 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f5f27cf7-eaa5-4b71-84a6-94fac3920d39-cert\") pod \"openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4\" (UID: \"f5f27cf7-eaa5-4b71-84a6-94fac3920d39\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.155792 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx"] Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.159391 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5"] Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.164898 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb"] Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.174260 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw"] Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 
08:38:50.186150 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8"] Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.194290 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6gh4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-54cfbf4c7d-g6mzk_openstack-operators(0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.194411 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhjss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6d4bf84b58-rssfb_openstack-operators(904fdb49-cc2c-443c-af9e-950b648018e9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.194505 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s6dbd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5b797b8dff-z4flw_openstack-operators(3420c6da-358d-4b5c-a383-e25fbc58a2ee): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.197898 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-szpxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5_openstack-operators(36e85576-c481-4424-aa1c-21a18036d239): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.198691 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pk455,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-b4c496f69-55bp6_openstack-operators(75667626-7d6a-46d0-b0b2-f627257967f4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.198967 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" podUID="36e85576-c481-4424-aa1c-21a18036d239" Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.200002 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:86df58f744c1d23233cc98f6ea17c8d6da637c50003d0fc8c100045594aa9894,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5sqxp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-6dd8864d7c-gn9b8_openstack-operators(25fd4b24-83f2-4a02-b086-ca0f03cb42a3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.291884 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.309318 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz"] Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.607832 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4"] Nov 22 08:38:50 crc kubenswrapper[4743]: W1122 08:38:50.614812 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5f27cf7_eaa5_4b71_84a6_94fac3920d39.slice/crio-7496a4736772d1696e1fa1de11a0de911e2692c8049b8df36bfe480c541e08c3 WatchSource:0}: Error finding container 7496a4736772d1696e1fa1de11a0de911e2692c8049b8df36bfe480c541e08c3: Status 404 returned error can't find the container with id 7496a4736772d1696e1fa1de11a0de911e2692c8049b8df36bfe480c541e08c3 Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.630148 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" event={"ID":"0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970","Type":"ContainerStarted","Data":"537436f88f2cbc8870e9cd6e46f67f270e1678c9b5eaca26e405f2605567ff57"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.630196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" event={"ID":"0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970","Type":"ContainerStarted","Data":"7c9cd54bc739ed639f765985c17126198e6e87b4170d5732a1343abc529d14cd"} Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.637911 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" podUID="0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970" Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.641350 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" event={"ID":"904fdb49-cc2c-443c-af9e-950b648018e9","Type":"ContainerStarted","Data":"0e49b5616787ae6cdcedfbef6d8398098e30a7ca7dbb3a7a5c092a566b3b4473"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.652728 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" event={"ID":"ac624665-bb51-4c61-b213-cb07bd43eafe","Type":"ContainerStarted","Data":"e3d2ea33b7028888cc5d23f831033cd9cc51a62ee54a16a8a15fec23e1f9307f"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.659975 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" event={"ID":"3420c6da-358d-4b5c-a383-e25fbc58a2ee","Type":"ContainerStarted","Data":"0b9154b7e22f4b4119bd8d21d0b64bae4c76e3787a375c77a678a72a5138b68a"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.668342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" event={"ID":"f5f27cf7-eaa5-4b71-84a6-94fac3920d39","Type":"ContainerStarted","Data":"7496a4736772d1696e1fa1de11a0de911e2692c8049b8df36bfe480c541e08c3"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 
08:38:50.673898 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" event={"ID":"9d349409-980f-4605-bd87-d09fe812dd65","Type":"ContainerStarted","Data":"6f8c5ec9f930dad64f79e336b6375f93629b6fd95310f630e278c8ddb1f380ca"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.677801 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" event={"ID":"5c593905-2ee5-4990-9f9c-85ca81f38319","Type":"ContainerStarted","Data":"4ee9ba9d5917407ae70f55e393a0b6d77cf233dd30b69adf2b3bf901d20dc551"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.678903 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" event={"ID":"4c6b99d5-9791-40db-91fd-d74c80b2e3a7","Type":"ContainerStarted","Data":"ea2688bb550c9826ae6f0692f4bd408de3525cb9ec2c8deccf9af64d90096984"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.680253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" event={"ID":"25fd4b24-83f2-4a02-b086-ca0f03cb42a3","Type":"ContainerStarted","Data":"37e77d310315bc7d17a07da7bb81f125f9b862d2d2c325a09a1cd75071b1d9f1"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.681276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" event={"ID":"1b06dd20-2bb7-4ff2-aa77-997042af333e","Type":"ContainerStarted","Data":"c9e1a92faed50cc98b2592bf9b8c177c3e7e3f5c06c63a2c7eac51a832877487"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.683049 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" event={"ID":"75667626-7d6a-46d0-b0b2-f627257967f4","Type":"ContainerStarted","Data":"18e643ef72f85f6d0da255585f9276589db0fdcd1724538e1a9365766f45bace"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.691825 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" event={"ID":"b247b139-5fdf-426f-8ca6-6bcb58585963","Type":"ContainerStarted","Data":"d787f587d5e711126a268729981b475bfacb9aa0bc8221ffee7a47578168e7c9"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.692832 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" event={"ID":"50b5cda8-859c-49f0-92aa-601c16eb9a2a","Type":"ContainerStarted","Data":"64ec4fe5c0b14c7612b633856f89285512702c45b128824bd029c2d0d154365e"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.695424 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" event={"ID":"9f7a1db7-e801-4da6-b64b-f3babcfcd9c6","Type":"ContainerStarted","Data":"bb57c13f4dd0593c9ee4bcd6b0d4599e127d76b67d624f84e4a303c51c35f012"} Nov 22 08:38:50 crc kubenswrapper[4743]: I1122 08:38:50.696944 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" event={"ID":"36e85576-c481-4424-aa1c-21a18036d239","Type":"ContainerStarted","Data":"93e736ac8eeaaf796703ed8533bd2536249cf48d98bbe52441ca057efa6836c0"} Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.703929 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" podUID="36e85576-c481-4424-aa1c-21a18036d239" Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.736005 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" podUID="75667626-7d6a-46d0-b0b2-f627257967f4" Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.749941 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" podUID="25fd4b24-83f2-4a02-b086-ca0f03cb42a3" Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.750413 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" podUID="904fdb49-cc2c-443c-af9e-950b648018e9" Nov 22 08:38:50 crc kubenswrapper[4743]: E1122 08:38:50.784477 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" podUID="3420c6da-358d-4b5c-a383-e25fbc58a2ee" Nov 22 08:38:51 crc kubenswrapper[4743]: I1122 08:38:51.705118 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" event={"ID":"25fd4b24-83f2-4a02-b086-ca0f03cb42a3","Type":"ContainerStarted","Data":"ba7df5a9f612c8db3f3fe9dc3b8e9513d5064d68ca5fd7ed6983c40cd539729c"} Nov 22 08:38:51 crc kubenswrapper[4743]: E1122 08:38:51.707970 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:86df58f744c1d23233cc98f6ea17c8d6da637c50003d0fc8c100045594aa9894\\\"\"" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" podUID="25fd4b24-83f2-4a02-b086-ca0f03cb42a3" Nov 22 08:38:51 crc kubenswrapper[4743]: I1122 08:38:51.708549 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" event={"ID":"904fdb49-cc2c-443c-af9e-950b648018e9","Type":"ContainerStarted","Data":"b0883c390019e34c136edd01ce8f2424cf2cf872b60d6d0b597b638d7d7721ff"} Nov 22 08:38:51 crc kubenswrapper[4743]: E1122 08:38:51.709948 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" podUID="904fdb49-cc2c-443c-af9e-950b648018e9" Nov 22 08:38:51 crc kubenswrapper[4743]: I1122 08:38:51.710913 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" 
event={"ID":"75667626-7d6a-46d0-b0b2-f627257967f4","Type":"ContainerStarted","Data":"55eea82e3b4a10752a172c2adea2d467888485e1799fd6b86456d692ead6ac91"} Nov 22 08:38:51 crc kubenswrapper[4743]: E1122 08:38:51.712137 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" podUID="75667626-7d6a-46d0-b0b2-f627257967f4" Nov 22 08:38:51 crc kubenswrapper[4743]: I1122 08:38:51.713641 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" event={"ID":"b247b139-5fdf-426f-8ca6-6bcb58585963","Type":"ContainerStarted","Data":"643afa0a73acea58f81fa78d7e7beaa7454c984c7bc806ea88dfee3bda3d6d98"} Nov 22 08:38:51 crc kubenswrapper[4743]: I1122 08:38:51.713673 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" event={"ID":"b247b139-5fdf-426f-8ca6-6bcb58585963","Type":"ContainerStarted","Data":"ce594d6eceb4c4dddf6316c8e2f421fd5a2875d4c5a5ac17dc4beab6f2b83b55"} Nov 22 08:38:51 crc kubenswrapper[4743]: I1122 08:38:51.713927 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:38:51 crc kubenswrapper[4743]: I1122 08:38:51.716157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" event={"ID":"3420c6da-358d-4b5c-a383-e25fbc58a2ee","Type":"ContainerStarted","Data":"aa913f13a0b6de0fb1f3f187f3d07e76aa99df4e45569e401b9078d096e56558"} Nov 22 08:38:51 crc kubenswrapper[4743]: E1122 08:38:51.718640 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:442c269d79163f8da75505019c02e9f0815837aaadcaddacb8e6c12df297ca13\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" podUID="0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970" Nov 22 08:38:51 crc kubenswrapper[4743]: E1122 08:38:51.718759 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" podUID="3420c6da-358d-4b5c-a383-e25fbc58a2ee" Nov 22 08:38:51 crc kubenswrapper[4743]: E1122 08:38:51.719768 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" podUID="36e85576-c481-4424-aa1c-21a18036d239" Nov 22 08:38:51 crc kubenswrapper[4743]: I1122 08:38:51.762825 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" 
podStartSLOduration=3.762809 podStartE2EDuration="3.762809s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:38:51.757976421 +0000 UTC m=+1005.464337473" watchObservedRunningTime="2025-11-22 08:38:51.762809 +0000 UTC m=+1005.469170042" Nov 22 08:38:52 crc kubenswrapper[4743]: E1122 08:38:52.731233 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:4094e7fc11a33e8e2b6768a053cafaf5b122446d23f9113d43d520cb64e9776c\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" podUID="3420c6da-358d-4b5c-a383-e25fbc58a2ee" Nov 22 08:38:52 crc kubenswrapper[4743]: E1122 08:38:52.731449 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:86df58f744c1d23233cc98f6ea17c8d6da637c50003d0fc8c100045594aa9894\\\"\"" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" podUID="25fd4b24-83f2-4a02-b086-ca0f03cb42a3" Nov 22 08:38:52 crc kubenswrapper[4743]: E1122 08:38:52.731791 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:82207e753574d4be246f86c4b074500d66cf20214aa80f0a8525cf3287a35e6d\\\"\"" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" podUID="75667626-7d6a-46d0-b0b2-f627257967f4" Nov 22 08:38:52 crc kubenswrapper[4743]: E1122 08:38:52.733096 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:5324a6d2f76fc3041023b0cbd09a733ef2b59f310d390e4d6483d219eb96494f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" podUID="904fdb49-cc2c-443c-af9e-950b648018e9" Nov 22 08:38:59 crc kubenswrapper[4743]: I1122 08:38:59.646047 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-654fc8b94c-cp8qz" Nov 22 08:39:01 crc kubenswrapper[4743]: I1122 08:39:01.241896 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:39:01 crc kubenswrapper[4743]: I1122 08:39:01.242275 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:39:02 crc kubenswrapper[4743]: E1122 08:39:02.922411 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd" Nov 22 08:39:02 crc kubenswrapper[4743]: E1122 08:39:02.923131 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Nam
e:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/promet
heus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_
IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},Env
Var{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vvw2k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4_openstack-operators(f5f27cf7-eaa5-4b71-84a6-94fac3920d39): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:39:04 crc kubenswrapper[4743]: E1122 08:39:04.404409 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a" Nov 22 08:39:04 crc kubenswrapper[4743]: E1122 08:39:04.404683 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7brvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-58f887965d-579sf_openstack-operators(50b5cda8-859c-49f0-92aa-601c16eb9a2a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:39:05 crc kubenswrapper[4743]: E1122 08:39:05.004258 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b" Nov 22 08:39:05 crc kubenswrapper[4743]: E1122 08:39:05.004860 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wfm6t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-54fc5f65b7-cv2tj_openstack-operators(4c6b99d5-9791-40db-91fd-d74c80b2e3a7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:39:05 crc kubenswrapper[4743]: E1122 08:39:05.578461 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04" Nov 22 08:39:05 crc kubenswrapper[4743]: E1122 08:39:05.578741 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gqx5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-54b5986bb8-vt47z_openstack-operators(9d349409-980f-4605-bd87-d09fe812dd65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:39:06 crc kubenswrapper[4743]: E1122 08:39:06.090437 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f" Nov 22 08:39:06 crc kubenswrapper[4743]: E1122 08:39:06.090670 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2whtm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-8c6448b9f-j6vgx_openstack-operators(5c593905-2ee5-4990-9f9c-85ca81f38319): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:39:06 crc kubenswrapper[4743]: E1122 08:39:06.649033 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:553b1288b330ad05771d59c6b73c1681c95f457e8475682f9ad0d2e6b85f37e9" Nov 22 08:39:06 crc kubenswrapper[4743]: E1122 08:39:06.649343 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:553b1288b330ad05771d59c6b73c1681c95f457e8475682f9ad0d2e6b85f37e9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dpzgs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-6498cbf48f-8vd9m_openstack-operators(ac624665-bb51-4c61-b213-cb07bd43eafe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.181851 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" podUID="f5f27cf7-eaa5-4b71-84a6-94fac3920d39" Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.182259 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" podUID="5c593905-2ee5-4990-9f9c-85ca81f38319" Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.182340 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" podUID="50b5cda8-859c-49f0-92aa-601c16eb9a2a" Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.183289 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" podUID="9d349409-980f-4605-bd87-d09fe812dd65" Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.183416 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" podUID="ac624665-bb51-4c61-b213-cb07bd43eafe" Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.266652 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" podUID="4c6b99d5-9791-40db-91fd-d74c80b2e3a7" Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.860504 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4" 
event={"ID":"e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb","Type":"ContainerStarted","Data":"e33dfb309de94917f5988423b16a1e72206bbbfe4073f05b250a2fcb6476366c"} Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.864432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss" event={"ID":"f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f","Type":"ContainerStarted","Data":"fe6bbd5c4624a67021d024d1f37fcc71233cf510a9cb6c896e4effb2e310e8d2"} Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.866734 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz" event={"ID":"a729ba89-b0fe-4363-b4b4-ffe21f0c627c","Type":"ContainerStarted","Data":"dbeac832ef8af1227dcd9378fc4d13ed51b6e5bee8ea247326f9fe3439d774b0"} Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.868238 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s" event={"ID":"9f1b446c-1023-4682-889f-97abca903826","Type":"ContainerStarted","Data":"a42cccfdebac0624c10b74299bce5652e35b8e93702227bb6d562cb26690a10d"} Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.869631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" event={"ID":"5c593905-2ee5-4990-9f9c-85ca81f38319","Type":"ContainerStarted","Data":"168d9c212f0fda3a5a3de76db1fcf99e662b641ed4eb32ff0b9e3353ac5304d4"} Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.873469 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" podUID="5c593905-2ee5-4990-9f9c-85ca81f38319" Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.880429 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" event={"ID":"ac624665-bb51-4c61-b213-cb07bd43eafe","Type":"ContainerStarted","Data":"2a404190cb749057c6e77d15ad61d170b33ed07df4384ca8d0616e6aad9c13c3"} Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.885363 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:553b1288b330ad05771d59c6b73c1681c95f457e8475682f9ad0d2e6b85f37e9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" podUID="ac624665-bb51-4c61-b213-cb07bd43eafe" Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.888421 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5" event={"ID":"81cdc04a-86d9-488d-b854-d941f3f5632e","Type":"ContainerStarted","Data":"b33f6e83f627a9e5a2da52c9140f1e0670a583becdab69b114cf9a5f347c537e"} Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.898531 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" event={"ID":"9d349409-980f-4605-bd87-d09fe812dd65","Type":"ContainerStarted","Data":"7a0042f2100400bc63a00a7f91fca16fdabc81853efa23c117c9a1977720d3eb"} Nov 22 08:39:08 crc 
kubenswrapper[4743]: E1122 08:39:08.899851 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" podUID="9d349409-980f-4605-bd87-d09fe812dd65" Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.900788 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln" event={"ID":"89e94ee4-4365-4e56-a5a2-3d61bbbd8876","Type":"ContainerStarted","Data":"e615f14419f04fbcd889d6ba8c642430eb4e33f654d26c9503ad4c4083cf2ca9"} Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.904458 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn" event={"ID":"d8c50ae0-c8c9-4e87-9130-4c04d5b468ac","Type":"ContainerStarted","Data":"d71ca4dbdb0dd7a4dff853231fd7ae536e160185e174878dd4cd29c0e0375ba2"} Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.913822 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp" event={"ID":"657d8d61-7be7-42a6-8472-2d70e55a8428","Type":"ContainerStarted","Data":"42be98846edb7b46dc2596ebc4525a03a6111f2d612711359cfa9c41d0a1fe8d"} Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.916189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" event={"ID":"f5f27cf7-eaa5-4b71-84a6-94fac3920d39","Type":"ContainerStarted","Data":"a098fe4f0848d9d2507ab0f4ed423523904949154f64c26a61eaec386ec2a602"} Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.918164 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" podUID="f5f27cf7-eaa5-4b71-84a6-94fac3920d39" Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.919297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" event={"ID":"50b5cda8-859c-49f0-92aa-601c16eb9a2a","Type":"ContainerStarted","Data":"141075c962617ca94b8cd56222085fb58ab71dc8a5c0ddaff55f1f48f5476a12"} Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.920917 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" podUID="50b5cda8-859c-49f0-92aa-601c16eb9a2a" Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.921948 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" event={"ID":"1b06dd20-2bb7-4ff2-aa77-997042af333e","Type":"ContainerStarted","Data":"d2184293b169ead6e4c77142d9db3823dcd3610ee0ecb1908f3124e7909a095e"} Nov 22 08:39:08 crc 
kubenswrapper[4743]: I1122 08:39:08.926673 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" event={"ID":"4c6b99d5-9791-40db-91fd-d74c80b2e3a7","Type":"ContainerStarted","Data":"a0ce5f4b5039ad01e02867dad4988228d712f8007d925d746bb86030d002b08f"} Nov 22 08:39:08 crc kubenswrapper[4743]: E1122 08:39:08.929224 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" podUID="4c6b99d5-9791-40db-91fd-d74c80b2e3a7" Nov 22 08:39:08 crc kubenswrapper[4743]: I1122 08:39:08.932141 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" event={"ID":"9f7a1db7-e801-4da6-b64b-f3babcfcd9c6","Type":"ContainerStarted","Data":"e0108435bdcc1fc4c0ba44db223a6561375f8d3e70fd4936e8f240c5587195dc"} Nov 22 08:39:09 crc kubenswrapper[4743]: E1122 08:39:09.940028 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4838402d41d42c56613d43dc5041aae475a2b18e6172491d6c4d4a78a580697f\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" podUID="5c593905-2ee5-4990-9f9c-85ca81f38319" Nov 22 08:39:09 crc kubenswrapper[4743]: E1122 08:39:09.941773 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:78852f8ba332a5756c1551c126157f735279101a0fc3277ba4aa4db3478789dd\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" podUID="f5f27cf7-eaa5-4b71-84a6-94fac3920d39" Nov 22 08:39:09 crc kubenswrapper[4743]: E1122 08:39:09.941816 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:7b90521b9e9cb4eb43c2f1c3bf85dbd068d684315f4f705b07708dd078df9d04\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" podUID="9d349409-980f-4605-bd87-d09fe812dd65" Nov 22 08:39:09 crc kubenswrapper[4743]: E1122 08:39:09.941851 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:b749a5dd8bc718875c3f5e81b38d54d003be77ab92de4a3e9f9595566496a58a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" podUID="50b5cda8-859c-49f0-92aa-601c16eb9a2a" Nov 22 08:39:09 crc kubenswrapper[4743]: E1122 08:39:09.941916 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:5d49d4594c66eda7b151746cc6e1d3c67c0129b4503eeb043a64ae8ec2da6a1b\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" podUID="4c6b99d5-9791-40db-91fd-d74c80b2e3a7" Nov 22 08:39:09 crc 
kubenswrapper[4743]: E1122 08:39:09.942131 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:553b1288b330ad05771d59c6b73c1681c95f457e8475682f9ad0d2e6b85f37e9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" podUID="ac624665-bb51-4c61-b213-cb07bd43eafe" Nov 22 08:39:15 crc kubenswrapper[4743]: I1122 08:39:15.981009 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln" event={"ID":"89e94ee4-4365-4e56-a5a2-3d61bbbd8876","Type":"ContainerStarted","Data":"26064ee226bcc3bbb72bde20e210c132c606d51e3d9cbb5e9acbb36dca97675a"} Nov 22 08:39:15 crc kubenswrapper[4743]: I1122 08:39:15.981656 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln" Nov 22 08:39:15 crc kubenswrapper[4743]: I1122 08:39:15.983650 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln" Nov 22 08:39:16 crc kubenswrapper[4743]: I1122 08:39:16.000459 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-99b499f4-xbmln" podStartSLOduration=10.906471598 podStartE2EDuration="29.00044348s" podCreationTimestamp="2025-11-22 08:38:47 +0000 UTC" firstStartedPulling="2025-11-22 08:38:49.315407701 +0000 UTC m=+1003.021768753" lastFinishedPulling="2025-11-22 08:39:07.409379583 +0000 UTC m=+1021.115740635" observedRunningTime="2025-11-22 08:39:15.997794634 +0000 UTC m=+1029.704155706" watchObservedRunningTime="2025-11-22 08:39:16.00044348 +0000 UTC m=+1029.706804532" Nov 22 08:39:17 crc kubenswrapper[4743]: I1122 08:39:17.993529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" event={"ID":"904fdb49-cc2c-443c-af9e-950b648018e9","Type":"ContainerStarted","Data":"225abcf020335181c68882476dd559919c2c8eaadf82c844327e0f0386a0e06f"} Nov 22 08:39:17 crc kubenswrapper[4743]: I1122 08:39:17.994026 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" Nov 22 08:39:17 crc kubenswrapper[4743]: I1122 08:39:17.995671 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn" event={"ID":"d8c50ae0-c8c9-4e87-9130-4c04d5b468ac","Type":"ContainerStarted","Data":"648c367a3659c0c6757b7be58a28750e7c956107ca26af3a5864212602e54b69"} Nov 22 08:39:17 crc kubenswrapper[4743]: I1122 08:39:17.996548 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn" Nov 22 08:39:17 crc kubenswrapper[4743]: I1122 08:39:17.998363 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" event={"ID":"3420c6da-358d-4b5c-a383-e25fbc58a2ee","Type":"ContainerStarted","Data":"a07f3190c77bb28d1e0a866df42fcaa599fa0ca07aecbf99b37796502da62bae"} Nov 22 08:39:17 crc kubenswrapper[4743]: I1122 08:39:17.998876 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.000045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz" event={"ID":"a729ba89-b0fe-4363-b4b4-ffe21f0c627c","Type":"ContainerStarted","Data":"944cb6a27d24b417fcab7b9ee1e3ecddd5f02c75a037e2b5722ad6e6e013e451"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.000743 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.000983 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.002126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" event={"ID":"9f7a1db7-e801-4da6-b64b-f3babcfcd9c6","Type":"ContainerStarted","Data":"59bf95cb0eaca096cfa6bda0b831341e5f7e4f832b00e93859fa3d577b7a4bb3"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.002826 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.002918 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.008009 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.009997 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss" event={"ID":"f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f","Type":"ContainerStarted","Data":"07b6fa9053f23ad095b8d7e65307217559cb8c551def726d4b7bdaac804f7ab4"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.010258 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.012305 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.012524 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" event={"ID":"36e85576-c481-4424-aa1c-21a18036d239","Type":"ContainerStarted","Data":"9912ac81f0333a4c167c922c077ae82b089bcd596f029cf594ed44a7d446ae0a"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.012600 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" podStartSLOduration=3.316386266 podStartE2EDuration="30.012570239s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.194360997 +0000 UTC m=+1003.900722049" lastFinishedPulling="2025-11-22 08:39:16.89054498 +0000 UTC m=+1030.596906022" observedRunningTime="2025-11-22 08:39:18.007355138 +0000 UTC m=+1031.713716200" 
watchObservedRunningTime="2025-11-22 08:39:18.012570239 +0000 UTC m=+1031.718931281" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.015436 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" event={"ID":"25fd4b24-83f2-4a02-b086-ca0f03cb42a3","Type":"ContainerStarted","Data":"6c67afdb2a8383eb1f50475401fb3a737609d49d5818d186a4c09439fa3ba70f"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.015855 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.017224 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" event={"ID":"0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970","Type":"ContainerStarted","Data":"7f33097253254a25ebf319f0974c8ab5b69765a317353182be5fa0808395a8a5"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.017642 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.018903 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" event={"ID":"75667626-7d6a-46d0-b0b2-f627257967f4","Type":"ContainerStarted","Data":"3b6dbe270f53a7d48c14af36643d9eb32c3cd1cf533f3a230effd2399714611f"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.019241 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.021233 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4" event={"ID":"e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb","Type":"ContainerStarted","Data":"65043e9f9404b2237966c675af2995941998790c611167e2c2f51e2811aaa985"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.024010 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.026198 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.027456 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" event={"ID":"1b06dd20-2bb7-4ff2-aa77-997042af333e","Type":"ContainerStarted","Data":"ad6d563cfd266581f57e4504f711e83bba814aee0d7f4aafaa70daa7675d7852"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.028348 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.029433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5" event={"ID":"81cdc04a-86d9-488d-b854-d941f3f5632e","Type":"ContainerStarted","Data":"4244880713b93e5923584ad4aa36ec2486325b47d0c5c136f2bc31585dc8ad64"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.029396 4743 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" podStartSLOduration=3.271165408 podStartE2EDuration="30.029380055s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.19445859 +0000 UTC m=+1003.900819642" lastFinishedPulling="2025-11-22 08:39:16.952673237 +0000 UTC m=+1030.659034289" observedRunningTime="2025-11-22 08:39:18.025939745 +0000 UTC m=+1031.732300797" watchObservedRunningTime="2025-11-22 08:39:18.029380055 +0000 UTC m=+1031.735741107" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.030435 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.031161 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.033332 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.033805 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp" event={"ID":"657d8d61-7be7-42a6-8472-2d70e55a8428","Type":"ContainerStarted","Data":"3d45eb087870a46e1547c8d1e0ec664124f579abef768270f790ac908635d1a1"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.035325 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.037746 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.044356 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s" event={"ID":"9f1b446c-1023-4682-889f-97abca903826","Type":"ContainerStarted","Data":"0ed37df30c37511bf1491a4add9c0e8d182ee09b6ca11defec37a46e9402a133"} Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.044388 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.047825 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.052355 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-d656998f4-jczhh" podStartSLOduration=14.196308227 podStartE2EDuration="30.052330579s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.193873903 +0000 UTC m=+1003.900234955" lastFinishedPulling="2025-11-22 08:39:06.049896235 +0000 UTC m=+1019.756257307" observedRunningTime="2025-11-22 08:39:18.046882161 +0000 UTC m=+1031.753243213" watchObservedRunningTime="2025-11-22 08:39:18.052330579 +0000 UTC m=+1031.758691641" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.076840 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/glance-operator-controller-manager-7969689c84-k5rhz" podStartSLOduration=14.137545769 podStartE2EDuration="31.076818417s" podCreationTimestamp="2025-11-22 08:38:47 +0000 UTC" firstStartedPulling="2025-11-22 08:38:49.110598226 +0000 UTC m=+1002.816959278" lastFinishedPulling="2025-11-22 08:39:06.049870874 +0000 UTC m=+1019.756231926" observedRunningTime="2025-11-22 08:39:18.074053077 +0000 UTC m=+1031.780414129" watchObservedRunningTime="2025-11-22 08:39:18.076818417 +0000 UTC m=+1031.783179469" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.122059 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7454b96578-ktmd4" podStartSLOduration=12.018518987 podStartE2EDuration="30.122040755s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:49.311959251 +0000 UTC m=+1003.018320303" lastFinishedPulling="2025-11-22 08:39:07.415481019 +0000 UTC m=+1021.121842071" observedRunningTime="2025-11-22 08:39:18.120624844 +0000 UTC m=+1031.826985886" watchObservedRunningTime="2025-11-22 08:39:18.122040755 +0000 UTC m=+1031.828401807" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.122332 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-78bd47f458-klfvn" podStartSLOduration=12.187583078 podStartE2EDuration="30.122326984s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:49.446241366 +0000 UTC m=+1003.152602418" lastFinishedPulling="2025-11-22 08:39:07.380985272 +0000 UTC m=+1021.087346324" observedRunningTime="2025-11-22 08:39:18.09904679 +0000 UTC m=+1031.805407842" watchObservedRunningTime="2025-11-22 08:39:18.122326984 +0000 UTC m=+1031.828688056" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.150042 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-75fb479bcc-xts6s" podStartSLOduration=13.243590597 podStartE2EDuration="31.150022174s" podCreationTimestamp="2025-11-22 08:38:47 +0000 UTC" firstStartedPulling="2025-11-22 08:38:49.474106332 +0000 UTC m=+1003.180467384" lastFinishedPulling="2025-11-22 08:39:07.380537909 +0000 UTC m=+1021.086898961" observedRunningTime="2025-11-22 08:39:18.148700616 +0000 UTC m=+1031.855061668" watchObservedRunningTime="2025-11-22 08:39:18.150022174 +0000 UTC m=+1031.856383226" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.171608 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-56f54d6746-7bcj5" podStartSLOduration=12.799046136 podStartE2EDuration="31.171566887s" podCreationTimestamp="2025-11-22 08:38:47 +0000 UTC" firstStartedPulling="2025-11-22 08:38:49.007756961 +0000 UTC m=+1002.714118013" lastFinishedPulling="2025-11-22 08:39:07.380277712 +0000 UTC m=+1021.086638764" observedRunningTime="2025-11-22 08:39:18.167070147 +0000 UTC m=+1031.873431199" watchObservedRunningTime="2025-11-22 08:39:18.171566887 +0000 UTC m=+1031.877927939" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.192695 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-598f69df5d-6wnss" podStartSLOduration=13.157734533 podStartE2EDuration="31.192677418s" podCreationTimestamp="2025-11-22 08:38:47 +0000 UTC" firstStartedPulling="2025-11-22 
08:38:49.344033889 +0000 UTC m=+1003.050394941" lastFinishedPulling="2025-11-22 08:39:07.378976774 +0000 UTC m=+1021.085337826" observedRunningTime="2025-11-22 08:39:18.185093778 +0000 UTC m=+1031.891454830" watchObservedRunningTime="2025-11-22 08:39:18.192677418 +0000 UTC m=+1031.899038470" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.217900 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" podStartSLOduration=3.562038711 podStartE2EDuration="30.217875997s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.19862718 +0000 UTC m=+1003.904988232" lastFinishedPulling="2025-11-22 08:39:16.854464466 +0000 UTC m=+1030.560825518" observedRunningTime="2025-11-22 08:39:18.216219529 +0000 UTC m=+1031.922580591" watchObservedRunningTime="2025-11-22 08:39:18.217875997 +0000 UTC m=+1031.924237049" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.247543 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" podStartSLOduration=9.243522149 podStartE2EDuration="30.247520594s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.194183842 +0000 UTC m=+1003.900544974" lastFinishedPulling="2025-11-22 08:39:11.198182367 +0000 UTC m=+1024.904543419" observedRunningTime="2025-11-22 08:39:18.233053786 +0000 UTC m=+1031.939414858" watchObservedRunningTime="2025-11-22 08:39:18.247520594 +0000 UTC m=+1031.953881646" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.266634 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" podStartSLOduration=4.551758324 podStartE2EDuration="31.266609537s" podCreationTimestamp="2025-11-22 08:38:47 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.199804564 +0000 UTC m=+1003.906165616" lastFinishedPulling="2025-11-22 08:39:16.914655777 +0000 UTC m=+1030.621016829" observedRunningTime="2025-11-22 08:39:18.261676514 +0000 UTC m=+1031.968037576" watchObservedRunningTime="2025-11-22 08:39:18.266609537 +0000 UTC m=+1031.972970599" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.302635 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-767ccfd65f-kcrpp" podStartSLOduration=13.482394744 podStartE2EDuration="31.302611598s" podCreationTimestamp="2025-11-22 08:38:47 +0000 UTC" firstStartedPulling="2025-11-22 08:38:49.560871151 +0000 UTC m=+1003.267232203" lastFinishedPulling="2025-11-22 08:39:07.381088005 +0000 UTC m=+1021.087449057" observedRunningTime="2025-11-22 08:39:18.280432807 +0000 UTC m=+1031.986793859" watchObservedRunningTime="2025-11-22 08:39:18.302611598 +0000 UTC m=+1032.008972650" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.317649 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5" podStartSLOduration=3.624846699 podStartE2EDuration="30.317631693s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.197823757 +0000 UTC m=+1003.904184809" lastFinishedPulling="2025-11-22 08:39:16.890608741 +0000 UTC m=+1030.596969803" observedRunningTime="2025-11-22 08:39:18.302247528 +0000 UTC m=+1032.008608580" watchObservedRunningTime="2025-11-22 
08:39:18.317631693 +0000 UTC m=+1032.023992745" Nov 22 08:39:18 crc kubenswrapper[4743]: I1122 08:39:18.334536 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-cfbb9c588-j682k" podStartSLOduration=12.632652962 podStartE2EDuration="30.334513181s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:49.713786585 +0000 UTC m=+1003.420147637" lastFinishedPulling="2025-11-22 08:39:07.415646804 +0000 UTC m=+1021.122007856" observedRunningTime="2025-11-22 08:39:18.326409777 +0000 UTC m=+1032.032770839" watchObservedRunningTime="2025-11-22 08:39:18.334513181 +0000 UTC m=+1032.040874233" Nov 22 08:39:22 crc kubenswrapper[4743]: I1122 08:39:22.078937 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" event={"ID":"ac624665-bb51-4c61-b213-cb07bd43eafe","Type":"ContainerStarted","Data":"70c5ea6365064f21b55eb895471d14113029b0a3f591a9295401a738fb778a68"} Nov 22 08:39:22 crc kubenswrapper[4743]: I1122 08:39:22.080938 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" Nov 22 08:39:22 crc kubenswrapper[4743]: I1122 08:39:22.106458 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" podStartSLOduration=3.081458601 podStartE2EDuration="35.106444825s" podCreationTimestamp="2025-11-22 08:38:47 +0000 UTC" firstStartedPulling="2025-11-22 08:38:49.75475269 +0000 UTC m=+1003.461113742" lastFinishedPulling="2025-11-22 08:39:21.779738914 +0000 UTC m=+1035.486099966" observedRunningTime="2025-11-22 08:39:22.105980952 +0000 UTC m=+1035.812342004" watchObservedRunningTime="2025-11-22 08:39:22.106444825 +0000 UTC m=+1035.812805877" Nov 22 08:39:23 crc kubenswrapper[4743]: I1122 08:39:23.086189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" event={"ID":"4c6b99d5-9791-40db-91fd-d74c80b2e3a7","Type":"ContainerStarted","Data":"c992f7f68343b14aee40d0d8946d44b5e899b6094a962577229066e4fa355978"} Nov 22 08:39:23 crc kubenswrapper[4743]: I1122 08:39:23.107098 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" podStartSLOduration=3.051221915 podStartE2EDuration="35.107079292s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.190199757 +0000 UTC m=+1003.896560809" lastFinishedPulling="2025-11-22 08:39:22.246057134 +0000 UTC m=+1035.952418186" observedRunningTime="2025-11-22 08:39:23.100806381 +0000 UTC m=+1036.807167433" watchObservedRunningTime="2025-11-22 08:39:23.107079292 +0000 UTC m=+1036.813440364" Nov 22 08:39:28 crc kubenswrapper[4743]: I1122 08:39:28.497173 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-6498cbf48f-8vd9m" Nov 22 08:39:28 crc kubenswrapper[4743]: I1122 08:39:28.736069 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" Nov 22 08:39:28 crc kubenswrapper[4743]: I1122 08:39:28.738179 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-54fc5f65b7-cv2tj" Nov 22 08:39:28 crc kubenswrapper[4743]: I1122 08:39:28.742729 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-54cfbf4c7d-g6mzk" Nov 22 08:39:28 crc kubenswrapper[4743]: I1122 08:39:28.765196 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5b797b8dff-z4flw" Nov 22 08:39:28 crc kubenswrapper[4743]: I1122 08:39:28.899853 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6d4bf84b58-rssfb" Nov 22 08:39:28 crc kubenswrapper[4743]: I1122 08:39:28.951604 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-b4c496f69-55bp6" Nov 22 08:39:29 crc kubenswrapper[4743]: I1122 08:39:29.832779 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-6dd8864d7c-gn9b8" Nov 22 08:39:31 crc kubenswrapper[4743]: I1122 08:39:31.241738 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:39:31 crc kubenswrapper[4743]: I1122 08:39:31.242131 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:39:31 crc kubenswrapper[4743]: I1122 08:39:31.242189 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:39:31 crc kubenswrapper[4743]: I1122 08:39:31.242925 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86f8f3accbf0662fa413321f99b5f1afa28e63a1e90c6983235baff64b7561bc"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 08:39:31 crc kubenswrapper[4743]: I1122 08:39:31.242995 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://86f8f3accbf0662fa413321f99b5f1afa28e63a1e90c6983235baff64b7561bc" gracePeriod=600 Nov 22 08:39:39 crc kubenswrapper[4743]: I1122 08:39:39.222694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" event={"ID":"5c593905-2ee5-4990-9f9c-85ca81f38319","Type":"ContainerStarted","Data":"1063d1ac386a4d843c73dbd699a5da28190256ca275af176cc40b20f5cfafcfd"} Nov 22 08:39:41 crc kubenswrapper[4743]: I1122 08:39:41.248613 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" 
event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"86f8f3accbf0662fa413321f99b5f1afa28e63a1e90c6983235baff64b7561bc"} Nov 22 08:39:41 crc kubenswrapper[4743]: I1122 08:39:41.248608 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="86f8f3accbf0662fa413321f99b5f1afa28e63a1e90c6983235baff64b7561bc" exitCode=0 Nov 22 08:39:41 crc kubenswrapper[4743]: I1122 08:39:41.248688 4743 scope.go:117] "RemoveContainer" containerID="37c97b2e81f6751d68ee6b6779e8d74c99c6cc572fe2c42aebcabd8215411f9d" Nov 22 08:39:41 crc kubenswrapper[4743]: I1122 08:39:41.248947 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" Nov 22 08:39:41 crc kubenswrapper[4743]: I1122 08:39:41.271112 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" podStartSLOduration=19.881363544 podStartE2EDuration="53.271086729s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.19067854 +0000 UTC m=+1003.897039592" lastFinishedPulling="2025-11-22 08:39:23.580401715 +0000 UTC m=+1037.286762777" observedRunningTime="2025-11-22 08:39:41.267099944 +0000 UTC m=+1054.973461006" watchObservedRunningTime="2025-11-22 08:39:41.271086729 +0000 UTC m=+1054.977447801" Nov 22 08:39:48 crc kubenswrapper[4743]: I1122 08:39:48.973118 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-8c6448b9f-j6vgx" Nov 22 08:39:49 crc kubenswrapper[4743]: I1122 08:39:49.317576 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"5b78c811f3b4d026db1f9c1117668378bf529a268e8a2883a781fcb01b039225"} Nov 22 08:39:49 crc kubenswrapper[4743]: I1122 08:39:49.319561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" event={"ID":"f5f27cf7-eaa5-4b71-84a6-94fac3920d39","Type":"ContainerStarted","Data":"7d9004e67f5189a1f02eebdf5f2d3097ecc45a153f31ee64147e178afa093f91"} Nov 22 08:39:49 crc kubenswrapper[4743]: I1122 08:39:49.319989 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:39:49 crc kubenswrapper[4743]: I1122 08:39:49.321864 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" event={"ID":"50b5cda8-859c-49f0-92aa-601c16eb9a2a","Type":"ContainerStarted","Data":"ef1f6a8d274155e8c733cddd6045cd2cad1fd36e64151fb1c19aa74ccad39128"} Nov 22 08:39:49 crc kubenswrapper[4743]: I1122 08:39:49.322170 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" Nov 22 08:39:49 crc kubenswrapper[4743]: I1122 08:39:49.323750 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" event={"ID":"9d349409-980f-4605-bd87-d09fe812dd65","Type":"ContainerStarted","Data":"43989878aa664db305100fa74c2249e951f2dbf94ce164a346f414b508907b46"} Nov 22 08:39:49 crc kubenswrapper[4743]: I1122 
08:39:49.323957 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" Nov 22 08:39:49 crc kubenswrapper[4743]: I1122 08:39:49.370211 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" podStartSLOduration=3.341251996 podStartE2EDuration="1m1.370192225s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.621081782 +0000 UTC m=+1004.327442834" lastFinishedPulling="2025-11-22 08:39:48.650022011 +0000 UTC m=+1062.356383063" observedRunningTime="2025-11-22 08:39:49.363149131 +0000 UTC m=+1063.069510193" watchObservedRunningTime="2025-11-22 08:39:49.370192225 +0000 UTC m=+1063.076553267" Nov 22 08:39:49 crc kubenswrapper[4743]: I1122 08:39:49.380923 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" podStartSLOduration=2.829210942 podStartE2EDuration="1m1.380904225s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.099648887 +0000 UTC m=+1003.806009929" lastFinishedPulling="2025-11-22 08:39:48.65134216 +0000 UTC m=+1062.357703212" observedRunningTime="2025-11-22 08:39:49.380711579 +0000 UTC m=+1063.087072631" watchObservedRunningTime="2025-11-22 08:39:49.380904225 +0000 UTC m=+1063.087265277" Nov 22 08:39:49 crc kubenswrapper[4743]: I1122 08:39:49.394904 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" podStartSLOduration=2.885077508 podStartE2EDuration="1m1.394881999s" podCreationTimestamp="2025-11-22 08:38:48 +0000 UTC" firstStartedPulling="2025-11-22 08:38:50.140058736 +0000 UTC m=+1003.846419788" lastFinishedPulling="2025-11-22 08:39:48.649863207 +0000 UTC m=+1062.356224279" observedRunningTime="2025-11-22 08:39:49.394530159 +0000 UTC m=+1063.100891221" watchObservedRunningTime="2025-11-22 08:39:49.394881999 +0000 UTC m=+1063.101243061" Nov 22 08:39:58 crc kubenswrapper[4743]: I1122 08:39:58.734521 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-58f887965d-579sf" Nov 22 08:39:58 crc kubenswrapper[4743]: I1122 08:39:58.741075 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-54b5986bb8-vt47z" Nov 22 08:40:00 crc kubenswrapper[4743]: I1122 08:40:00.298543 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.680201 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z94wk"] Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.682287 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.685177 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.685198 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.685366 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-tt8v2" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.686328 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.692562 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z94wk"] Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.737346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg4z7\" (UniqueName: \"kubernetes.io/projected/aa3db083-f5c7-486b-8fa5-8975e8de5685-kube-api-access-cg4z7\") pod \"dnsmasq-dns-675f4bcbfc-z94wk\" (UID: \"aa3db083-f5c7-486b-8fa5-8975e8de5685\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.737682 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa3db083-f5c7-486b-8fa5-8975e8de5685-config\") pod \"dnsmasq-dns-675f4bcbfc-z94wk\" (UID: \"aa3db083-f5c7-486b-8fa5-8975e8de5685\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.758683 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-td7tv"] Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.760091 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.762086 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.777031 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-td7tv"] Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.838693 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg4z7\" (UniqueName: \"kubernetes.io/projected/aa3db083-f5c7-486b-8fa5-8975e8de5685-kube-api-access-cg4z7\") pod \"dnsmasq-dns-675f4bcbfc-z94wk\" (UID: \"aa3db083-f5c7-486b-8fa5-8975e8de5685\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.838756 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-config\") pod \"dnsmasq-dns-78dd6ddcc-td7tv\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") " pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.838786 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8vs2\" (UniqueName: \"kubernetes.io/projected/4862ac04-e60f-4a1f-b9a8-746cdb194804-kube-api-access-d8vs2\") pod \"dnsmasq-dns-78dd6ddcc-td7tv\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") " pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.838854 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa3db083-f5c7-486b-8fa5-8975e8de5685-config\") pod \"dnsmasq-dns-675f4bcbfc-z94wk\" (UID: \"aa3db083-f5c7-486b-8fa5-8975e8de5685\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.838895 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-td7tv\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") " pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.840204 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa3db083-f5c7-486b-8fa5-8975e8de5685-config\") pod \"dnsmasq-dns-675f4bcbfc-z94wk\" (UID: \"aa3db083-f5c7-486b-8fa5-8975e8de5685\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.858405 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg4z7\" (UniqueName: \"kubernetes.io/projected/aa3db083-f5c7-486b-8fa5-8975e8de5685-kube-api-access-cg4z7\") pod \"dnsmasq-dns-675f4bcbfc-z94wk\" (UID: \"aa3db083-f5c7-486b-8fa5-8975e8de5685\") " pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.943272 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-td7tv\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") " pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 
08:40:15.943766 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-config\") pod \"dnsmasq-dns-78dd6ddcc-td7tv\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") " pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.943820 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8vs2\" (UniqueName: \"kubernetes.io/projected/4862ac04-e60f-4a1f-b9a8-746cdb194804-kube-api-access-d8vs2\") pod \"dnsmasq-dns-78dd6ddcc-td7tv\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") " pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.945254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-td7tv\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") " pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.945985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-config\") pod \"dnsmasq-dns-78dd6ddcc-td7tv\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") " pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:15 crc kubenswrapper[4743]: I1122 08:40:15.976091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8vs2\" (UniqueName: \"kubernetes.io/projected/4862ac04-e60f-4a1f-b9a8-746cdb194804-kube-api-access-d8vs2\") pod \"dnsmasq-dns-78dd6ddcc-td7tv\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") " pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:16 crc kubenswrapper[4743]: I1122 08:40:16.001681 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" Nov 22 08:40:16 crc kubenswrapper[4743]: I1122 08:40:16.075022 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" Nov 22 08:40:16 crc kubenswrapper[4743]: I1122 08:40:16.324202 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-td7tv"] Nov 22 08:40:16 crc kubenswrapper[4743]: I1122 08:40:16.334823 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 08:40:16 crc kubenswrapper[4743]: I1122 08:40:16.438717 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z94wk"] Nov 22 08:40:16 crc kubenswrapper[4743]: W1122 08:40:16.444352 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa3db083_f5c7_486b_8fa5_8975e8de5685.slice/crio-feb2852d4acafb410e7c76ace2583a8bc5ab9ef733932a39529fc1e6d4a20d98 WatchSource:0}: Error finding container feb2852d4acafb410e7c76ace2583a8bc5ab9ef733932a39529fc1e6d4a20d98: Status 404 returned error can't find the container with id feb2852d4acafb410e7c76ace2583a8bc5ab9ef733932a39529fc1e6d4a20d98 Nov 22 08:40:16 crc kubenswrapper[4743]: I1122 08:40:16.552953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" event={"ID":"4862ac04-e60f-4a1f-b9a8-746cdb194804","Type":"ContainerStarted","Data":"b513a8b623cf6019a359bb0a88333d8a89ccd258b395873b109a3c8faebc206c"} Nov 22 08:40:16 crc kubenswrapper[4743]: I1122 08:40:16.554861 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" event={"ID":"aa3db083-f5c7-486b-8fa5-8975e8de5685","Type":"ContainerStarted","Data":"feb2852d4acafb410e7c76ace2583a8bc5ab9ef733932a39529fc1e6d4a20d98"} Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.477561 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z94wk"] Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.503521 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p79k8"] Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.504794 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.533770 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p79k8"] Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.693341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-config\") pod \"dnsmasq-dns-666b6646f7-p79k8\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.693389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-dns-svc\") pod \"dnsmasq-dns-666b6646f7-p79k8\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.693456 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhggx\" (UniqueName: \"kubernetes.io/projected/cb4114e2-0dc1-490a-8582-5234733b64ab-kube-api-access-lhggx\") pod \"dnsmasq-dns-666b6646f7-p79k8\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.784428 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-td7tv"] Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.794267 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-config\") pod \"dnsmasq-dns-666b6646f7-p79k8\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.794315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-dns-svc\") pod \"dnsmasq-dns-666b6646f7-p79k8\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.794405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhggx\" (UniqueName: \"kubernetes.io/projected/cb4114e2-0dc1-490a-8582-5234733b64ab-kube-api-access-lhggx\") pod \"dnsmasq-dns-666b6646f7-p79k8\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.795631 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-dns-svc\") pod \"dnsmasq-dns-666b6646f7-p79k8\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.795739 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-config\") pod \"dnsmasq-dns-666b6646f7-p79k8\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.815130 
4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dsh6z"] Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.816378 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.844369 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dsh6z"] Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.848126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhggx\" (UniqueName: \"kubernetes.io/projected/cb4114e2-0dc1-490a-8582-5234733b64ab-kube-api-access-lhggx\") pod \"dnsmasq-dns-666b6646f7-p79k8\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.900995 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49ljx\" (UniqueName: \"kubernetes.io/projected/03f48b82-41c0-4673-bcae-aef0130d447a-kube-api-access-49ljx\") pod \"dnsmasq-dns-57d769cc4f-dsh6z\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.901470 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dsh6z\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:18 crc kubenswrapper[4743]: I1122 08:40:18.901639 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-config\") pod \"dnsmasq-dns-57d769cc4f-dsh6z\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.003446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49ljx\" (UniqueName: \"kubernetes.io/projected/03f48b82-41c0-4673-bcae-aef0130d447a-kube-api-access-49ljx\") pod \"dnsmasq-dns-57d769cc4f-dsh6z\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.003510 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dsh6z\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.003548 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-config\") pod \"dnsmasq-dns-57d769cc4f-dsh6z\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.004499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-config\") pod \"dnsmasq-dns-57d769cc4f-dsh6z\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.004719 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-dsh6z\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.021586 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49ljx\" (UniqueName: \"kubernetes.io/projected/03f48b82-41c0-4673-bcae-aef0130d447a-kube-api-access-49ljx\") pod \"dnsmasq-dns-57d769cc4f-dsh6z\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.146088 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.189139 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.632153 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p79k8"] Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.648745 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.650455 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.654570 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.654592 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.655066 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.656980 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.657088 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.657553 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.658416 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-hrnbn" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.663648 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.717935 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dsh6z"] Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " 
pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822054 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822070 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822090 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5fac46a-545d-4f30-a7ab-8f5e713e934d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822112 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822286 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822325 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5fac46a-545d-4f30-a7ab-8f5e713e934d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822354 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822382 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.822446 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrgs\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-kube-api-access-hnrgs\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.923756 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.923819 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.923838 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnrgs\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-kube-api-access-hnrgs\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.923894 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.923910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.923923 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.923940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5fac46a-545d-4f30-a7ab-8f5e713e934d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.923960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.923995 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: 
\"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.924009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5fac46a-545d-4f30-a7ab-8f5e713e934d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.924026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.924782 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.924933 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.926457 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.926721 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.927341 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.929080 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.932768 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.937389 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/e5fac46a-545d-4f30-a7ab-8f5e713e934d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.955021 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnrgs\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-kube-api-access-hnrgs\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.963955 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.974542 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.976314 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.978054 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.981430 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.981664 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.981833 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.981948 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6hwf2" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.982057 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.982175 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.982511 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5fac46a-545d-4f30-a7ab-8f5e713e934d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " pod="openstack/rabbitmq-server-0" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.982655 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 22 08:40:19 crc kubenswrapper[4743]: I1122 08:40:19.983157 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.002252 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.127975 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.128040 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.128291 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.128743 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh9pp\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-kube-api-access-bh9pp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.129015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.129214 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.129534 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.129741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.130194 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.130394 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.130458 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235536 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235566 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235621 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh9pp\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-kube-api-access-bh9pp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235643 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235716 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235765 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235799 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235821 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.235844 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.236883 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.238237 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.239738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.240023 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.240282 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.240740 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.240752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.241330 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.244147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.245118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.256250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh9pp\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-kube-api-access-bh9pp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.282793 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:20 crc kubenswrapper[4743]: I1122 08:40:20.348994 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.212772 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.214384 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.219525 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.219769 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-8fkdw" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.219952 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.220105 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.224294 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.226281 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.351577 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.351666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.351703 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.351833 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-default\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.351949 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.352095 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-kolla-config\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.352408 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.352553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxhb4\" (UniqueName: \"kubernetes.io/projected/d3d63130-217d-400e-afc5-6b6bb3d56658-kube-api-access-xxhb4\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.454259 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.454320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-kolla-config\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.454367 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.454392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxhb4\" (UniqueName: \"kubernetes.io/projected/d3d63130-217d-400e-afc5-6b6bb3d56658-kube-api-access-xxhb4\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.454417 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.454440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.454488 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.454520 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-default\") pod 
\"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.455377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-default\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.456515 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-kolla-config\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.457712 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.458226 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.458411 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.460100 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.461202 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.477986 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxhb4\" (UniqueName: \"kubernetes.io/projected/d3d63130-217d-400e-afc5-6b6bb3d56658-kube-api-access-xxhb4\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.478082 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " pod="openstack/openstack-galera-0" Nov 22 08:40:21 crc kubenswrapper[4743]: I1122 08:40:21.546658 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.649394 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.651080 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.653278 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.653483 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.654046 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-pr4n6" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.654176 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.668874 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.772456 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.772515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.772550 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.772654 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.772687 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.772741 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnwx4\" (UniqueName: \"kubernetes.io/projected/29734ea4-591c-478e-8030-55fcbac72d3a-kube-api-access-mnwx4\") 
pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.772768 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.772813 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.874075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.874135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.874176 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.874211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.874258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.874276 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.874523 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnwx4\" (UniqueName: \"kubernetes.io/projected/29734ea4-591c-478e-8030-55fcbac72d3a-kube-api-access-mnwx4\") pod \"openstack-cell1-galera-0\" (UID: 
\"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.874568 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.874620 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.874745 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.875557 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.875629 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.876070 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.879738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.879877 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.891903 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnwx4\" (UniqueName: \"kubernetes.io/projected/29734ea4-591c-478e-8030-55fcbac72d3a-kube-api-access-mnwx4\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc 
kubenswrapper[4743]: I1122 08:40:22.894963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-cell1-galera-0\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:22 crc kubenswrapper[4743]: I1122 08:40:22.971979 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.024702 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.025678 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.031689 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.031772 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.032889 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-h9p5k" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.049991 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.179319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-combined-ca-bundle\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.179613 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nwnx\" (UniqueName: \"kubernetes.io/projected/861e40f8-c596-40a1-b192-2fa51f567b55-kube-api-access-7nwnx\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.179679 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-memcached-tls-certs\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.179752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-config-data\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.179875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-kolla-config\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.281199 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-config-data\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.281270 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-kolla-config\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.281314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-combined-ca-bundle\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.281365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nwnx\" (UniqueName: \"kubernetes.io/projected/861e40f8-c596-40a1-b192-2fa51f567b55-kube-api-access-7nwnx\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.281386 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-memcached-tls-certs\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.282751 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-config-data\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.283042 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-kolla-config\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.290226 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-memcached-tls-certs\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.293130 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-combined-ca-bundle\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.300041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nwnx\" (UniqueName: \"kubernetes.io/projected/861e40f8-c596-40a1-b192-2fa51f567b55-kube-api-access-7nwnx\") pod \"memcached-0\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " pod="openstack/memcached-0" Nov 22 08:40:23 crc kubenswrapper[4743]: I1122 08:40:23.353485 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 22 08:40:24 crc kubenswrapper[4743]: I1122 08:40:24.634245 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" event={"ID":"cb4114e2-0dc1-490a-8582-5234733b64ab","Type":"ContainerStarted","Data":"94521ccef2558458b0e042b5b4cfaa8253ab1da6072c147e09fdeb352bc51122"} Nov 22 08:40:24 crc kubenswrapper[4743]: I1122 08:40:24.644330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" event={"ID":"03f48b82-41c0-4673-bcae-aef0130d447a","Type":"ContainerStarted","Data":"a345895a6ff0581070eb7e20eb5655dfb9a26f1296023b2677e99e37057199fd"} Nov 22 08:40:25 crc kubenswrapper[4743]: I1122 08:40:25.016776 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 08:40:25 crc kubenswrapper[4743]: I1122 08:40:25.017918 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 08:40:25 crc kubenswrapper[4743]: I1122 08:40:25.019554 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-cvbxd" Nov 22 08:40:25 crc kubenswrapper[4743]: I1122 08:40:25.032520 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 08:40:25 crc kubenswrapper[4743]: I1122 08:40:25.109255 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75v6v\" (UniqueName: \"kubernetes.io/projected/1bc16799-92e0-45f0-a46d-770ef95eefa6-kube-api-access-75v6v\") pod \"kube-state-metrics-0\" (UID: \"1bc16799-92e0-45f0-a46d-770ef95eefa6\") " pod="openstack/kube-state-metrics-0" Nov 22 08:40:25 crc kubenswrapper[4743]: I1122 08:40:25.211621 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75v6v\" (UniqueName: \"kubernetes.io/projected/1bc16799-92e0-45f0-a46d-770ef95eefa6-kube-api-access-75v6v\") pod \"kube-state-metrics-0\" (UID: \"1bc16799-92e0-45f0-a46d-770ef95eefa6\") " pod="openstack/kube-state-metrics-0" Nov 22 08:40:25 crc kubenswrapper[4743]: I1122 08:40:25.233843 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75v6v\" (UniqueName: \"kubernetes.io/projected/1bc16799-92e0-45f0-a46d-770ef95eefa6-kube-api-access-75v6v\") pod \"kube-state-metrics-0\" (UID: \"1bc16799-92e0-45f0-a46d-770ef95eefa6\") " pod="openstack/kube-state-metrics-0" Nov 22 08:40:25 crc kubenswrapper[4743]: I1122 08:40:25.334078 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.777958 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6t9hh"] Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.779172 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.780744 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hnx2s" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.781442 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.781748 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.800916 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6t9hh"] Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.806043 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-mz9kc"] Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.807736 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.836307 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mz9kc"] Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.854982 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd754\" (UniqueName: \"kubernetes.io/projected/5db10427-8546-4dea-b849-36bb02c837bd-kube-api-access-hd754\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.855113 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.855169 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-log-ovn\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.855194 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5db10427-8546-4dea-b849-36bb02c837bd-scripts\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.855211 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run-ovn\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.855246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-ovn-controller-tls-certs\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " 
pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.855260 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-combined-ca-bundle\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-etc-ovs\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957134 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd754\" (UniqueName: \"kubernetes.io/projected/5db10427-8546-4dea-b849-36bb02c837bd-kube-api-access-hd754\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957159 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8flq\" (UniqueName: \"kubernetes.io/projected/03685c6a-5ae9-45cf-b66d-5210d4811bda-kube-api-access-c8flq\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957236 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-run\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957298 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03685c6a-5ae9-45cf-b66d-5210d4811bda-scripts\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957332 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-log-ovn\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957362 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-log\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 
08:40:27.957383 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-lib\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957403 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5db10427-8546-4dea-b849-36bb02c837bd-scripts\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957424 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run-ovn\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-ovn-controller-tls-certs\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.957493 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-combined-ca-bundle\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.959037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.959123 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run-ovn\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.959221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-log-ovn\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.961965 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5db10427-8546-4dea-b849-36bb02c837bd-scripts\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.964119 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-combined-ca-bundle\") pod \"ovn-controller-6t9hh\" (UID: 
\"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.964445 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-ovn-controller-tls-certs\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:27 crc kubenswrapper[4743]: I1122 08:40:27.976915 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd754\" (UniqueName: \"kubernetes.io/projected/5db10427-8546-4dea-b849-36bb02c837bd-kube-api-access-hd754\") pod \"ovn-controller-6t9hh\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.058743 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-run\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.058853 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03685c6a-5ae9-45cf-b66d-5210d4811bda-scripts\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.058893 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-log\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.058955 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-run\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.058969 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-lib\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.059426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-log\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.059446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-etc-ovs\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.059527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c8flq\" (UniqueName: \"kubernetes.io/projected/03685c6a-5ae9-45cf-b66d-5210d4811bda-kube-api-access-c8flq\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.059738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-etc-ovs\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.060252 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-lib\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.061680 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03685c6a-5ae9-45cf-b66d-5210d4811bda-scripts\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.084626 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8flq\" (UniqueName: \"kubernetes.io/projected/03685c6a-5ae9-45cf-b66d-5210d4811bda-kube-api-access-c8flq\") pod \"ovn-controller-ovs-mz9kc\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.098127 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:28 crc kubenswrapper[4743]: I1122 08:40:28.124383 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:29 crc kubenswrapper[4743]: I1122 08:40:29.715867 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 08:40:29 crc kubenswrapper[4743]: I1122 08:40:29.793779 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.174589 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.176965 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.180045 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.181449 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8gqfp" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.183116 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.183134 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.184974 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.186059 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.304469 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.304517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.304562 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.304729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnlvb\" (UniqueName: \"kubernetes.io/projected/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-kube-api-access-wnlvb\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.305017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-config\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.305231 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.305305 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.305331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.407178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-config\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.407249 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.407312 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.407339 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.407363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.407385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.407416 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.407453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnlvb\" (UniqueName: \"kubernetes.io/projected/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-kube-api-access-wnlvb\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 
08:40:30.407952 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.408274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.420708 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.423667 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.424348 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-config\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.426472 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.427960 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnlvb\" (UniqueName: \"kubernetes.io/projected/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-kube-api-access-wnlvb\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.435183 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.442003 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:30 crc kubenswrapper[4743]: I1122 08:40:30.506086 4743 util.go:30] "No sandbox for pod can be found. 
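
The block above is the kubelet volume reconciler working through every volume of ovsdbserver-nb-0 in three phases: VerifyControllerAttachedVolume started, operationExecutor.MountVolume started, and MountVolume.SetUp succeeded (the local PV additionally logs MountVolume.MountDevice). A sketch that groups these events per volume and flags any volume missing a phase; it assumes Python 3.8+ and a hypothetical capture file kubelet.log containing entries like the ones above:

    import re
    from collections import defaultdict

    # Track each volume's reconciler phases for one pod, e.g. ovsdbserver-nb-0.
    PHASES = {
        'VerifyControllerAttachedVolume started': 'attached',
        'operationExecutor.MountVolume started': 'mount-started',
        'MountVolume.SetUp succeeded': 'setup-ok',
    }
    VOL = re.compile(r'volume \\?"(?P<name>[\w-]+)\\?"')

    seen = defaultdict(set)
    with open('kubelet.log') as fh:          # hypothetical path
        for line in fh:
            if 'pod="openstack/ovsdbserver-nb-0"' not in line:
                continue
            for marker, phase in PHASES.items():
                if marker in line and (m := VOL.search(line)):
                    seen[m.group('name')].add(phase)

    for vol, phases in sorted(seen.items()):
        missing = {'attached', 'mount-started', 'setup-ok'} - phases
        print(vol, 'OK' if not missing else f'missing {missing}')

On the log above, all eight volumes of ovsdbserver-nb-0 should report OK.
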
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.382731 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.388017 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.393546 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.394643 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.395770 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-4x66b" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.397008 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.404146 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.542801 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-config\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.542899 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.542923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.542958 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.543009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.543117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 
08:40:32.543159 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.543187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch8hg\" (UniqueName: \"kubernetes.io/projected/3a18c86e-9d86-49ee-918f-76de17000e18-kube-api-access-ch8hg\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.655940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-config\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.656040 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.656072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.656109 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.656143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.656168 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.656207 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.656235 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch8hg\" (UniqueName: \"kubernetes.io/projected/3a18c86e-9d86-49ee-918f-76de17000e18-kube-api-access-ch8hg\") 
pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.657698 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.658139 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.659131 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.659203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-config\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.662768 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.672415 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.687222 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.689280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.691429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch8hg\" (UniqueName: \"kubernetes.io/projected/3a18c86e-9d86-49ee-918f-76de17000e18-kube-api-access-ch8hg\") pod \"ovsdbserver-sb-0\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:32 crc kubenswrapper[4743]: I1122 08:40:32.731470 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:34 crc kubenswrapper[4743]: W1122 08:40:34.889404 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff6b12a5_cc5e_4ff8_b4eb_ab76fb4d36c1.slice/crio-b507b4d34cd4f86c446e4edafb6b74db493c1dbcc29dd36d4787d8b073d954b7 WatchSource:0}: Error finding container b507b4d34cd4f86c446e4edafb6b74db493c1dbcc29dd36d4787d8b073d954b7: Status 404 returned error can't find the container with id b507b4d34cd4f86c446e4edafb6b74db493c1dbcc29dd36d4787d8b073d954b7 Nov 22 08:40:34 crc kubenswrapper[4743]: W1122 08:40:34.890292 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5fac46a_545d_4f30_a7ab_8f5e713e934d.slice/crio-76ce132c90151d9f020a53331ee30677627ac885b042cd6fe138821b149b063b WatchSource:0}: Error finding container 76ce132c90151d9f020a53331ee30677627ac885b042cd6fe138821b149b063b: Status 404 returned error can't find the container with id 76ce132c90151d9f020a53331ee30677627ac885b042cd6fe138821b149b063b Nov 22 08:40:35 crc kubenswrapper[4743]: I1122 08:40:35.324886 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 08:40:35 crc kubenswrapper[4743]: E1122 08:40:35.687986 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 22 08:40:35 crc kubenswrapper[4743]: E1122 08:40:35.688176 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d8vs2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-td7tv_openstack(4862ac04-e60f-4a1f-b9a8-746cdb194804): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:40:35 crc kubenswrapper[4743]: E1122 08:40:35.689550 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" podUID="4862ac04-e60f-4a1f-b9a8-746cdb194804" Nov 22 08:40:35 crc kubenswrapper[4743]: I1122 08:40:35.744823 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d3d63130-217d-400e-afc5-6b6bb3d56658","Type":"ContainerStarted","Data":"c6eafdf2e1a185ee3be73f8fb0387c62cc373faf5a53ee47f0ce1c525ad711e3"} Nov 22 08:40:35 crc kubenswrapper[4743]: I1122 08:40:35.746473 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1","Type":"ContainerStarted","Data":"b507b4d34cd4f86c446e4edafb6b74db493c1dbcc29dd36d4787d8b073d954b7"} Nov 22 08:40:35 crc kubenswrapper[4743]: E1122 08:40:35.746562 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Nov 22 08:40:35 crc kubenswrapper[4743]: E1122 08:40:35.746727 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cg4z7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-z94wk_openstack(aa3db083-f5c7-486b-8fa5-8975e8de5685): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:40:35 crc kubenswrapper[4743]: E1122 08:40:35.748214 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" podUID="aa3db083-f5c7-486b-8fa5-8975e8de5685" Nov 22 08:40:35 crc kubenswrapper[4743]: I1122 08:40:35.751269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5fac46a-545d-4f30-a7ab-8f5e713e934d","Type":"ContainerStarted","Data":"76ce132c90151d9f020a53331ee30677627ac885b042cd6fe138821b149b063b"} Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.355366 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.366145 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.372155 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.457805 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6t9hh"] Nov 22 08:40:36 crc kubenswrapper[4743]: W1122 08:40:36.458950 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bc16799_92e0_45f0_a46d_770ef95eefa6.slice/crio-94d25dcea6f7143c72e0e8b063dead3d2fb7b47423c5b2687150391f6f7f6f97 WatchSource:0}: Error finding container 94d25dcea6f7143c72e0e8b063dead3d2fb7b47423c5b2687150391f6f7f6f97: 
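
Both "copying config: context canceled" pull failures above hit dnsmasq pods that are deleted moments later (SyncLoop DELETE for dnsmasq-dns-78dd6ddcc-td7tv at 08:40:36.930245 and for dnsmasq-dns-675f4bcbfc-z94wk at 08:40:37.886772), so the pulls were most likely aborted by pod deletion rather than by a registry problem. A sketch that correlates ErrImagePull entries with subsequent deletions of the same pod (Python 3.8+; kubelet.log is a hypothetical path):

    import re

    pull_fail = {}   # pod -> time of first ErrImagePull
    deleted = {}     # pod -> time of SyncLoop DELETE
    TS = re.compile(r'[IWEF]\d{4} (\d{2}:\d{2}:\d{2}\.\d{6})')

    with open('kubelet.log') as fh:      # hypothetical path
        for line in fh:
            t = TS.search(line)
            if not t:
                continue
            if 'ErrImagePull' in line and (m := re.search(r'pod="([^"]+)"', line)):
                pull_fail.setdefault(m.group(1), t.group(1))
            if '"SyncLoop DELETE"' in line and (m := re.search(r'pods=\["([^"]+)"\]', line)):
                deleted[m.group(1)] = t.group(1)

    for pod, when in pull_fail.items():
        print(pod, 'pull failed', when, '-> deleted', deleted.get(pod, 'never'))
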
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.467345 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv"
Nov 22 08:40:36 crc kubenswrapper[4743]: W1122 08:40:36.480780 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5db10427_8546_4dea_b849_36bb02c837bd.slice/crio-a5a07d940c5e01f9e847510d625c10e0ef683fe123f8c83ce7269a9f8c1f6185 WatchSource:0}: Error finding container a5a07d940c5e01f9e847510d625c10e0ef683fe123f8c83ce7269a9f8c1f6185: Status 404 returned error can't find the container with id a5a07d940c5e01f9e847510d625c10e0ef683fe123f8c83ce7269a9f8c1f6185
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.559965 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Nov 22 08:40:36 crc kubenswrapper[4743]: W1122 08:40:36.563747 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a18c86e_9d86_49ee_918f_76de17000e18.slice/crio-f40b416d068b4981562f93627cae411148e6542e542f0db18ab24e7e66969d03 WatchSource:0}: Error finding container f40b416d068b4981562f93627cae411148e6542e542f0db18ab24e7e66969d03: Status 404 returned error can't find the container with id f40b416d068b4981562f93627cae411148e6542e542f0db18ab24e7e66969d03
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.639716 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-dns-svc\") pod \"4862ac04-e60f-4a1f-b9a8-746cdb194804\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") "
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.639828 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8vs2\" (UniqueName: \"kubernetes.io/projected/4862ac04-e60f-4a1f-b9a8-746cdb194804-kube-api-access-d8vs2\") pod \"4862ac04-e60f-4a1f-b9a8-746cdb194804\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") "
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.639884 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-config\") pod \"4862ac04-e60f-4a1f-b9a8-746cdb194804\" (UID: \"4862ac04-e60f-4a1f-b9a8-746cdb194804\") "
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.640803 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-config" (OuterVolumeSpecName: "config") pod "4862ac04-e60f-4a1f-b9a8-746cdb194804" (UID: "4862ac04-e60f-4a1f-b9a8-746cdb194804"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.642737 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4862ac04-e60f-4a1f-b9a8-746cdb194804" (UID: "4862ac04-e60f-4a1f-b9a8-746cdb194804"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.646753 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4862ac04-e60f-4a1f-b9a8-746cdb194804-kube-api-access-d8vs2" (OuterVolumeSpecName: "kube-api-access-d8vs2") pod "4862ac04-e60f-4a1f-b9a8-746cdb194804" (UID: "4862ac04-e60f-4a1f-b9a8-746cdb194804"). InnerVolumeSpecName "kube-api-access-d8vs2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:40:36 crc kubenswrapper[4743]: W1122 08:40:36.670040 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03685c6a_5ae9_45cf_b66d_5210d4811bda.slice/crio-e52c0670bd78e3fedbdaf421e24fb03c508396f93d62fc1f36c75c01b05f5630 WatchSource:0}: Error finding container e52c0670bd78e3fedbdaf421e24fb03c508396f93d62fc1f36c75c01b05f5630: Status 404 returned error can't find the container with id e52c0670bd78e3fedbdaf421e24fb03c508396f93d62fc1f36c75c01b05f5630
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.671240 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-mz9kc"]
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.741932 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.742030 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8vs2\" (UniqueName: \"kubernetes.io/projected/4862ac04-e60f-4a1f-b9a8-746cdb194804-kube-api-access-d8vs2\") on node \"crc\" DevicePath \"\""
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.742102 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4862ac04-e60f-4a1f-b9a8-746cdb194804-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.759861 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1bc16799-92e0-45f0-a46d-770ef95eefa6","Type":"ContainerStarted","Data":"94d25dcea6f7143c72e0e8b063dead3d2fb7b47423c5b2687150391f6f7f6f97"}
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.760990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"861e40f8-c596-40a1-b192-2fa51f567b55","Type":"ContainerStarted","Data":"21a3b5351f89ec759c17eadcb40c8a78728afbe6149c5ce1492d256037e3e42a"}
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.762339 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh" event={"ID":"5db10427-8546-4dea-b849-36bb02c837bd","Type":"ContainerStarted","Data":"a5a07d940c5e01f9e847510d625c10e0ef683fe123f8c83ce7269a9f8c1f6185"}
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.763565 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"29734ea4-591c-478e-8030-55fcbac72d3a","Type":"ContainerStarted","Data":"d608b4c2033083afc4449c6a3e9fb408094fd3a4c09eb692f1c19645b680c841"}
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.779733 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3a18c86e-9d86-49ee-918f-76de17000e18","Type":"ContainerStarted","Data":"f40b416d068b4981562f93627cae411148e6542e542f0db18ab24e7e66969d03"}
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.782061 4743 generic.go:334] "Generic (PLEG): container finished" podID="03f48b82-41c0-4673-bcae-aef0130d447a" containerID="979773088cbbc502b234ec63667fff46b53fc263d2a2a09ebb4e990b2336e837" exitCode=0
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.782139 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" event={"ID":"03f48b82-41c0-4673-bcae-aef0130d447a","Type":"ContainerDied","Data":"979773088cbbc502b234ec63667fff46b53fc263d2a2a09ebb4e990b2336e837"}
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.784296 4743 generic.go:334] "Generic (PLEG): container finished" podID="cb4114e2-0dc1-490a-8582-5234733b64ab" containerID="21ecc6b3d20c284ef43b926c5c463d821ee2edacb2a3220709a25fbf01483ae9" exitCode=0
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.784394 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" event={"ID":"cb4114e2-0dc1-490a-8582-5234733b64ab","Type":"ContainerDied","Data":"21ecc6b3d20c284ef43b926c5c463d821ee2edacb2a3220709a25fbf01483ae9"}
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.787277 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv"
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.787363 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-td7tv" event={"ID":"4862ac04-e60f-4a1f-b9a8-746cdb194804","Type":"ContainerDied","Data":"b513a8b623cf6019a359bb0a88333d8a89ccd258b395873b109a3c8faebc206c"}
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.790565 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mz9kc" event={"ID":"03685c6a-5ae9-45cf-b66d-5210d4811bda","Type":"ContainerStarted","Data":"e52c0670bd78e3fedbdaf421e24fb03c508396f93d62fc1f36c75c01b05f5630"}
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.930245 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-td7tv"]
Nov 22 08:40:36 crc kubenswrapper[4743]: I1122 08:40:36.942511 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-td7tv"]
Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.177514 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4862ac04-e60f-4a1f-b9a8-746cdb194804" path="/var/lib/kubelet/pods/4862ac04-e60f-4a1f-b9a8-746cdb194804/volumes"
Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.300286 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk"
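
The teardown of dnsmasq-dns-78dd6ddcc-td7tv (UID 4862ac04-...) above follows a fixed order: UnmountVolume started, UnmountVolume.TearDown succeeded, "Volume detached", SyncLoop DELETE, SyncLoop REMOVE, and finally "Cleaned up orphaned pod volumes dir". A sketch that reconstructs that timeline for this one pod from a capture (Python 3.8+; kubelet.log is a hypothetical path):

    import re

    UID = '4862ac04-e60f-4a1f-b9a8-746cdb194804'
    MARKERS = ['UnmountVolume started', 'UnmountVolume.TearDown succeeded',
               'Volume detached', 'SyncLoop DELETE', 'SyncLoop REMOVE',
               'Cleaned up orphaned pod volumes dir']
    TS = re.compile(r'[IWEF]\d{4} (\d{2}:\d{2}:\d{2}\.\d{6})')

    timeline = []
    with open('kubelet.log') as fh:            # hypothetical path
        for line in fh:
            if UID not in line and 'dnsmasq-dns-78dd6ddcc-td7tv' not in line:
                continue
            for marker in MARKERS:
                if marker in line and (t := TS.search(line)):
                    timeline.append((t.group(1), marker))

    for when, what in sorted(timeline):
        print(when, what)
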
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.433374 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.456199 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg4z7\" (UniqueName: \"kubernetes.io/projected/aa3db083-f5c7-486b-8fa5-8975e8de5685-kube-api-access-cg4z7\") pod \"aa3db083-f5c7-486b-8fa5-8975e8de5685\" (UID: \"aa3db083-f5c7-486b-8fa5-8975e8de5685\") " Nov 22 08:40:37 crc kubenswrapper[4743]: W1122 08:40:37.456201 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f765fcd_87e8_4d2d_a82d_6ba04aa8a00d.slice/crio-f64f05e22064a45671cb5ffbaa86c35a4105a1bd1e162839e31041dae91c1cd8 WatchSource:0}: Error finding container f64f05e22064a45671cb5ffbaa86c35a4105a1bd1e162839e31041dae91c1cd8: Status 404 returned error can't find the container with id f64f05e22064a45671cb5ffbaa86c35a4105a1bd1e162839e31041dae91c1cd8 Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.456362 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa3db083-f5c7-486b-8fa5-8975e8de5685-config\") pod \"aa3db083-f5c7-486b-8fa5-8975e8de5685\" (UID: \"aa3db083-f5c7-486b-8fa5-8975e8de5685\") " Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.457282 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa3db083-f5c7-486b-8fa5-8975e8de5685-config" (OuterVolumeSpecName: "config") pod "aa3db083-f5c7-486b-8fa5-8975e8de5685" (UID: "aa3db083-f5c7-486b-8fa5-8975e8de5685"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.463188 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa3db083-f5c7-486b-8fa5-8975e8de5685-kube-api-access-cg4z7" (OuterVolumeSpecName: "kube-api-access-cg4z7") pod "aa3db083-f5c7-486b-8fa5-8975e8de5685" (UID: "aa3db083-f5c7-486b-8fa5-8975e8de5685"). InnerVolumeSpecName "kube-api-access-cg4z7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.558078 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa3db083-f5c7-486b-8fa5-8975e8de5685-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.558109 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg4z7\" (UniqueName: \"kubernetes.io/projected/aa3db083-f5c7-486b-8fa5-8975e8de5685-kube-api-access-cg4z7\") on node \"crc\" DevicePath \"\"" Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.802706 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d","Type":"ContainerStarted","Data":"f64f05e22064a45671cb5ffbaa86c35a4105a1bd1e162839e31041dae91c1cd8"} Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.806158 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" event={"ID":"03f48b82-41c0-4673-bcae-aef0130d447a","Type":"ContainerStarted","Data":"5dd3ac541733ba6c4e04ad719e2c73941bdf90aa7977960df0d90c3de86b1b11"} Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.806232 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.809177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" event={"ID":"cb4114e2-0dc1-490a-8582-5234733b64ab","Type":"ContainerStarted","Data":"1492488c837aa8f04f7299b1d413786a94dd865f945e841bffe9f473e3ac4bf5"} Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.809622 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.816934 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.816959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-z94wk" event={"ID":"aa3db083-f5c7-486b-8fa5-8975e8de5685","Type":"ContainerDied","Data":"feb2852d4acafb410e7c76ace2583a8bc5ab9ef733932a39529fc1e6d4a20d98"} Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.831531 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" podStartSLOduration=7.657412682 podStartE2EDuration="19.831507824s" podCreationTimestamp="2025-11-22 08:40:18 +0000 UTC" firstStartedPulling="2025-11-22 08:40:23.632654074 +0000 UTC m=+1097.339015126" lastFinishedPulling="2025-11-22 08:40:35.806749216 +0000 UTC m=+1109.513110268" observedRunningTime="2025-11-22 08:40:37.82236013 +0000 UTC m=+1111.528721192" watchObservedRunningTime="2025-11-22 08:40:37.831507824 +0000 UTC m=+1111.537868896" Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.840741 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" podStartSLOduration=7.669953294 podStartE2EDuration="19.84072227s" podCreationTimestamp="2025-11-22 08:40:18 +0000 UTC" firstStartedPulling="2025-11-22 08:40:23.632664474 +0000 UTC m=+1097.339025526" lastFinishedPulling="2025-11-22 08:40:35.80343345 +0000 UTC m=+1109.509794502" observedRunningTime="2025-11-22 08:40:37.83863991 +0000 UTC m=+1111.545000982" watchObservedRunningTime="2025-11-22 08:40:37.84072227 +0000 UTC m=+1111.547083322" Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.886772 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z94wk"] Nov 22 08:40:37 crc kubenswrapper[4743]: I1122 08:40:37.892609 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-z94wk"] Nov 22 08:40:39 crc kubenswrapper[4743]: I1122 08:40:39.170094 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa3db083-f5c7-486b-8fa5-8975e8de5685" path="/var/lib/kubelet/pods/aa3db083-f5c7-486b-8fa5-8975e8de5685/volumes" Nov 22 08:40:44 crc kubenswrapper[4743]: I1122 08:40:44.148550 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:44 crc kubenswrapper[4743]: I1122 08:40:44.192141 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:40:44 crc kubenswrapper[4743]: I1122 08:40:44.240832 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p79k8"] Nov 22 08:40:44 crc kubenswrapper[4743]: I1122 08:40:44.882703 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" podUID="cb4114e2-0dc1-490a-8582-5234733b64ab" containerName="dnsmasq-dns" containerID="cri-o://1492488c837aa8f04f7299b1d413786a94dd865f945e841bffe9f473e3ac4bf5" gracePeriod=10 Nov 22 08:40:45 crc kubenswrapper[4743]: I1122 08:40:45.893765 4743 generic.go:334] "Generic (PLEG): container finished" podID="cb4114e2-0dc1-490a-8582-5234733b64ab" containerID="1492488c837aa8f04f7299b1d413786a94dd865f945e841bffe9f473e3ac4bf5" exitCode=0 Nov 22 08:40:45 crc kubenswrapper[4743]: I1122 08:40:45.893811 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" 
event={"ID":"cb4114e2-0dc1-490a-8582-5234733b64ab","Type":"ContainerDied","Data":"1492488c837aa8f04f7299b1d413786a94dd865f945e841bffe9f473e3ac4bf5"} Nov 22 08:40:45 crc kubenswrapper[4743]: I1122 08:40:45.894182 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" event={"ID":"cb4114e2-0dc1-490a-8582-5234733b64ab","Type":"ContainerDied","Data":"94521ccef2558458b0e042b5b4cfaa8253ab1da6072c147e09fdeb352bc51122"} Nov 22 08:40:45 crc kubenswrapper[4743]: I1122 08:40:45.894206 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94521ccef2558458b0e042b5b4cfaa8253ab1da6072c147e09fdeb352bc51122" Nov 22 08:40:45 crc kubenswrapper[4743]: I1122 08:40:45.916626 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.109869 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-dns-svc\") pod \"cb4114e2-0dc1-490a-8582-5234733b64ab\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.110419 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-config\") pod \"cb4114e2-0dc1-490a-8582-5234733b64ab\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.110711 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhggx\" (UniqueName: \"kubernetes.io/projected/cb4114e2-0dc1-490a-8582-5234733b64ab-kube-api-access-lhggx\") pod \"cb4114e2-0dc1-490a-8582-5234733b64ab\" (UID: \"cb4114e2-0dc1-490a-8582-5234733b64ab\") " Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.116947 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4114e2-0dc1-490a-8582-5234733b64ab-kube-api-access-lhggx" (OuterVolumeSpecName: "kube-api-access-lhggx") pod "cb4114e2-0dc1-490a-8582-5234733b64ab" (UID: "cb4114e2-0dc1-490a-8582-5234733b64ab"). InnerVolumeSpecName "kube-api-access-lhggx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.149331 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb4114e2-0dc1-490a-8582-5234733b64ab" (UID: "cb4114e2-0dc1-490a-8582-5234733b64ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.149329 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-config" (OuterVolumeSpecName: "config") pod "cb4114e2-0dc1-490a-8582-5234733b64ab" (UID: "cb4114e2-0dc1-490a-8582-5234733b64ab"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.214325 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhggx\" (UniqueName: \"kubernetes.io/projected/cb4114e2-0dc1-490a-8582-5234733b64ab-kube-api-access-lhggx\") on node \"crc\" DevicePath \"\"" Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.214367 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.214380 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb4114e2-0dc1-490a-8582-5234733b64ab-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.901041 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-p79k8" Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.947334 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p79k8"] Nov 22 08:40:46 crc kubenswrapper[4743]: I1122 08:40:46.958468 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-p79k8"] Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.162367 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4114e2-0dc1-490a-8582-5234733b64ab" path="/var/lib/kubelet/pods/cb4114e2-0dc1-490a-8582-5234733b64ab/volumes" Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.916856 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh" event={"ID":"5db10427-8546-4dea-b849-36bb02c837bd","Type":"ContainerStarted","Data":"6c0db6fc539fa60e13ce74a4129ccda55a6455c6da77fd7c2a15ec935e19f792"} Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.917549 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-6t9hh" Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.927193 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d3d63130-217d-400e-afc5-6b6bb3d56658","Type":"ContainerStarted","Data":"a13a6453f504348b1fc37cbf718993799543c1d0ff16da7e91ea05103e4dfc4b"} Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.931629 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1bc16799-92e0-45f0-a46d-770ef95eefa6","Type":"ContainerStarted","Data":"fde0e2caee1ca419e2f4d1b5ee9c79dd7c36dbd3603ce5038a379f25b387d1f8"} Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.931863 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.937333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5fac46a-545d-4f30-a7ab-8f5e713e934d","Type":"ContainerStarted","Data":"f84c516977fabf4d12664420973d6cd6aad1dffc3d9d6296c1edb5fc3318472d"} Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.940139 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d","Type":"ContainerStarted","Data":"85ca3e549dbcf99a2f9f8ce67ee485a73244e79666976bd6f0f2fe904f8d3d50"} Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.943913 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6t9hh" podStartSLOduration=11.327139155 podStartE2EDuration="20.943893856s" podCreationTimestamp="2025-11-22 08:40:27 +0000 UTC" firstStartedPulling="2025-11-22 08:40:36.484920543 +0000 UTC m=+1110.191281595" lastFinishedPulling="2025-11-22 08:40:46.101675244 +0000 UTC m=+1119.808036296" observedRunningTime="2025-11-22 08:40:47.93850397 +0000 UTC m=+1121.644865022" watchObservedRunningTime="2025-11-22 08:40:47.943893856 +0000 UTC m=+1121.650254908" Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.945769 4743 generic.go:334] "Generic (PLEG): container finished" podID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerID="5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024" exitCode=0 Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.945948 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mz9kc" event={"ID":"03685c6a-5ae9-45cf-b66d-5210d4811bda","Type":"ContainerDied","Data":"5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024"} Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.956216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"861e40f8-c596-40a1-b192-2fa51f567b55","Type":"ContainerStarted","Data":"05a05410e859d0aa129b195d9be97306e21232ea6ee301f84157b08562c6ad1b"} Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.956811 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 22 08:40:47 crc kubenswrapper[4743]: I1122 08:40:47.997203 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"29734ea4-591c-478e-8030-55fcbac72d3a","Type":"ContainerStarted","Data":"07a73aeee6e1a47a17627947bea433502593cbb95367d72965b6e41790f9de15"} Nov 22 08:40:48 crc kubenswrapper[4743]: I1122 08:40:48.008754 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=12.759453985 podStartE2EDuration="23.008728386s" podCreationTimestamp="2025-11-22 08:40:25 +0000 UTC" firstStartedPulling="2025-11-22 08:40:36.465678288 +0000 UTC m=+1110.172039340" lastFinishedPulling="2025-11-22 08:40:46.714952689 +0000 UTC m=+1120.421313741" observedRunningTime="2025-11-22 08:40:48.004518675 +0000 UTC m=+1121.710879727" watchObservedRunningTime="2025-11-22 08:40:48.008728386 +0000 UTC m=+1121.715089438" Nov 22 08:40:48 crc kubenswrapper[4743]: I1122 08:40:48.019643 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3a18c86e-9d86-49ee-918f-76de17000e18","Type":"ContainerStarted","Data":"144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e"} Nov 22 08:40:48 crc kubenswrapper[4743]: I1122 08:40:48.042278 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.894623257 podStartE2EDuration="25.042254973s" podCreationTimestamp="2025-11-22 08:40:23 +0000 UTC" firstStartedPulling="2025-11-22 08:40:36.445232898 +0000 UTC m=+1110.151593950" lastFinishedPulling="2025-11-22 08:40:45.592864614 +0000 UTC m=+1119.299225666" observedRunningTime="2025-11-22 08:40:48.040320858 +0000 UTC m=+1121.746681910" watchObservedRunningTime="2025-11-22 08:40:48.042254973 +0000 UTC m=+1121.748616025" Nov 22 08:40:49 crc kubenswrapper[4743]: I1122 08:40:49.046637 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
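
Worth noting in the stretch above: the grace-period kill of the dnsmasq-dns container at 08:40:44.882703 (gracePeriod=10) is answered by a ContainerDied with exitCode=0 at 08:40:45.893765, so dnsmasq shut down cleanly in about a second, well inside its grace period. A small computation (Python 3; timestamps copied from those two entries):

    from datetime import datetime

    # "Killing container with a grace period" vs. the matching ContainerDied event
    kill = datetime.strptime('08:40:44.882703', '%H:%M:%S.%f')
    died = datetime.strptime('08:40:45.893765', '%H:%M:%S.%f')

    elapsed = (died - kill).total_seconds()
    print(f'{elapsed:.3f}s of a 10s grace period')   # -> 1.011s
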
pod="openstack/ovn-controller-ovs-mz9kc" event={"ID":"03685c6a-5ae9-45cf-b66d-5210d4811bda","Type":"ContainerStarted","Data":"0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df"} Nov 22 08:40:49 crc kubenswrapper[4743]: I1122 08:40:49.047275 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mz9kc" event={"ID":"03685c6a-5ae9-45cf-b66d-5210d4811bda","Type":"ContainerStarted","Data":"6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e"} Nov 22 08:40:49 crc kubenswrapper[4743]: I1122 08:40:49.048505 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:49 crc kubenswrapper[4743]: I1122 08:40:49.048540 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:40:49 crc kubenswrapper[4743]: I1122 08:40:49.055902 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1","Type":"ContainerStarted","Data":"9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27"} Nov 22 08:40:49 crc kubenswrapper[4743]: I1122 08:40:49.073921 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-mz9kc" podStartSLOduration=13.154007532 podStartE2EDuration="22.073894508s" podCreationTimestamp="2025-11-22 08:40:27 +0000 UTC" firstStartedPulling="2025-11-22 08:40:36.673483073 +0000 UTC m=+1110.379844135" lastFinishedPulling="2025-11-22 08:40:45.593370059 +0000 UTC m=+1119.299731111" observedRunningTime="2025-11-22 08:40:49.067488714 +0000 UTC m=+1122.773849766" watchObservedRunningTime="2025-11-22 08:40:49.073894508 +0000 UTC m=+1122.780255560" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.846212 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-7qctt"] Nov 22 08:40:50 crc kubenswrapper[4743]: E1122 08:40:50.850051 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4114e2-0dc1-490a-8582-5234733b64ab" containerName="init" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.850086 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4114e2-0dc1-490a-8582-5234733b64ab" containerName="init" Nov 22 08:40:50 crc kubenswrapper[4743]: E1122 08:40:50.850095 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4114e2-0dc1-490a-8582-5234733b64ab" containerName="dnsmasq-dns" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.850103 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4114e2-0dc1-490a-8582-5234733b64ab" containerName="dnsmasq-dns" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.850281 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4114e2-0dc1-490a-8582-5234733b64ab" containerName="dnsmasq-dns" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.850994 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.854883 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.870168 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7qctt"] Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.893994 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-combined-ca-bundle\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.894070 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q7cl\" (UniqueName: \"kubernetes.io/projected/870c700d-9095-4781-ab16-4cce25d24ed2-kube-api-access-6q7cl\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.894165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovs-rundir\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.894195 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.894245 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870c700d-9095-4781-ab16-4cce25d24ed2-config\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.894279 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovn-rundir\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.990524 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-m6t8v"] Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.992845 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.996112 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q7cl\" (UniqueName: \"kubernetes.io/projected/870c700d-9095-4781-ab16-4cce25d24ed2-kube-api-access-6q7cl\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.996377 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovs-rundir\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.996487 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.996608 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870c700d-9095-4781-ab16-4cce25d24ed2-config\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.996703 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovn-rundir\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.996810 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-combined-ca-bundle\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.996885 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovs-rundir\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.997078 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovn-rundir\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc kubenswrapper[4743]: I1122 08:40:50.997743 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870c700d-9095-4781-ab16-4cce25d24ed2-config\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:50 crc 
kubenswrapper[4743]: I1122 08:40:50.998396 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.007475 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.012816 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-combined-ca-bundle\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.023767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q7cl\" (UniqueName: \"kubernetes.io/projected/870c700d-9095-4781-ab16-4cce25d24ed2-kube-api-access-6q7cl\") pod \"ovn-controller-metrics-7qctt\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.030122 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-m6t8v"] Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.099410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.099556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-config\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.099640 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.099771 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlhq6\" (UniqueName: \"kubernetes.io/projected/7f7234b0-750b-4f7d-8ccf-1dde836c5700-kube-api-access-hlhq6\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.129888 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-m6t8v"] Nov 22 08:40:51 crc kubenswrapper[4743]: E1122 08:40:51.130468 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-hlhq6 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" podUID="7f7234b0-750b-4f7d-8ccf-1dde836c5700" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.162291 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-bdf8t"] Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.163544 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.167782 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.173518 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bdf8t"] Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.181643 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.201381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.201790 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlhq6\" (UniqueName: \"kubernetes.io/projected/7f7234b0-750b-4f7d-8ccf-1dde836c5700-kube-api-access-hlhq6\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.202632 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.203169 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.203237 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.203331 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-config\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.204838 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-config\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " 
pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.220160 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlhq6\" (UniqueName: \"kubernetes.io/projected/7f7234b0-750b-4f7d-8ccf-1dde836c5700-kube-api-access-hlhq6\") pod \"dnsmasq-dns-5bf47b49b7-m6t8v\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.304542 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-dns-svc\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.304624 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7j69\" (UniqueName: \"kubernetes.io/projected/f3271d86-833c-4795-942b-7ce09c38b132-kube-api-access-m7j69\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.304696 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.304754 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.304776 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-config\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.406441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7j69\" (UniqueName: \"kubernetes.io/projected/f3271d86-833c-4795-942b-7ce09c38b132-kube-api-access-m7j69\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.406509 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.406535 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bdf8t\" 
(UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.406556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-config\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.406696 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-dns-svc\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.407963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-dns-svc\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.407985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.408991 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.409167 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-config\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.448544 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7j69\" (UniqueName: \"kubernetes.io/projected/f3271d86-833c-4795-942b-7ce09c38b132-kube-api-access-m7j69\") pod \"dnsmasq-dns-8554648995-bdf8t\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.487486 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:51 crc kubenswrapper[4743]: E1122 08:40:51.645655 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d63130_217d_400e_afc5_6b6bb3d56658.slice/crio-a13a6453f504348b1fc37cbf718993799543c1d0ff16da7e91ea05103e4dfc4b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d63130_217d_400e_afc5_6b6bb3d56658.slice/crio-conmon-a13a6453f504348b1fc37cbf718993799543c1d0ff16da7e91ea05103e4dfc4b.scope\": RecentStats: unable to find data in memory cache]" Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.833071 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-7qctt"] Nov 22 08:40:51 crc kubenswrapper[4743]: W1122 08:40:51.841482 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod870c700d_9095_4781_ab16_4cce25d24ed2.slice/crio-0bab0ae56585ab496c78b427d6c0cc668cf56d71b1d51bc48c30ab8fbc9736d7 WatchSource:0}: Error finding container 0bab0ae56585ab496c78b427d6c0cc668cf56d71b1d51bc48c30ab8fbc9736d7: Status 404 returned error can't find the container with id 0bab0ae56585ab496c78b427d6c0cc668cf56d71b1d51bc48c30ab8fbc9736d7 Nov 22 08:40:51 crc kubenswrapper[4743]: I1122 08:40:51.961616 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bdf8t"] Nov 22 08:40:51 crc kubenswrapper[4743]: W1122 08:40:51.970870 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3271d86_833c_4795_942b_7ce09c38b132.slice/crio-e14b649be22b3c321ac3e306a0a347bacbbecf6e0ac6379b477826a96b684bf6 WatchSource:0}: Error finding container e14b649be22b3c321ac3e306a0a347bacbbecf6e0ac6379b477826a96b684bf6: Status 404 returned error can't find the container with id e14b649be22b3c321ac3e306a0a347bacbbecf6e0ac6379b477826a96b684bf6 Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.098557 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7qctt" event={"ID":"870c700d-9095-4781-ab16-4cce25d24ed2","Type":"ContainerStarted","Data":"0bab0ae56585ab496c78b427d6c0cc668cf56d71b1d51bc48c30ab8fbc9736d7"} Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.102140 4743 generic.go:334] "Generic (PLEG): container finished" podID="d3d63130-217d-400e-afc5-6b6bb3d56658" containerID="a13a6453f504348b1fc37cbf718993799543c1d0ff16da7e91ea05103e4dfc4b" exitCode=0 Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.102189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d3d63130-217d-400e-afc5-6b6bb3d56658","Type":"ContainerDied","Data":"a13a6453f504348b1fc37cbf718993799543c1d0ff16da7e91ea05103e4dfc4b"} Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.104304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bdf8t" event={"ID":"f3271d86-833c-4795-942b-7ce09c38b132","Type":"ContainerStarted","Data":"e14b649be22b3c321ac3e306a0a347bacbbecf6e0ac6379b477826a96b684bf6"} Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.107164 4743 generic.go:334] "Generic (PLEG): container finished" podID="29734ea4-591c-478e-8030-55fcbac72d3a" 
containerID="07a73aeee6e1a47a17627947bea433502593cbb95367d72965b6e41790f9de15" exitCode=0 Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.107248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"29734ea4-591c-478e-8030-55fcbac72d3a","Type":"ContainerDied","Data":"07a73aeee6e1a47a17627947bea433502593cbb95367d72965b6e41790f9de15"} Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.126859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3a18c86e-9d86-49ee-918f-76de17000e18","Type":"ContainerStarted","Data":"1673fc12fab7560971cb866548c9e23899260f2050054ca6370d37795dfdb742"} Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.141012 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.141110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d","Type":"ContainerStarted","Data":"01419810dd33722ab918d36dfaf000ac018b6b763b7e90647fea3f7eed2c7509"} Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.187365 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.5103083569999995 podStartE2EDuration="21.187344817s" podCreationTimestamp="2025-11-22 08:40:31 +0000 UTC" firstStartedPulling="2025-11-22 08:40:36.568361 +0000 UTC m=+1110.274722052" lastFinishedPulling="2025-11-22 08:40:51.24539745 +0000 UTC m=+1124.951758512" observedRunningTime="2025-11-22 08:40:52.172300743 +0000 UTC m=+1125.878661795" watchObservedRunningTime="2025-11-22 08:40:52.187344817 +0000 UTC m=+1125.893705879" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.203097 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=9.443930904 podStartE2EDuration="23.203071251s" podCreationTimestamp="2025-11-22 08:40:29 +0000 UTC" firstStartedPulling="2025-11-22 08:40:37.459366027 +0000 UTC m=+1111.165727079" lastFinishedPulling="2025-11-22 08:40:51.218506374 +0000 UTC m=+1124.924867426" observedRunningTime="2025-11-22 08:40:52.199254831 +0000 UTC m=+1125.905615883" watchObservedRunningTime="2025-11-22 08:40:52.203071251 +0000 UTC m=+1125.909432303" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.274224 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.345550 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-dns-svc\") pod \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.345697 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-ovsdbserver-nb\") pod \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.345755 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlhq6\" (UniqueName: \"kubernetes.io/projected/7f7234b0-750b-4f7d-8ccf-1dde836c5700-kube-api-access-hlhq6\") pod \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.345808 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-config\") pod \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\" (UID: \"7f7234b0-750b-4f7d-8ccf-1dde836c5700\") " Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.346660 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-config" (OuterVolumeSpecName: "config") pod "7f7234b0-750b-4f7d-8ccf-1dde836c5700" (UID: "7f7234b0-750b-4f7d-8ccf-1dde836c5700"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.346974 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f7234b0-750b-4f7d-8ccf-1dde836c5700" (UID: "7f7234b0-750b-4f7d-8ccf-1dde836c5700"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.347262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f7234b0-750b-4f7d-8ccf-1dde836c5700" (UID: "7f7234b0-750b-4f7d-8ccf-1dde836c5700"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.352116 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f7234b0-750b-4f7d-8ccf-1dde836c5700-kube-api-access-hlhq6" (OuterVolumeSpecName: "kube-api-access-hlhq6") pod "7f7234b0-750b-4f7d-8ccf-1dde836c5700" (UID: "7f7234b0-750b-4f7d-8ccf-1dde836c5700"). InnerVolumeSpecName "kube-api-access-hlhq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.447665 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.447711 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.447725 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlhq6\" (UniqueName: \"kubernetes.io/projected/7f7234b0-750b-4f7d-8ccf-1dde836c5700-kube-api-access-hlhq6\") on node \"crc\" DevicePath \"\"" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.447737 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7234b0-750b-4f7d-8ccf-1dde836c5700-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:40:52 crc kubenswrapper[4743]: I1122 08:40:52.731917 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.149845 4743 generic.go:334] "Generic (PLEG): container finished" podID="f3271d86-833c-4795-942b-7ce09c38b132" containerID="d5a03a0b52e1fdd0a746190febcc0a88f6a5de85601c7bb583d66d8236cb71f3" exitCode=0 Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.149912 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bdf8t" event={"ID":"f3271d86-833c-4795-942b-7ce09c38b132","Type":"ContainerDied","Data":"d5a03a0b52e1fdd0a746190febcc0a88f6a5de85601c7bb583d66d8236cb71f3"} Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.162959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"29734ea4-591c-478e-8030-55fcbac72d3a","Type":"ContainerStarted","Data":"c89c36b86576b82473835bf0c40ac138380844e7593a7241e2bf4b37e98aadf1"} Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.164997 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7qctt" event={"ID":"870c700d-9095-4781-ab16-4cce25d24ed2","Type":"ContainerStarted","Data":"544ee1789868b1e74b94d551ca242dd748b844448c139b17d4767a8ea19814b9"} Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.168816 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-m6t8v" Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.169008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d3d63130-217d-400e-afc5-6b6bb3d56658","Type":"ContainerStarted","Data":"9b8889acf70f714f96a12fef606aa84ad0497560bdf80aa3dddac96e77684d76"} Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.216867 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.291351332 podStartE2EDuration="33.216846131s" podCreationTimestamp="2025-11-22 08:40:20 +0000 UTC" firstStartedPulling="2025-11-22 08:40:35.667370985 +0000 UTC m=+1109.373732117" lastFinishedPulling="2025-11-22 08:40:45.592865864 +0000 UTC m=+1119.299226916" observedRunningTime="2025-11-22 08:40:53.198483281 +0000 UTC m=+1126.904844333" watchObservedRunningTime="2025-11-22 08:40:53.216846131 +0000 UTC m=+1126.923207183" Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.233478 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.476858153 podStartE2EDuration="32.23346411s" podCreationTimestamp="2025-11-22 08:40:21 +0000 UTC" firstStartedPulling="2025-11-22 08:40:36.43734382 +0000 UTC m=+1110.143704872" lastFinishedPulling="2025-11-22 08:40:46.193949587 +0000 UTC m=+1119.900310829" observedRunningTime="2025-11-22 08:40:53.231489733 +0000 UTC m=+1126.937850795" watchObservedRunningTime="2025-11-22 08:40:53.23346411 +0000 UTC m=+1126.939825162" Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.251391 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-7qctt" podStartSLOduration=3.251367207 podStartE2EDuration="3.251367207s" podCreationTimestamp="2025-11-22 08:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:40:53.246939499 +0000 UTC m=+1126.953300581" watchObservedRunningTime="2025-11-22 08:40:53.251367207 +0000 UTC m=+1126.957728259" Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.346452 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-m6t8v"] Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.353877 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-m6t8v"] Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.355385 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.731790 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:53 crc kubenswrapper[4743]: I1122 08:40:53.767381 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:54 crc kubenswrapper[4743]: I1122 08:40:54.181183 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bdf8t" event={"ID":"f3271d86-833c-4795-942b-7ce09c38b132","Type":"ContainerStarted","Data":"4f5ab8743f30a6c029f905e16a29418260fb63ebb2b97d8b847767839c5a2732"} Nov 22 08:40:54 crc kubenswrapper[4743]: I1122 08:40:54.221999 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-bdf8t" podStartSLOduration=3.221980631 
podStartE2EDuration="3.221980631s" podCreationTimestamp="2025-11-22 08:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:40:54.213716113 +0000 UTC m=+1127.920077175" watchObservedRunningTime="2025-11-22 08:40:54.221980631 +0000 UTC m=+1127.928341683" Nov 22 08:40:54 crc kubenswrapper[4743]: I1122 08:40:54.241662 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 22 08:40:54 crc kubenswrapper[4743]: I1122 08:40:54.508586 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:54 crc kubenswrapper[4743]: I1122 08:40:54.543253 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.164384 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f7234b0-750b-4f7d-8ccf-1dde836c5700" path="/var/lib/kubelet/pods/7f7234b0-750b-4f7d-8ccf-1dde836c5700/volumes" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.189863 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.189907 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.239206 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.295913 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bdf8t"] Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.325482 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-sx88h"] Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.327174 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.341912 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.362531 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-sx88h"] Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.397331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.397460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.397516 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-config\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.397553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.397614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cjh5\" (UniqueName: \"kubernetes.io/projected/da5e223c-67c0-4f09-8f50-bc6be61305d1-kube-api-access-8cjh5\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.499208 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.499259 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-config\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.499300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 
22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.499345 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cjh5\" (UniqueName: \"kubernetes.io/projected/da5e223c-67c0-4f09-8f50-bc6be61305d1-kube-api-access-8cjh5\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.499388 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.500286 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.500384 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-config\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.500417 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.500838 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.521938 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cjh5\" (UniqueName: \"kubernetes.io/projected/da5e223c-67c0-4f09-8f50-bc6be61305d1-kube-api-access-8cjh5\") pod \"dnsmasq-dns-b8fbc5445-sx88h\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.560216 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.563374 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.569101 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-ggkt4" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.569392 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.569565 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.574005 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.586671 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.602165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-scripts\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.602254 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.602397 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.602427 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.602460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9817865-d957-42d3-8edb-6800e1075d23-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.602602 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-config\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.602623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp55t\" (UniqueName: \"kubernetes.io/projected/b9817865-d957-42d3-8edb-6800e1075d23-kube-api-access-fp55t\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: 
I1122 08:40:55.653865 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.704479 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.704813 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.704861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.704907 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9817865-d957-42d3-8edb-6800e1075d23-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.704981 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-config\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.705009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp55t\" (UniqueName: \"kubernetes.io/projected/b9817865-d957-42d3-8edb-6800e1075d23-kube-api-access-fp55t\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.705040 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-scripts\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.705984 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9817865-d957-42d3-8edb-6800e1075d23-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.706183 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-scripts\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.706249 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-config\") 
pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.713153 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.720970 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.738443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.738935 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp55t\" (UniqueName: \"kubernetes.io/projected/b9817865-d957-42d3-8edb-6800e1075d23-kube-api-access-fp55t\") pod \"ovn-northd-0\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " pod="openstack/ovn-northd-0" Nov 22 08:40:55 crc kubenswrapper[4743]: I1122 08:40:55.904296 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.171460 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-sx88h"] Nov 22 08:40:56 crc kubenswrapper[4743]: W1122 08:40:56.184550 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda5e223c_67c0_4f09_8f50_bc6be61305d1.slice/crio-df30e24e5c0c990b9f39654dadf06d8343ce82543b8641427a3b375da3739148 WatchSource:0}: Error finding container df30e24e5c0c990b9f39654dadf06d8343ce82543b8641427a3b375da3739148: Status 404 returned error can't find the container with id df30e24e5c0c990b9f39654dadf06d8343ce82543b8641427a3b375da3739148 Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.200225 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" event={"ID":"da5e223c-67c0-4f09-8f50-bc6be61305d1","Type":"ContainerStarted","Data":"df30e24e5c0c990b9f39654dadf06d8343ce82543b8641427a3b375da3739148"} Nov 22 08:40:56 crc kubenswrapper[4743]: W1122 08:40:56.402010 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9817865_d957_42d3_8edb_6800e1075d23.slice/crio-d99accdee1e475696e1a788c3d6b69812093615938aecdf117c478b35231380f WatchSource:0}: Error finding container d99accdee1e475696e1a788c3d6b69812093615938aecdf117c478b35231380f: Status 404 returned error can't find the container with id d99accdee1e475696e1a788c3d6b69812093615938aecdf117c478b35231380f Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.410684 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.488243 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/swift-storage-0"] Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.495349 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.497351 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.497416 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-pqg77" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.497806 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.498348 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.511258 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.638478 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.638865 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.638905 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-lock\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.638968 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlm8c\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-kube-api-access-tlm8c\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.639080 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-cache\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.740314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.740370 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " 
pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.740405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-lock\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.740456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlm8c\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-kube-api-access-tlm8c\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.740537 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-cache\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: E1122 08:40:56.740594 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 08:40:56 crc kubenswrapper[4743]: E1122 08:40:56.740623 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 08:40:56 crc kubenswrapper[4743]: E1122 08:40:56.740689 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift podName:1638fe70-d5cb-4edc-9513-e5ae475c0909 nodeName:}" failed. No retries permitted until 2025-11-22 08:40:57.24066283 +0000 UTC m=+1130.947023922 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift") pod "swift-storage-0" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909") : configmap "swift-ring-files" not found Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.740993 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.741100 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-lock\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.741163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-cache\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.764178 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlm8c\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-kube-api-access-tlm8c\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:56 crc kubenswrapper[4743]: I1122 08:40:56.765313 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.040253 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-9fxgn"] Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.041282 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.043546 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.043750 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.043830 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.059476 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9fxgn"] Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.147099 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-etc-swift\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.147287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-dispersionconf\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.147337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-ring-data-devices\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.147453 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-674z7\" (UniqueName: \"kubernetes.io/projected/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-kube-api-access-674z7\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.147549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-swiftconf\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.147647 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-scripts\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.147700 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-combined-ca-bundle\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 
08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.206481 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9817865-d957-42d3-8edb-6800e1075d23","Type":"ContainerStarted","Data":"d99accdee1e475696e1a788c3d6b69812093615938aecdf117c478b35231380f"} Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.207073 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-bdf8t" podUID="f3271d86-833c-4795-942b-7ce09c38b132" containerName="dnsmasq-dns" containerID="cri-o://4f5ab8743f30a6c029f905e16a29418260fb63ebb2b97d8b847767839c5a2732" gracePeriod=10 Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.249543 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-dispersionconf\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.250372 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-ring-data-devices\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.250538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-674z7\" (UniqueName: \"kubernetes.io/projected/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-kube-api-access-674z7\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.250679 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-swiftconf\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.250726 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.250762 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-scripts\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.250788 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-combined-ca-bundle\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.250851 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-etc-swift\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.250992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-ring-data-devices\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.251240 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-etc-swift\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: E1122 08:40:57.251418 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 08:40:57 crc kubenswrapper[4743]: E1122 08:40:57.251444 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.251484 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-scripts\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: E1122 08:40:57.251489 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift podName:1638fe70-d5cb-4edc-9513-e5ae475c0909 nodeName:}" failed. No retries permitted until 2025-11-22 08:40:58.251470848 +0000 UTC m=+1131.957831960 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift") pod "swift-storage-0" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909") : configmap "swift-ring-files" not found Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.254610 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-swiftconf\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.258243 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-dispersionconf\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.258889 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-combined-ca-bundle\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.271973 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-674z7\" (UniqueName: \"kubernetes.io/projected/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-kube-api-access-674z7\") pod \"swift-ring-rebalance-9fxgn\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.358983 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:40:57 crc kubenswrapper[4743]: I1122 08:40:57.787084 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-9fxgn"] Nov 22 08:40:57 crc kubenswrapper[4743]: W1122 08:40:57.816169 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90e84b78_308d_41c1_b9a7_5d0c4cb80d44.slice/crio-e1b1ca8aaa8d0ef9c46ba07f72ac7795172d376aeec9ed1cdd92ddab8ab6253b WatchSource:0}: Error finding container e1b1ca8aaa8d0ef9c46ba07f72ac7795172d376aeec9ed1cdd92ddab8ab6253b: Status 404 returned error can't find the container with id e1b1ca8aaa8d0ef9c46ba07f72ac7795172d376aeec9ed1cdd92ddab8ab6253b Nov 22 08:40:58 crc kubenswrapper[4743]: I1122 08:40:58.214434 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9fxgn" event={"ID":"90e84b78-308d-41c1-b9a7-5d0c4cb80d44","Type":"ContainerStarted","Data":"e1b1ca8aaa8d0ef9c46ba07f72ac7795172d376aeec9ed1cdd92ddab8ab6253b"} Nov 22 08:40:58 crc kubenswrapper[4743]: I1122 08:40:58.217159 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" event={"ID":"da5e223c-67c0-4f09-8f50-bc6be61305d1","Type":"ContainerStarted","Data":"f2886216dda914ee00b1675fd8db61d4ea910f9773dfec2868ea4686645bbea2"} Nov 22 08:40:58 crc kubenswrapper[4743]: I1122 08:40:58.267436 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:40:58 crc kubenswrapper[4743]: E1122 08:40:58.267627 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 08:40:58 crc kubenswrapper[4743]: E1122 08:40:58.267647 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 08:40:58 crc kubenswrapper[4743]: E1122 08:40:58.267705 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift podName:1638fe70-d5cb-4edc-9513-e5ae475c0909 nodeName:}" failed. No retries permitted until 2025-11-22 08:41:00.267690818 +0000 UTC m=+1133.974051870 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift") pod "swift-storage-0" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909") : configmap "swift-ring-files" not found Nov 22 08:40:59 crc kubenswrapper[4743]: I1122 08:40:59.228371 4743 generic.go:334] "Generic (PLEG): container finished" podID="da5e223c-67c0-4f09-8f50-bc6be61305d1" containerID="f2886216dda914ee00b1675fd8db61d4ea910f9773dfec2868ea4686645bbea2" exitCode=0 Nov 22 08:40:59 crc kubenswrapper[4743]: I1122 08:40:59.228828 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" event={"ID":"da5e223c-67c0-4f09-8f50-bc6be61305d1","Type":"ContainerDied","Data":"f2886216dda914ee00b1675fd8db61d4ea910f9773dfec2868ea4686645bbea2"} Nov 22 08:40:59 crc kubenswrapper[4743]: I1122 08:40:59.231448 4743 generic.go:334] "Generic (PLEG): container finished" podID="f3271d86-833c-4795-942b-7ce09c38b132" containerID="4f5ab8743f30a6c029f905e16a29418260fb63ebb2b97d8b847767839c5a2732" exitCode=0 Nov 22 08:40:59 crc kubenswrapper[4743]: I1122 08:40:59.231526 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bdf8t" event={"ID":"f3271d86-833c-4795-942b-7ce09c38b132","Type":"ContainerDied","Data":"4f5ab8743f30a6c029f905e16a29418260fb63ebb2b97d8b847767839c5a2732"} Nov 22 08:41:00 crc kubenswrapper[4743]: I1122 08:41:00.310381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:41:00 crc kubenswrapper[4743]: E1122 08:41:00.310712 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 08:41:00 crc kubenswrapper[4743]: E1122 08:41:00.310751 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 08:41:00 crc kubenswrapper[4743]: E1122 08:41:00.310826 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift podName:1638fe70-d5cb-4edc-9513-e5ae475c0909 nodeName:}" failed. No retries permitted until 2025-11-22 08:41:04.310802085 +0000 UTC m=+1138.017163137 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift") pod "swift-storage-0" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909") : configmap "swift-ring-files" not found Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.043811 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.124530 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-dns-svc\") pod \"f3271d86-833c-4795-942b-7ce09c38b132\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.124634 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-config\") pod \"f3271d86-833c-4795-942b-7ce09c38b132\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.124737 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-sb\") pod \"f3271d86-833c-4795-942b-7ce09c38b132\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.124823 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7j69\" (UniqueName: \"kubernetes.io/projected/f3271d86-833c-4795-942b-7ce09c38b132-kube-api-access-m7j69\") pod \"f3271d86-833c-4795-942b-7ce09c38b132\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.124842 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-nb\") pod \"f3271d86-833c-4795-942b-7ce09c38b132\" (UID: \"f3271d86-833c-4795-942b-7ce09c38b132\") " Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.131201 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3271d86-833c-4795-942b-7ce09c38b132-kube-api-access-m7j69" (OuterVolumeSpecName: "kube-api-access-m7j69") pod "f3271d86-833c-4795-942b-7ce09c38b132" (UID: "f3271d86-833c-4795-942b-7ce09c38b132"). InnerVolumeSpecName "kube-api-access-m7j69". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.180044 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f3271d86-833c-4795-942b-7ce09c38b132" (UID: "f3271d86-833c-4795-942b-7ce09c38b132"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.180159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f3271d86-833c-4795-942b-7ce09c38b132" (UID: "f3271d86-833c-4795-942b-7ce09c38b132"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.180824 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-config" (OuterVolumeSpecName: "config") pod "f3271d86-833c-4795-942b-7ce09c38b132" (UID: "f3271d86-833c-4795-942b-7ce09c38b132"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.185459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f3271d86-833c-4795-942b-7ce09c38b132" (UID: "f3271d86-833c-4795-942b-7ce09c38b132"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.227168 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.228260 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.228278 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.228291 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7j69\" (UniqueName: \"kubernetes.io/projected/f3271d86-833c-4795-942b-7ce09c38b132-kube-api-access-m7j69\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.228302 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f3271d86-833c-4795-942b-7ce09c38b132-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.248127 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-bdf8t" event={"ID":"f3271d86-833c-4795-942b-7ce09c38b132","Type":"ContainerDied","Data":"e14b649be22b3c321ac3e306a0a347bacbbecf6e0ac6379b477826a96b684bf6"} Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.248179 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-bdf8t" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.248182 4743 scope.go:117] "RemoveContainer" containerID="4f5ab8743f30a6c029f905e16a29418260fb63ebb2b97d8b847767839c5a2732" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.283095 4743 scope.go:117] "RemoveContainer" containerID="d5a03a0b52e1fdd0a746190febcc0a88f6a5de85601c7bb583d66d8236cb71f3" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.285781 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bdf8t"] Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.292047 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-bdf8t"] Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.547656 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 22 08:41:01 crc kubenswrapper[4743]: I1122 08:41:01.547693 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 22 08:41:02 crc kubenswrapper[4743]: I1122 08:41:02.259983 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" event={"ID":"da5e223c-67c0-4f09-8f50-bc6be61305d1","Type":"ContainerStarted","Data":"5215c7383d75012abf3d0d94618fad8a23559b994de0167f56986ac7a14b929d"} Nov 22 08:41:02 crc kubenswrapper[4743]: I1122 08:41:02.972106 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 22 08:41:02 crc kubenswrapper[4743]: I1122 08:41:02.972244 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 22 08:41:03 crc kubenswrapper[4743]: I1122 08:41:03.162313 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3271d86-833c-4795-942b-7ce09c38b132" path="/var/lib/kubelet/pods/f3271d86-833c-4795-942b-7ce09c38b132/volumes" Nov 22 08:41:03 crc kubenswrapper[4743]: I1122 08:41:03.267432 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:41:03 crc kubenswrapper[4743]: I1122 08:41:03.289983 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" podStartSLOduration=8.2899655 podStartE2EDuration="8.2899655s" podCreationTimestamp="2025-11-22 08:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:41:03.283684519 +0000 UTC m=+1136.990045571" watchObservedRunningTime="2025-11-22 08:41:03.2899655 +0000 UTC m=+1136.996326552" Nov 22 08:41:03 crc kubenswrapper[4743]: I1122 08:41:03.807216 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 22 08:41:03 crc kubenswrapper[4743]: I1122 08:41:03.890752 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="d3d63130-217d-400e-afc5-6b6bb3d56658" containerName="galera" probeResult="failure" output=< Nov 22 08:41:03 crc kubenswrapper[4743]: wsrep_local_state_comment (Joined) differs from Synced Nov 22 08:41:03 crc kubenswrapper[4743]: > Nov 22 08:41:04 crc kubenswrapper[4743]: I1122 08:41:04.380711 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:41:04 crc kubenswrapper[4743]: E1122 08:41:04.380914 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 08:41:04 crc kubenswrapper[4743]: E1122 08:41:04.380934 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 08:41:04 crc kubenswrapper[4743]: E1122 08:41:04.380996 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift podName:1638fe70-d5cb-4edc-9513-e5ae475c0909 nodeName:}" failed. No retries permitted until 2025-11-22 08:41:12.380976118 +0000 UTC m=+1146.087337170 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift") pod "swift-storage-0" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909") : configmap "swift-ring-files" not found Nov 22 08:41:09 crc kubenswrapper[4743]: I1122 08:41:09.226623 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 22 08:41:09 crc kubenswrapper[4743]: I1122 08:41:09.313445 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 22 08:41:09 crc kubenswrapper[4743]: I1122 08:41:09.313761 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9817865-d957-42d3-8edb-6800e1075d23","Type":"ContainerStarted","Data":"1f420d1e2699e276d82c94d18dd411a4b04324712350d57b7ef8e6cdb952414a"} Nov 22 08:41:09 crc kubenswrapper[4743]: I1122 08:41:09.313846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9817865-d957-42d3-8edb-6800e1075d23","Type":"ContainerStarted","Data":"44e22b0e556cf479c4ab148fe02b8b602f8d6a658164bfd210e15bbe9a5c9282"} Nov 22 08:41:09 crc kubenswrapper[4743]: I1122 08:41:09.378470 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.973551108 podStartE2EDuration="14.378451494s" podCreationTimestamp="2025-11-22 08:40:55 +0000 UTC" firstStartedPulling="2025-11-22 08:40:56.407988911 +0000 UTC m=+1130.114349963" lastFinishedPulling="2025-11-22 08:41:08.812889287 +0000 UTC m=+1142.519250349" observedRunningTime="2025-11-22 08:41:09.370643389 +0000 UTC m=+1143.077004441" watchObservedRunningTime="2025-11-22 08:41:09.378451494 +0000 UTC m=+1143.084812546" Nov 22 08:41:10 crc kubenswrapper[4743]: I1122 08:41:10.320476 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Nov 22 08:41:10 crc kubenswrapper[4743]: I1122 08:41:10.655774 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:41:10 crc kubenswrapper[4743]: I1122 08:41:10.718193 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dsh6z"] Nov 22 08:41:10 crc kubenswrapper[4743]: I1122 08:41:10.718426 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" podUID="03f48b82-41c0-4673-bcae-aef0130d447a" containerName="dnsmasq-dns" 
containerID="cri-o://5dd3ac541733ba6c4e04ad719e2c73941bdf90aa7977960df0d90c3de86b1b11" gracePeriod=10 Nov 22 08:41:11 crc kubenswrapper[4743]: I1122 08:41:11.328389 4743 generic.go:334] "Generic (PLEG): container finished" podID="03f48b82-41c0-4673-bcae-aef0130d447a" containerID="5dd3ac541733ba6c4e04ad719e2c73941bdf90aa7977960df0d90c3de86b1b11" exitCode=0 Nov 22 08:41:11 crc kubenswrapper[4743]: I1122 08:41:11.329169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" event={"ID":"03f48b82-41c0-4673-bcae-aef0130d447a","Type":"ContainerDied","Data":"5dd3ac541733ba6c4e04ad719e2c73941bdf90aa7977960df0d90c3de86b1b11"} Nov 22 08:41:11 crc kubenswrapper[4743]: I1122 08:41:11.627331 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 22 08:41:11 crc kubenswrapper[4743]: I1122 08:41:11.908290 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.031061 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-config\") pod \"03f48b82-41c0-4673-bcae-aef0130d447a\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.031237 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-dns-svc\") pod \"03f48b82-41c0-4673-bcae-aef0130d447a\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.031282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49ljx\" (UniqueName: \"kubernetes.io/projected/03f48b82-41c0-4673-bcae-aef0130d447a-kube-api-access-49ljx\") pod \"03f48b82-41c0-4673-bcae-aef0130d447a\" (UID: \"03f48b82-41c0-4673-bcae-aef0130d447a\") " Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.036368 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03f48b82-41c0-4673-bcae-aef0130d447a-kube-api-access-49ljx" (OuterVolumeSpecName: "kube-api-access-49ljx") pod "03f48b82-41c0-4673-bcae-aef0130d447a" (UID: "03f48b82-41c0-4673-bcae-aef0130d447a"). InnerVolumeSpecName "kube-api-access-49ljx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.081319 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "03f48b82-41c0-4673-bcae-aef0130d447a" (UID: "03f48b82-41c0-4673-bcae-aef0130d447a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.081351 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-config" (OuterVolumeSpecName: "config") pod "03f48b82-41c0-4673-bcae-aef0130d447a" (UID: "03f48b82-41c0-4673-bcae-aef0130d447a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.133206 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.133275 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49ljx\" (UniqueName: \"kubernetes.io/projected/03f48b82-41c0-4673-bcae-aef0130d447a-kube-api-access-49ljx\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.133301 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03f48b82-41c0-4673-bcae-aef0130d447a-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.342867 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" event={"ID":"03f48b82-41c0-4673-bcae-aef0130d447a","Type":"ContainerDied","Data":"a345895a6ff0581070eb7e20eb5655dfb9a26f1296023b2677e99e37057199fd"} Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.342921 4743 scope.go:117] "RemoveContainer" containerID="5dd3ac541733ba6c4e04ad719e2c73941bdf90aa7977960df0d90c3de86b1b11" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.343029 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-dsh6z" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.350089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9fxgn" event={"ID":"90e84b78-308d-41c1-b9a7-5d0c4cb80d44","Type":"ContainerStarted","Data":"c2cd99ce8d0b171935d2ad05dc8e366d1a81b0cd4e4bf014f079a93f8d17c5ad"} Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.366749 4743 scope.go:117] "RemoveContainer" containerID="979773088cbbc502b234ec63667fff46b53fc263d2a2a09ebb4e990b2336e837" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.376635 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-9fxgn" podStartSLOduration=1.5619633880000001 podStartE2EDuration="15.376472133s" podCreationTimestamp="2025-11-22 08:40:57 +0000 UTC" firstStartedPulling="2025-11-22 08:40:57.825565502 +0000 UTC m=+1131.531926554" lastFinishedPulling="2025-11-22 08:41:11.640074247 +0000 UTC m=+1145.346435299" observedRunningTime="2025-11-22 08:41:12.373882089 +0000 UTC m=+1146.080243151" watchObservedRunningTime="2025-11-22 08:41:12.376472133 +0000 UTC m=+1146.082833185" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.403156 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dsh6z"] Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.408817 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-dsh6z"] Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.443274 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:41:12 crc kubenswrapper[4743]: E1122 08:41:12.447309 4743 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Nov 22 08:41:12 crc kubenswrapper[4743]: E1122 
08:41:12.447338 4743 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Nov 22 08:41:12 crc kubenswrapper[4743]: E1122 08:41:12.448069 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift podName:1638fe70-d5cb-4edc-9513-e5ae475c0909 nodeName:}" failed. No retries permitted until 2025-11-22 08:41:28.447381339 +0000 UTC m=+1162.153742431 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift") pod "swift-storage-0" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909") : configmap "swift-ring-files" not found Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.992634 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f92e-account-create-f4xn7"] Nov 22 08:41:12 crc kubenswrapper[4743]: E1122 08:41:12.992949 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f48b82-41c0-4673-bcae-aef0130d447a" containerName="init" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.992961 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f48b82-41c0-4673-bcae-aef0130d447a" containerName="init" Nov 22 08:41:12 crc kubenswrapper[4743]: E1122 08:41:12.992987 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3271d86-833c-4795-942b-7ce09c38b132" containerName="dnsmasq-dns" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.992992 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3271d86-833c-4795-942b-7ce09c38b132" containerName="dnsmasq-dns" Nov 22 08:41:12 crc kubenswrapper[4743]: E1122 08:41:12.993005 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3271d86-833c-4795-942b-7ce09c38b132" containerName="init" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.993012 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3271d86-833c-4795-942b-7ce09c38b132" containerName="init" Nov 22 08:41:12 crc kubenswrapper[4743]: E1122 08:41:12.993022 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03f48b82-41c0-4673-bcae-aef0130d447a" containerName="dnsmasq-dns" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.993027 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="03f48b82-41c0-4673-bcae-aef0130d447a" containerName="dnsmasq-dns" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.993174 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3271d86-833c-4795-942b-7ce09c38b132" containerName="dnsmasq-dns" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.993184 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="03f48b82-41c0-4673-bcae-aef0130d447a" containerName="dnsmasq-dns" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.993788 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f92e-account-create-f4xn7" Nov 22 08:41:12 crc kubenswrapper[4743]: I1122 08:41:12.995323 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.003750 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f92e-account-create-f4xn7"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.078722 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-s6m4s"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.079670 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s6m4s" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.090849 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s6m4s"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.156540 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131c329f-8c7c-4d30-a0c3-37ecaac9db82-operator-scripts\") pod \"keystone-db-create-s6m4s\" (UID: \"131c329f-8c7c-4d30-a0c3-37ecaac9db82\") " pod="openstack/keystone-db-create-s6m4s" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.156711 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-operator-scripts\") pod \"keystone-f92e-account-create-f4xn7\" (UID: \"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc\") " pod="openstack/keystone-f92e-account-create-f4xn7" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.156765 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gbqd\" (UniqueName: \"kubernetes.io/projected/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-kube-api-access-7gbqd\") pod \"keystone-f92e-account-create-f4xn7\" (UID: \"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc\") " pod="openstack/keystone-f92e-account-create-f4xn7" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.156801 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5pt\" (UniqueName: \"kubernetes.io/projected/131c329f-8c7c-4d30-a0c3-37ecaac9db82-kube-api-access-pw5pt\") pod \"keystone-db-create-s6m4s\" (UID: \"131c329f-8c7c-4d30-a0c3-37ecaac9db82\") " pod="openstack/keystone-db-create-s6m4s" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.161435 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03f48b82-41c0-4673-bcae-aef0130d447a" path="/var/lib/kubelet/pods/03f48b82-41c0-4673-bcae-aef0130d447a/volumes" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.237370 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-s4q44"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.238402 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-s4q44" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.250164 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s4q44"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.259710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gbqd\" (UniqueName: \"kubernetes.io/projected/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-kube-api-access-7gbqd\") pod \"keystone-f92e-account-create-f4xn7\" (UID: \"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc\") " pod="openstack/keystone-f92e-account-create-f4xn7" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.259785 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5pt\" (UniqueName: \"kubernetes.io/projected/131c329f-8c7c-4d30-a0c3-37ecaac9db82-kube-api-access-pw5pt\") pod \"keystone-db-create-s6m4s\" (UID: \"131c329f-8c7c-4d30-a0c3-37ecaac9db82\") " pod="openstack/keystone-db-create-s6m4s" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.259869 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131c329f-8c7c-4d30-a0c3-37ecaac9db82-operator-scripts\") pod \"keystone-db-create-s6m4s\" (UID: \"131c329f-8c7c-4d30-a0c3-37ecaac9db82\") " pod="openstack/keystone-db-create-s6m4s" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.259959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-operator-scripts\") pod \"keystone-f92e-account-create-f4xn7\" (UID: \"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc\") " pod="openstack/keystone-f92e-account-create-f4xn7" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.261106 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131c329f-8c7c-4d30-a0c3-37ecaac9db82-operator-scripts\") pod \"keystone-db-create-s6m4s\" (UID: \"131c329f-8c7c-4d30-a0c3-37ecaac9db82\") " pod="openstack/keystone-db-create-s6m4s" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.261191 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-operator-scripts\") pod \"keystone-f92e-account-create-f4xn7\" (UID: \"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc\") " pod="openstack/keystone-f92e-account-create-f4xn7" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.282756 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gbqd\" (UniqueName: \"kubernetes.io/projected/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-kube-api-access-7gbqd\") pod \"keystone-f92e-account-create-f4xn7\" (UID: \"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc\") " pod="openstack/keystone-f92e-account-create-f4xn7" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.293069 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5pt\" (UniqueName: \"kubernetes.io/projected/131c329f-8c7c-4d30-a0c3-37ecaac9db82-kube-api-access-pw5pt\") pod \"keystone-db-create-s6m4s\" (UID: \"131c329f-8c7c-4d30-a0c3-37ecaac9db82\") " pod="openstack/keystone-db-create-s6m4s" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.303707 4743 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-0984-account-create-dwgfw"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.305113 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0984-account-create-dwgfw" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.308335 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.310091 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f92e-account-create-f4xn7" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.312063 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0984-account-create-dwgfw"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.361041 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62zhb\" (UniqueName: \"kubernetes.io/projected/054d2889-0839-4b71-9515-904051c64bc7-kube-api-access-62zhb\") pod \"placement-db-create-s4q44\" (UID: \"054d2889-0839-4b71-9515-904051c64bc7\") " pod="openstack/placement-db-create-s4q44" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.361138 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/054d2889-0839-4b71-9515-904051c64bc7-operator-scripts\") pod \"placement-db-create-s4q44\" (UID: \"054d2889-0839-4b71-9515-904051c64bc7\") " pod="openstack/placement-db-create-s4q44" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.396529 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s6m4s" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.463076 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdzhp\" (UniqueName: \"kubernetes.io/projected/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-kube-api-access-mdzhp\") pod \"placement-0984-account-create-dwgfw\" (UID: \"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82\") " pod="openstack/placement-0984-account-create-dwgfw" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.463354 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/054d2889-0839-4b71-9515-904051c64bc7-operator-scripts\") pod \"placement-db-create-s4q44\" (UID: \"054d2889-0839-4b71-9515-904051c64bc7\") " pod="openstack/placement-db-create-s4q44" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.463497 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-operator-scripts\") pod \"placement-0984-account-create-dwgfw\" (UID: \"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82\") " pod="openstack/placement-0984-account-create-dwgfw" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.463522 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62zhb\" (UniqueName: \"kubernetes.io/projected/054d2889-0839-4b71-9515-904051c64bc7-kube-api-access-62zhb\") pod \"placement-db-create-s4q44\" (UID: \"054d2889-0839-4b71-9515-904051c64bc7\") " pod="openstack/placement-db-create-s4q44" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.469311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/054d2889-0839-4b71-9515-904051c64bc7-operator-scripts\") pod \"placement-db-create-s4q44\" (UID: \"054d2889-0839-4b71-9515-904051c64bc7\") " pod="openstack/placement-db-create-s4q44" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.497564 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62zhb\" (UniqueName: \"kubernetes.io/projected/054d2889-0839-4b71-9515-904051c64bc7-kube-api-access-62zhb\") pod \"placement-db-create-s4q44\" (UID: \"054d2889-0839-4b71-9515-904051c64bc7\") " pod="openstack/placement-db-create-s4q44" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.512072 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-c4zmd"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.513094 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c4zmd" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.529864 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-c4zmd"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.560048 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s4q44" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.570063 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-operator-scripts\") pod \"placement-0984-account-create-dwgfw\" (UID: \"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82\") " pod="openstack/placement-0984-account-create-dwgfw" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.570142 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdzhp\" (UniqueName: \"kubernetes.io/projected/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-kube-api-access-mdzhp\") pod \"placement-0984-account-create-dwgfw\" (UID: \"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82\") " pod="openstack/placement-0984-account-create-dwgfw" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.571131 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-operator-scripts\") pod \"placement-0984-account-create-dwgfw\" (UID: \"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82\") " pod="openstack/placement-0984-account-create-dwgfw" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.602183 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdzhp\" (UniqueName: \"kubernetes.io/projected/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-kube-api-access-mdzhp\") pod \"placement-0984-account-create-dwgfw\" (UID: \"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82\") " pod="openstack/placement-0984-account-create-dwgfw" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.640110 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-4a60-account-create-qgm25"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.646703 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-4a60-account-create-qgm25" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.648241 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4a60-account-create-qgm25"] Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.650490 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.671867 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wqb4\" (UniqueName: \"kubernetes.io/projected/1c41a20c-5a07-4187-bb4f-3f900256ea49-kube-api-access-4wqb4\") pod \"glance-db-create-c4zmd\" (UID: \"1c41a20c-5a07-4187-bb4f-3f900256ea49\") " pod="openstack/glance-db-create-c4zmd" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.671950 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c41a20c-5a07-4187-bb4f-3f900256ea49-operator-scripts\") pod \"glance-db-create-c4zmd\" (UID: \"1c41a20c-5a07-4187-bb4f-3f900256ea49\") " pod="openstack/glance-db-create-c4zmd" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.709441 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0984-account-create-dwgfw" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.784891 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wqb4\" (UniqueName: \"kubernetes.io/projected/1c41a20c-5a07-4187-bb4f-3f900256ea49-kube-api-access-4wqb4\") pod \"glance-db-create-c4zmd\" (UID: \"1c41a20c-5a07-4187-bb4f-3f900256ea49\") " pod="openstack/glance-db-create-c4zmd" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.785147 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c41a20c-5a07-4187-bb4f-3f900256ea49-operator-scripts\") pod \"glance-db-create-c4zmd\" (UID: \"1c41a20c-5a07-4187-bb4f-3f900256ea49\") " pod="openstack/glance-db-create-c4zmd" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.785248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf2ft\" (UniqueName: \"kubernetes.io/projected/b624f29e-b759-4767-ab76-de4d94d4e2af-kube-api-access-mf2ft\") pod \"glance-4a60-account-create-qgm25\" (UID: \"b624f29e-b759-4767-ab76-de4d94d4e2af\") " pod="openstack/glance-4a60-account-create-qgm25" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.785280 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b624f29e-b759-4767-ab76-de4d94d4e2af-operator-scripts\") pod \"glance-4a60-account-create-qgm25\" (UID: \"b624f29e-b759-4767-ab76-de4d94d4e2af\") " pod="openstack/glance-4a60-account-create-qgm25" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.786436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c41a20c-5a07-4187-bb4f-3f900256ea49-operator-scripts\") pod \"glance-db-create-c4zmd\" (UID: \"1c41a20c-5a07-4187-bb4f-3f900256ea49\") " pod="openstack/glance-db-create-c4zmd" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.804326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4wqb4\" (UniqueName: \"kubernetes.io/projected/1c41a20c-5a07-4187-bb4f-3f900256ea49-kube-api-access-4wqb4\") pod \"glance-db-create-c4zmd\" (UID: \"1c41a20c-5a07-4187-bb4f-3f900256ea49\") " pod="openstack/glance-db-create-c4zmd" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.886728 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf2ft\" (UniqueName: \"kubernetes.io/projected/b624f29e-b759-4767-ab76-de4d94d4e2af-kube-api-access-mf2ft\") pod \"glance-4a60-account-create-qgm25\" (UID: \"b624f29e-b759-4767-ab76-de4d94d4e2af\") " pod="openstack/glance-4a60-account-create-qgm25" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.886784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b624f29e-b759-4767-ab76-de4d94d4e2af-operator-scripts\") pod \"glance-4a60-account-create-qgm25\" (UID: \"b624f29e-b759-4767-ab76-de4d94d4e2af\") " pod="openstack/glance-4a60-account-create-qgm25" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.887824 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b624f29e-b759-4767-ab76-de4d94d4e2af-operator-scripts\") pod \"glance-4a60-account-create-qgm25\" (UID: \"b624f29e-b759-4767-ab76-de4d94d4e2af\") " pod="openstack/glance-4a60-account-create-qgm25" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.903497 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf2ft\" (UniqueName: \"kubernetes.io/projected/b624f29e-b759-4767-ab76-de4d94d4e2af-kube-api-access-mf2ft\") pod \"glance-4a60-account-create-qgm25\" (UID: \"b624f29e-b759-4767-ab76-de4d94d4e2af\") " pod="openstack/glance-4a60-account-create-qgm25" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.932720 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f92e-account-create-f4xn7"] Nov 22 08:41:13 crc kubenswrapper[4743]: W1122 08:41:13.935455 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1520cff6_cefe_47d7_bce3_1c80dd5eb3dc.slice/crio-dd617132a5d1b71dbaa1b1ab4e5dfbc7eb3f36a6caecdf9ab31ebde4ccffce4d WatchSource:0}: Error finding container dd617132a5d1b71dbaa1b1ab4e5dfbc7eb3f36a6caecdf9ab31ebde4ccffce4d: Status 404 returned error can't find the container with id dd617132a5d1b71dbaa1b1ab4e5dfbc7eb3f36a6caecdf9ab31ebde4ccffce4d Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.972192 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c4zmd" Nov 22 08:41:13 crc kubenswrapper[4743]: I1122 08:41:13.994939 4743 util.go:30] "No sandbox for pod can be found. 
Nov 22 08:41:14 crc kubenswrapper[4743]: I1122 08:41:14.043374 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-s6m4s"]
Nov 22 08:41:14 crc kubenswrapper[4743]: W1122 08:41:14.067966 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod131c329f_8c7c_4d30_a0c3_37ecaac9db82.slice/crio-fff2da8cab31ecddc082767563ee422c0012d9ffb39279b8846f2b0e14463d9d WatchSource:0}: Error finding container fff2da8cab31ecddc082767563ee422c0012d9ffb39279b8846f2b0e14463d9d: Status 404 returned error can't find the container with id fff2da8cab31ecddc082767563ee422c0012d9ffb39279b8846f2b0e14463d9d
Nov 22 08:41:14 crc kubenswrapper[4743]: I1122 08:41:14.177468 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-s4q44"]
Nov 22 08:41:14 crc kubenswrapper[4743]: I1122 08:41:14.251195 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-0984-account-create-dwgfw"]
Nov 22 08:41:14 crc kubenswrapper[4743]: I1122 08:41:14.368150 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s4q44" event={"ID":"054d2889-0839-4b71-9515-904051c64bc7","Type":"ContainerStarted","Data":"2eeb64283a27623371e30acc80c02ba03f8d572295cc7a0cd20c3c9703ed75ea"}
Nov 22 08:41:14 crc kubenswrapper[4743]: I1122 08:41:14.369392 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s6m4s" event={"ID":"131c329f-8c7c-4d30-a0c3-37ecaac9db82","Type":"ContainerStarted","Data":"fff2da8cab31ecddc082767563ee422c0012d9ffb39279b8846f2b0e14463d9d"}
Nov 22 08:41:14 crc kubenswrapper[4743]: I1122 08:41:14.370449 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f92e-account-create-f4xn7" event={"ID":"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc","Type":"ContainerStarted","Data":"dd617132a5d1b71dbaa1b1ab4e5dfbc7eb3f36a6caecdf9ab31ebde4ccffce4d"}
Nov 22 08:41:14 crc kubenswrapper[4743]: I1122 08:41:14.371958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0984-account-create-dwgfw" event={"ID":"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82","Type":"ContainerStarted","Data":"dd0948232975d679bb871965ced306462518e535c720ba64fdcca8471ff5a724"}
Nov 22 08:41:14 crc kubenswrapper[4743]: I1122 08:41:14.446612 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-c4zmd"]
Nov 22 08:41:14 crc kubenswrapper[4743]: W1122 08:41:14.447163 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c41a20c_5a07_4187_bb4f_3f900256ea49.slice/crio-91d1ec09f01d19296278d848798d97e40c509b30e5b6384d7333b529a88b96ee WatchSource:0}: Error finding container 91d1ec09f01d19296278d848798d97e40c509b30e5b6384d7333b529a88b96ee: Status 404 returned error can't find the container with id 91d1ec09f01d19296278d848798d97e40c509b30e5b6384d7333b529a88b96ee
Nov 22 08:41:14 crc kubenswrapper[4743]: I1122 08:41:14.505088 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-4a60-account-create-qgm25"]
Nov 22 08:41:14 crc kubenswrapper[4743]: W1122 08:41:14.510439 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb624f29e_b759_4767_ab76_de4d94d4e2af.slice/crio-5a50b35064e1fa54edd81018534b2457747ae974fd690092788ec33ec5dfb5c6 WatchSource:0}: Error finding container 5a50b35064e1fa54edd81018534b2457747ae974fd690092788ec33ec5dfb5c6: Status 404 returned error can't find the container with id 5a50b35064e1fa54edd81018534b2457747ae974fd690092788ec33ec5dfb5c6
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.381160 4743 generic.go:334] "Generic (PLEG): container finished" podID="b624f29e-b759-4767-ab76-de4d94d4e2af" containerID="65b882681867eb01a06888b0955cb4a26c937522146e082819e8320ea62fe3d5" exitCode=0
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.381249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4a60-account-create-qgm25" event={"ID":"b624f29e-b759-4767-ab76-de4d94d4e2af","Type":"ContainerDied","Data":"65b882681867eb01a06888b0955cb4a26c937522146e082819e8320ea62fe3d5"}
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.381499 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4a60-account-create-qgm25" event={"ID":"b624f29e-b759-4767-ab76-de4d94d4e2af","Type":"ContainerStarted","Data":"5a50b35064e1fa54edd81018534b2457747ae974fd690092788ec33ec5dfb5c6"}
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.383787 4743 generic.go:334] "Generic (PLEG): container finished" podID="054d2889-0839-4b71-9515-904051c64bc7" containerID="a9937bd5c1410e351ab5ec4ba137b45596917cb622464a85d33d573baae61398" exitCode=0
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.384140 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s4q44" event={"ID":"054d2889-0839-4b71-9515-904051c64bc7","Type":"ContainerDied","Data":"a9937bd5c1410e351ab5ec4ba137b45596917cb622464a85d33d573baae61398"}
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.394643 4743 generic.go:334] "Generic (PLEG): container finished" podID="131c329f-8c7c-4d30-a0c3-37ecaac9db82" containerID="59bb33e985a8fb36c339adbe9885bf0826891ad476e2ede03cf3c29fdf19037b" exitCode=0
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.394742 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s6m4s" event={"ID":"131c329f-8c7c-4d30-a0c3-37ecaac9db82","Type":"ContainerDied","Data":"59bb33e985a8fb36c339adbe9885bf0826891ad476e2ede03cf3c29fdf19037b"}
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.396225 4743 generic.go:334] "Generic (PLEG): container finished" podID="1520cff6-cefe-47d7-bce3-1c80dd5eb3dc" containerID="1fbbbdfbb859c9da828f408196fc0aa8d1393484e6831a83bb031fda26468ddc" exitCode=0
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.396304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f92e-account-create-f4xn7" event={"ID":"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc","Type":"ContainerDied","Data":"1fbbbdfbb859c9da828f408196fc0aa8d1393484e6831a83bb031fda26468ddc"}
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.403161 4743 generic.go:334] "Generic (PLEG): container finished" podID="4b3ae5bd-983a-4ef5-95a6-52f6db24ac82" containerID="570cc25ff76bbe41a98fb2a034cfe25b67f71d87b0c7e1df88da843ee55eeb93" exitCode=0
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.403281 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0984-account-create-dwgfw" event={"ID":"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82","Type":"ContainerDied","Data":"570cc25ff76bbe41a98fb2a034cfe25b67f71d87b0c7e1df88da843ee55eeb93"}
Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.406815 4743 generic.go:334] "Generic (PLEG): container finished" podID="1c41a20c-5a07-4187-bb4f-3f900256ea49" containerID="9d3af77ffcdc657327ec7c448b5a0a750fd3a193887ca244cc6a7b3498d8993a" exitCode=0
podID="1c41a20c-5a07-4187-bb4f-3f900256ea49" containerID="9d3af77ffcdc657327ec7c448b5a0a750fd3a193887ca244cc6a7b3498d8993a" exitCode=0 Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.406871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c4zmd" event={"ID":"1c41a20c-5a07-4187-bb4f-3f900256ea49","Type":"ContainerDied","Data":"9d3af77ffcdc657327ec7c448b5a0a750fd3a193887ca244cc6a7b3498d8993a"} Nov 22 08:41:15 crc kubenswrapper[4743]: I1122 08:41:15.406907 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c4zmd" event={"ID":"1c41a20c-5a07-4187-bb4f-3f900256ea49","Type":"ContainerStarted","Data":"91d1ec09f01d19296278d848798d97e40c509b30e5b6384d7333b529a88b96ee"} Nov 22 08:41:16 crc kubenswrapper[4743]: I1122 08:41:16.774338 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s4q44" Nov 22 08:41:16 crc kubenswrapper[4743]: I1122 08:41:16.941103 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62zhb\" (UniqueName: \"kubernetes.io/projected/054d2889-0839-4b71-9515-904051c64bc7-kube-api-access-62zhb\") pod \"054d2889-0839-4b71-9515-904051c64bc7\" (UID: \"054d2889-0839-4b71-9515-904051c64bc7\") " Nov 22 08:41:16 crc kubenswrapper[4743]: I1122 08:41:16.941215 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/054d2889-0839-4b71-9515-904051c64bc7-operator-scripts\") pod \"054d2889-0839-4b71-9515-904051c64bc7\" (UID: \"054d2889-0839-4b71-9515-904051c64bc7\") " Nov 22 08:41:16 crc kubenswrapper[4743]: I1122 08:41:16.942488 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/054d2889-0839-4b71-9515-904051c64bc7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "054d2889-0839-4b71-9515-904051c64bc7" (UID: "054d2889-0839-4b71-9515-904051c64bc7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:16 crc kubenswrapper[4743]: I1122 08:41:16.947640 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/054d2889-0839-4b71-9515-904051c64bc7-kube-api-access-62zhb" (OuterVolumeSpecName: "kube-api-access-62zhb") pod "054d2889-0839-4b71-9515-904051c64bc7" (UID: "054d2889-0839-4b71-9515-904051c64bc7"). InnerVolumeSpecName "kube-api-access-62zhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.007466 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c4zmd" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.013296 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-s6m4s" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.021957 4743 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.043015 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62zhb\" (UniqueName: \"kubernetes.io/projected/054d2889-0839-4b71-9515-904051c64bc7-kube-api-access-62zhb\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.043049 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/054d2889-0839-4b71-9515-904051c64bc7-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.045988 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f92e-account-create-f4xn7"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.051041 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0984-account-create-dwgfw"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144129 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c41a20c-5a07-4187-bb4f-3f900256ea49-operator-scripts\") pod \"1c41a20c-5a07-4187-bb4f-3f900256ea49\" (UID: \"1c41a20c-5a07-4187-bb4f-3f900256ea49\") "
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144176 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw5pt\" (UniqueName: \"kubernetes.io/projected/131c329f-8c7c-4d30-a0c3-37ecaac9db82-kube-api-access-pw5pt\") pod \"131c329f-8c7c-4d30-a0c3-37ecaac9db82\" (UID: \"131c329f-8c7c-4d30-a0c3-37ecaac9db82\") "
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144278 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-operator-scripts\") pod \"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc\" (UID: \"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc\") "
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144314 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wqb4\" (UniqueName: \"kubernetes.io/projected/1c41a20c-5a07-4187-bb4f-3f900256ea49-kube-api-access-4wqb4\") pod \"1c41a20c-5a07-4187-bb4f-3f900256ea49\" (UID: \"1c41a20c-5a07-4187-bb4f-3f900256ea49\") "
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144370 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gbqd\" (UniqueName: \"kubernetes.io/projected/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-kube-api-access-7gbqd\") pod \"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc\" (UID: \"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc\") "
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144388 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131c329f-8c7c-4d30-a0c3-37ecaac9db82-operator-scripts\") pod \"131c329f-8c7c-4d30-a0c3-37ecaac9db82\" (UID: \"131c329f-8c7c-4d30-a0c3-37ecaac9db82\") "
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144477 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b624f29e-b759-4767-ab76-de4d94d4e2af-operator-scripts\") pod \"b624f29e-b759-4767-ab76-de4d94d4e2af\" (UID: \"b624f29e-b759-4767-ab76-de4d94d4e2af\") "
" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf2ft\" (UniqueName: \"kubernetes.io/projected/b624f29e-b759-4767-ab76-de4d94d4e2af-kube-api-access-mf2ft\") pod \"b624f29e-b759-4767-ab76-de4d94d4e2af\" (UID: \"b624f29e-b759-4767-ab76-de4d94d4e2af\") " Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144525 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdzhp\" (UniqueName: \"kubernetes.io/projected/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-kube-api-access-mdzhp\") pod \"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82\" (UID: \"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82\") " Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144553 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-operator-scripts\") pod \"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82\" (UID: \"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82\") " Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.144884 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c41a20c-5a07-4187-bb4f-3f900256ea49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c41a20c-5a07-4187-bb4f-3f900256ea49" (UID: "1c41a20c-5a07-4187-bb4f-3f900256ea49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.145314 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1520cff6-cefe-47d7-bce3-1c80dd5eb3dc" (UID: "1520cff6-cefe-47d7-bce3-1c80dd5eb3dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.145634 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b3ae5bd-983a-4ef5-95a6-52f6db24ac82" (UID: "4b3ae5bd-983a-4ef5-95a6-52f6db24ac82"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.145696 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b624f29e-b759-4767-ab76-de4d94d4e2af-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b624f29e-b759-4767-ab76-de4d94d4e2af" (UID: "b624f29e-b759-4767-ab76-de4d94d4e2af"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.146146 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131c329f-8c7c-4d30-a0c3-37ecaac9db82-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "131c329f-8c7c-4d30-a0c3-37ecaac9db82" (UID: "131c329f-8c7c-4d30-a0c3-37ecaac9db82"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.147068 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131c329f-8c7c-4d30-a0c3-37ecaac9db82-kube-api-access-pw5pt" (OuterVolumeSpecName: "kube-api-access-pw5pt") pod "131c329f-8c7c-4d30-a0c3-37ecaac9db82" (UID: "131c329f-8c7c-4d30-a0c3-37ecaac9db82"). InnerVolumeSpecName "kube-api-access-pw5pt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.148060 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-kube-api-access-7gbqd" (OuterVolumeSpecName: "kube-api-access-7gbqd") pod "1520cff6-cefe-47d7-bce3-1c80dd5eb3dc" (UID: "1520cff6-cefe-47d7-bce3-1c80dd5eb3dc"). InnerVolumeSpecName "kube-api-access-7gbqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.149036 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c41a20c-5a07-4187-bb4f-3f900256ea49-kube-api-access-4wqb4" (OuterVolumeSpecName: "kube-api-access-4wqb4") pod "1c41a20c-5a07-4187-bb4f-3f900256ea49" (UID: "1c41a20c-5a07-4187-bb4f-3f900256ea49"). InnerVolumeSpecName "kube-api-access-4wqb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.149049 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b624f29e-b759-4767-ab76-de4d94d4e2af-kube-api-access-mf2ft" (OuterVolumeSpecName: "kube-api-access-mf2ft") pod "b624f29e-b759-4767-ab76-de4d94d4e2af" (UID: "b624f29e-b759-4767-ab76-de4d94d4e2af"). InnerVolumeSpecName "kube-api-access-mf2ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.177755 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-kube-api-access-mdzhp" (OuterVolumeSpecName: "kube-api-access-mdzhp") pod "4b3ae5bd-983a-4ef5-95a6-52f6db24ac82" (UID: "4b3ae5bd-983a-4ef5-95a6-52f6db24ac82"). InnerVolumeSpecName "kube-api-access-mdzhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.247405 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.247461 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wqb4\" (UniqueName: \"kubernetes.io/projected/1c41a20c-5a07-4187-bb4f-3f900256ea49-kube-api-access-4wqb4\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.247477 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gbqd\" (UniqueName: \"kubernetes.io/projected/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc-kube-api-access-7gbqd\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.247488 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/131c329f-8c7c-4d30-a0c3-37ecaac9db82-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.247498 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b624f29e-b759-4767-ab76-de4d94d4e2af-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.247508 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf2ft\" (UniqueName: \"kubernetes.io/projected/b624f29e-b759-4767-ab76-de4d94d4e2af-kube-api-access-mf2ft\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.247535 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdzhp\" (UniqueName: \"kubernetes.io/projected/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-kube-api-access-mdzhp\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.247545 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.247557 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c41a20c-5a07-4187-bb4f-3f900256ea49-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.247566 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw5pt\" (UniqueName: \"kubernetes.io/projected/131c329f-8c7c-4d30-a0c3-37ecaac9db82-kube-api-access-pw5pt\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.429095 4743 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.429107 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-s6m4s" event={"ID":"131c329f-8c7c-4d30-a0c3-37ecaac9db82","Type":"ContainerDied","Data":"fff2da8cab31ecddc082767563ee422c0012d9ffb39279b8846f2b0e14463d9d"}
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.429197 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fff2da8cab31ecddc082767563ee422c0012d9ffb39279b8846f2b0e14463d9d"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.431521 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f92e-account-create-f4xn7" event={"ID":"1520cff6-cefe-47d7-bce3-1c80dd5eb3dc","Type":"ContainerDied","Data":"dd617132a5d1b71dbaa1b1ab4e5dfbc7eb3f36a6caecdf9ab31ebde4ccffce4d"}
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.431558 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd617132a5d1b71dbaa1b1ab4e5dfbc7eb3f36a6caecdf9ab31ebde4ccffce4d"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.431540 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f92e-account-create-f4xn7"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.433133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-0984-account-create-dwgfw" event={"ID":"4b3ae5bd-983a-4ef5-95a6-52f6db24ac82","Type":"ContainerDied","Data":"dd0948232975d679bb871965ced306462518e535c720ba64fdcca8471ff5a724"}
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.433151 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-0984-account-create-dwgfw"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.433159 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd0948232975d679bb871965ced306462518e535c720ba64fdcca8471ff5a724"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.434932 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c4zmd" event={"ID":"1c41a20c-5a07-4187-bb4f-3f900256ea49","Type":"ContainerDied","Data":"91d1ec09f01d19296278d848798d97e40c509b30e5b6384d7333b529a88b96ee"}
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.434966 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91d1ec09f01d19296278d848798d97e40c509b30e5b6384d7333b529a88b96ee"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.434939 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c4zmd"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.436324 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-4a60-account-create-qgm25"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.436344 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-4a60-account-create-qgm25" event={"ID":"b624f29e-b759-4767-ab76-de4d94d4e2af","Type":"ContainerDied","Data":"5a50b35064e1fa54edd81018534b2457747ae974fd690092788ec33ec5dfb5c6"}
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.436544 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a50b35064e1fa54edd81018534b2457747ae974fd690092788ec33ec5dfb5c6"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.440240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-s4q44" event={"ID":"054d2889-0839-4b71-9515-904051c64bc7","Type":"ContainerDied","Data":"2eeb64283a27623371e30acc80c02ba03f8d572295cc7a0cd20c3c9703ed75ea"}
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.440269 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eeb64283a27623371e30acc80c02ba03f8d572295cc7a0cd20c3c9703ed75ea"
Nov 22 08:41:17 crc kubenswrapper[4743]: I1122 08:41:17.440315 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-s4q44"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.138276 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-6t9hh" podUID="5db10427-8546-4dea-b849-36bb02c837bd" containerName="ovn-controller" probeResult="failure" output=<
Nov 22 08:41:18 crc kubenswrapper[4743]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Nov 22 08:41:18 crc kubenswrapper[4743]: >
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.198668 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mz9kc"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.202629 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-mz9kc"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.415790 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6t9hh-config-z48cb"]
Nov 22 08:41:18 crc kubenswrapper[4743]: E1122 08:41:18.416225 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1520cff6-cefe-47d7-bce3-1c80dd5eb3dc" containerName="mariadb-account-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416250 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1520cff6-cefe-47d7-bce3-1c80dd5eb3dc" containerName="mariadb-account-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: E1122 08:41:18.416271 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131c329f-8c7c-4d30-a0c3-37ecaac9db82" containerName="mariadb-database-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416290 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="131c329f-8c7c-4d30-a0c3-37ecaac9db82" containerName="mariadb-database-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: E1122 08:41:18.416305 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="054d2889-0839-4b71-9515-904051c64bc7" containerName="mariadb-database-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416313 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="054d2889-0839-4b71-9515-904051c64bc7" containerName="mariadb-database-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: E1122 08:41:18.416328 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c41a20c-5a07-4187-bb4f-3f900256ea49" containerName="mariadb-database-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416335 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c41a20c-5a07-4187-bb4f-3f900256ea49" containerName="mariadb-database-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: E1122 08:41:18.416345 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b3ae5bd-983a-4ef5-95a6-52f6db24ac82" containerName="mariadb-account-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416353 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b3ae5bd-983a-4ef5-95a6-52f6db24ac82" containerName="mariadb-account-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: E1122 08:41:18.416370 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b624f29e-b759-4767-ab76-de4d94d4e2af" containerName="mariadb-account-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416377 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b624f29e-b759-4767-ab76-de4d94d4e2af" containerName="mariadb-account-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416609 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c41a20c-5a07-4187-bb4f-3f900256ea49" containerName="mariadb-database-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416641 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="054d2889-0839-4b71-9515-904051c64bc7" containerName="mariadb-database-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416672 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b3ae5bd-983a-4ef5-95a6-52f6db24ac82" containerName="mariadb-account-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416692 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="131c329f-8c7c-4d30-a0c3-37ecaac9db82" containerName="mariadb-database-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416706 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1520cff6-cefe-47d7-bce3-1c80dd5eb3dc" containerName="mariadb-account-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.416728 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b624f29e-b759-4767-ab76-de4d94d4e2af" containerName="mariadb-account-create"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.417366 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6t9hh-config-z48cb"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.419352 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.434869 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6t9hh-config-z48cb"]
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.576913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99fwx\" (UniqueName: \"kubernetes.io/projected/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-kube-api-access-99fwx\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.577015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-scripts\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.577166 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-log-ovn\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.577213 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.577242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-additional-scripts\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.577442 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run-ovn\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.679732 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-log-ovn\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb"
Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.679789 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb"
\"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.679843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-additional-scripts\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.679938 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run-ovn\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.679991 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99fwx\" (UniqueName: \"kubernetes.io/projected/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-kube-api-access-99fwx\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.680040 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-scripts\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.680284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-log-ovn\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.680339 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run-ovn\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.680343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.680662 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-additional-scripts\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.683201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-scripts\") pod 
\"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.703473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fwx\" (UniqueName: \"kubernetes.io/projected/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-kube-api-access-99fwx\") pod \"ovn-controller-6t9hh-config-z48cb\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.734212 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.840928 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-92fnd"] Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.843230 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-92fnd" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.846530 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.846681 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5qcvv" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.856417 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-92fnd"] Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.986252 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkqzm\" (UniqueName: \"kubernetes.io/projected/142d1e8a-9aac-4c34-9301-1e069919fe82-kube-api-access-fkqzm\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.986317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-combined-ca-bundle\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.986528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-config-data\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd" Nov 22 08:41:18 crc kubenswrapper[4743]: I1122 08:41:18.986611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-db-sync-config-data\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd" Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.090296 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkqzm\" (UniqueName: \"kubernetes.io/projected/142d1e8a-9aac-4c34-9301-1e069919fe82-kube-api-access-fkqzm\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd" Nov 22 08:41:19 crc 
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.090429 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-config-data\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd"
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.090458 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-db-sync-config-data\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd"
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.095274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-db-sync-config-data\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd"
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.095512 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-combined-ca-bundle\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd"
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.098303 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-config-data\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd"
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.108572 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkqzm\" (UniqueName: \"kubernetes.io/projected/142d1e8a-9aac-4c34-9301-1e069919fe82-kube-api-access-fkqzm\") pod \"glance-db-sync-92fnd\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " pod="openstack/glance-db-sync-92fnd"
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.178678 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-92fnd"
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.213246 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6t9hh-config-z48cb"]
Nov 22 08:41:19 crc kubenswrapper[4743]: W1122 08:41:19.243397 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0a7b5ad_3d0f_49d2_8528_e2a4c5fb457b.slice/crio-2f0db82d7fadbe5632515d6717b0c92879a7bfdc1f1f3a150b95e13c83d9ad12 WatchSource:0}: Error finding container 2f0db82d7fadbe5632515d6717b0c92879a7bfdc1f1f3a150b95e13c83d9ad12: Status 404 returned error can't find the container with id 2f0db82d7fadbe5632515d6717b0c92879a7bfdc1f1f3a150b95e13c83d9ad12
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.475907 4743 generic.go:334] "Generic (PLEG): container finished" podID="90e84b78-308d-41c1-b9a7-5d0c4cb80d44" containerID="c2cd99ce8d0b171935d2ad05dc8e366d1a81b0cd4e4bf014f079a93f8d17c5ad" exitCode=0
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.475995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9fxgn" event={"ID":"90e84b78-308d-41c1-b9a7-5d0c4cb80d44","Type":"ContainerDied","Data":"c2cd99ce8d0b171935d2ad05dc8e366d1a81b0cd4e4bf014f079a93f8d17c5ad"}
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.477117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh-config-z48cb" event={"ID":"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b","Type":"ContainerStarted","Data":"2f0db82d7fadbe5632515d6717b0c92879a7bfdc1f1f3a150b95e13c83d9ad12"}
Nov 22 08:41:19 crc kubenswrapper[4743]: I1122 08:41:19.758054 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-92fnd"]
Nov 22 08:41:19 crc kubenswrapper[4743]: W1122 08:41:19.759202 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod142d1e8a_9aac_4c34_9301_1e069919fe82.slice/crio-ef71d6ca813760641bc61f2304845f56771f83ac11216487448131c8b61e61bc WatchSource:0}: Error finding container ef71d6ca813760641bc61f2304845f56771f83ac11216487448131c8b61e61bc: Status 404 returned error can't find the container with id ef71d6ca813760641bc61f2304845f56771f83ac11216487448131c8b61e61bc
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.485001 4743 generic.go:334] "Generic (PLEG): container finished" podID="e5fac46a-545d-4f30-a7ab-8f5e713e934d" containerID="f84c516977fabf4d12664420973d6cd6aad1dffc3d9d6296c1edb5fc3318472d" exitCode=0
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.485071 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5fac46a-545d-4f30-a7ab-8f5e713e934d","Type":"ContainerDied","Data":"f84c516977fabf4d12664420973d6cd6aad1dffc3d9d6296c1edb5fc3318472d"}
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.488087 4743 generic.go:334] "Generic (PLEG): container finished" podID="c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" containerID="a9167b92a9dde989a74ae857841a3d9207fe61f9c8d20e0a6521ebbba621df1f" exitCode=0
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.488177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh-config-z48cb" event={"ID":"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b","Type":"ContainerDied","Data":"a9167b92a9dde989a74ae857841a3d9207fe61f9c8d20e0a6521ebbba621df1f"}
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.489879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-92fnd" event={"ID":"142d1e8a-9aac-4c34-9301-1e069919fe82","Type":"ContainerStarted","Data":"ef71d6ca813760641bc61f2304845f56771f83ac11216487448131c8b61e61bc"}
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-92fnd" event={"ID":"142d1e8a-9aac-4c34-9301-1e069919fe82","Type":"ContainerStarted","Data":"ef71d6ca813760641bc61f2304845f56771f83ac11216487448131c8b61e61bc"} Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.504043 4743 generic.go:334] "Generic (PLEG): container finished" podID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" containerID="9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27" exitCode=0 Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.504177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1","Type":"ContainerDied","Data":"9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27"} Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.866408 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.946959 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-dispersionconf\") pod \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.947004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-ring-data-devices\") pod \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.947152 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-scripts\") pod \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.947176 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-combined-ca-bundle\") pod \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.947222 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-swiftconf\") pod \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.947245 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-etc-swift\") pod \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.947301 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-674z7\" (UniqueName: \"kubernetes.io/projected/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-kube-api-access-674z7\") pod \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\" (UID: \"90e84b78-308d-41c1-b9a7-5d0c4cb80d44\") " Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.949073 4743 operation_generator.go:803] 
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.949454 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "90e84b78-308d-41c1-b9a7-5d0c4cb80d44" (UID: "90e84b78-308d-41c1-b9a7-5d0c4cb80d44"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.952458 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-kube-api-access-674z7" (OuterVolumeSpecName: "kube-api-access-674z7") pod "90e84b78-308d-41c1-b9a7-5d0c4cb80d44" (UID: "90e84b78-308d-41c1-b9a7-5d0c4cb80d44"). InnerVolumeSpecName "kube-api-access-674z7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.956057 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "90e84b78-308d-41c1-b9a7-5d0c4cb80d44" (UID: "90e84b78-308d-41c1-b9a7-5d0c4cb80d44"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.972470 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-scripts" (OuterVolumeSpecName: "scripts") pod "90e84b78-308d-41c1-b9a7-5d0c4cb80d44" (UID: "90e84b78-308d-41c1-b9a7-5d0c4cb80d44"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.977421 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.980744 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90e84b78-308d-41c1-b9a7-5d0c4cb80d44" (UID: "90e84b78-308d-41c1-b9a7-5d0c4cb80d44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:41:20 crc kubenswrapper[4743]: I1122 08:41:20.988768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "90e84b78-308d-41c1-b9a7-5d0c4cb80d44" (UID: "90e84b78-308d-41c1-b9a7-5d0c4cb80d44"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.049200 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-674z7\" (UniqueName: \"kubernetes.io/projected/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-kube-api-access-674z7\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.049232 4743 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-dispersionconf\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.049245 4743 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-ring-data-devices\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.049254 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.049261 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.049269 4743 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-swiftconf\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.049278 4743 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/90e84b78-308d-41c1-b9a7-5d0c4cb80d44-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.514393 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5fac46a-545d-4f30-a7ab-8f5e713e934d","Type":"ContainerStarted","Data":"6e1b913f0b8534fa70afd00b409ba87dcd773b31786f2f1c5518bc5e04e427a8"} Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.515329 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.519775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1","Type":"ContainerStarted","Data":"59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5"} Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.519984 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.521546 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-9fxgn" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.522811 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-9fxgn" event={"ID":"90e84b78-308d-41c1-b9a7-5d0c4cb80d44","Type":"ContainerDied","Data":"e1b1ca8aaa8d0ef9c46ba07f72ac7795172d376aeec9ed1cdd92ddab8ab6253b"} Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.522837 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b1ca8aaa8d0ef9c46ba07f72ac7795172d376aeec9ed1cdd92ddab8ab6253b" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.546440 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=52.845980155 podStartE2EDuration="1m3.546421974s" podCreationTimestamp="2025-11-22 08:40:18 +0000 UTC" firstStartedPulling="2025-11-22 08:40:34.892477427 +0000 UTC m=+1108.598838469" lastFinishedPulling="2025-11-22 08:40:45.592919236 +0000 UTC m=+1119.299280288" observedRunningTime="2025-11-22 08:41:21.54280833 +0000 UTC m=+1155.249169402" watchObservedRunningTime="2025-11-22 08:41:21.546421974 +0000 UTC m=+1155.252783016" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.589414 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.919642093 podStartE2EDuration="1m3.589399074s" podCreationTimestamp="2025-11-22 08:40:18 +0000 UTC" firstStartedPulling="2025-11-22 08:40:34.891958112 +0000 UTC m=+1108.598319164" lastFinishedPulling="2025-11-22 08:40:44.561715103 +0000 UTC m=+1118.268076145" observedRunningTime="2025-11-22 08:41:21.570266562 +0000 UTC m=+1155.276627614" watchObservedRunningTime="2025-11-22 08:41:21.589399074 +0000 UTC m=+1155.295760126" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.856868 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.963905 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99fwx\" (UniqueName: \"kubernetes.io/projected/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-kube-api-access-99fwx\") pod \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.963952 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-scripts\") pod \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.963995 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-additional-scripts\") pod \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.964113 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run\") pod \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.964132 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-log-ovn\") pod \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.964159 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run-ovn\") pod \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\" (UID: \"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b\") " Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.964626 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" (UID: "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.964662 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run" (OuterVolumeSpecName: "var-run") pod "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" (UID: "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.964701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" (UID: "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.965215 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" (UID: "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.965809 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-scripts" (OuterVolumeSpecName: "scripts") pod "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" (UID: "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:21 crc kubenswrapper[4743]: I1122 08:41:21.985812 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-kube-api-access-99fwx" (OuterVolumeSpecName: "kube-api-access-99fwx") pod "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" (UID: "c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b"). InnerVolumeSpecName "kube-api-access-99fwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:41:22 crc kubenswrapper[4743]: I1122 08:41:22.066583 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99fwx\" (UniqueName: \"kubernetes.io/projected/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-kube-api-access-99fwx\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:22 crc kubenswrapper[4743]: I1122 08:41:22.066635 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:22 crc kubenswrapper[4743]: I1122 08:41:22.066648 4743 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:22 crc kubenswrapper[4743]: I1122 08:41:22.066659 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:22 crc kubenswrapper[4743]: I1122 08:41:22.066670 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:22 crc kubenswrapper[4743]: I1122 08:41:22.066680 4743 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:22 crc kubenswrapper[4743]: I1122 08:41:22.529973 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh-config-z48cb" event={"ID":"c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b","Type":"ContainerDied","Data":"2f0db82d7fadbe5632515d6717b0c92879a7bfdc1f1f3a150b95e13c83d9ad12"} Nov 22 08:41:22 crc kubenswrapper[4743]: I1122 08:41:22.530347 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f0db82d7fadbe5632515d6717b0c92879a7bfdc1f1f3a150b95e13c83d9ad12" Nov 22 08:41:22 crc 
kubenswrapper[4743]: I1122 08:41:22.530146 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6t9hh-config-z48cb" Nov 22 08:41:22 crc kubenswrapper[4743]: I1122 08:41:22.965293 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6t9hh-config-z48cb"] Nov 22 08:41:22 crc kubenswrapper[4743]: I1122 08:41:22.972387 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6t9hh-config-z48cb"] Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.078782 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-6t9hh-config-z45qv"] Nov 22 08:41:23 crc kubenswrapper[4743]: E1122 08:41:23.079132 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" containerName="ovn-config" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.079155 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" containerName="ovn-config" Nov 22 08:41:23 crc kubenswrapper[4743]: E1122 08:41:23.079166 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e84b78-308d-41c1-b9a7-5d0c4cb80d44" containerName="swift-ring-rebalance" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.079172 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e84b78-308d-41c1-b9a7-5d0c4cb80d44" containerName="swift-ring-rebalance" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.079357 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e84b78-308d-41c1-b9a7-5d0c4cb80d44" containerName="swift-ring-rebalance" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.079384 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" containerName="ovn-config" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.079886 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.081941 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.099844 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6t9hh-config-z45qv"] Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.162632 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b" path="/var/lib/kubelet/pods/c0a7b5ad-3d0f-49d2-8528-e2a4c5fb457b/volumes" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.163272 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-6t9hh" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.186059 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-scripts\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.186142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run-ovn\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.186165 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-log-ovn\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.186374 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.186451 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-additional-scripts\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.186489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9rhv\" (UniqueName: \"kubernetes.io/projected/1dc42754-339b-434e-a3f2-5bd0999a8fb8-kube-api-access-k9rhv\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.237636 4743 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod7f7234b0-750b-4f7d-8ccf-1dde836c5700"] 
err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod7f7234b0-750b-4f7d-8ccf-1dde836c5700] : Timed out while waiting for systemd to remove kubepods-besteffort-pod7f7234b0_750b_4f7d_8ccf_1dde836c5700.slice" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.288195 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-scripts\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.288305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run-ovn\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.288330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-log-ovn\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.288438 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.288488 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-additional-scripts\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.288515 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9rhv\" (UniqueName: \"kubernetes.io/projected/1dc42754-339b-434e-a3f2-5bd0999a8fb8-kube-api-access-k9rhv\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.288712 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run-ovn\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.289274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-additional-scripts\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.289419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.289827 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-log-ovn\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.290663 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-scripts\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.308830 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9rhv\" (UniqueName: \"kubernetes.io/projected/1dc42754-339b-434e-a3f2-5bd0999a8fb8-kube-api-access-k9rhv\") pod \"ovn-controller-6t9hh-config-z45qv\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.399283 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:23 crc kubenswrapper[4743]: I1122 08:41:23.885260 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-6t9hh-config-z45qv"] Nov 22 08:41:23 crc kubenswrapper[4743]: W1122 08:41:23.897820 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dc42754_339b_434e_a3f2_5bd0999a8fb8.slice/crio-957b0f12de2523b9a180ab2edd384d11485560167918650ed332ad4273023fe4 WatchSource:0}: Error finding container 957b0f12de2523b9a180ab2edd384d11485560167918650ed332ad4273023fe4: Status 404 returned error can't find the container with id 957b0f12de2523b9a180ab2edd384d11485560167918650ed332ad4273023fe4 Nov 22 08:41:24 crc kubenswrapper[4743]: I1122 08:41:24.551111 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh-config-z45qv" event={"ID":"1dc42754-339b-434e-a3f2-5bd0999a8fb8","Type":"ContainerStarted","Data":"957b0f12de2523b9a180ab2edd384d11485560167918650ed332ad4273023fe4"} Nov 22 08:41:25 crc kubenswrapper[4743]: I1122 08:41:25.561206 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh-config-z45qv" event={"ID":"1dc42754-339b-434e-a3f2-5bd0999a8fb8","Type":"ContainerStarted","Data":"6d69d74bde61491735c233b8f8505c03c211704fff1bdd9e4f6579ecc321af8e"} Nov 22 08:41:25 crc kubenswrapper[4743]: I1122 08:41:25.585292 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-6t9hh-config-z45qv" podStartSLOduration=2.585279183 podStartE2EDuration="2.585279183s" podCreationTimestamp="2025-11-22 08:41:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:41:25.582940205 +0000 UTC m=+1159.289301267" watchObservedRunningTime="2025-11-22 08:41:25.585279183 +0000 UTC m=+1159.291640225" Nov 22 08:41:26 crc kubenswrapper[4743]: I1122 
08:41:26.569194 4743 generic.go:334] "Generic (PLEG): container finished" podID="1dc42754-339b-434e-a3f2-5bd0999a8fb8" containerID="6d69d74bde61491735c233b8f8505c03c211704fff1bdd9e4f6579ecc321af8e" exitCode=0 Nov 22 08:41:26 crc kubenswrapper[4743]: I1122 08:41:26.569246 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh-config-z45qv" event={"ID":"1dc42754-339b-434e-a3f2-5bd0999a8fb8","Type":"ContainerDied","Data":"6d69d74bde61491735c233b8f8505c03c211704fff1bdd9e4f6579ecc321af8e"} Nov 22 08:41:28 crc kubenswrapper[4743]: I1122 08:41:28.483989 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:41:28 crc kubenswrapper[4743]: I1122 08:41:28.492161 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") pod \"swift-storage-0\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " pod="openstack/swift-storage-0" Nov 22 08:41:28 crc kubenswrapper[4743]: I1122 08:41:28.698677 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 22 08:41:30 crc kubenswrapper[4743]: I1122 08:41:30.352513 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Nov 22 08:41:39 crc kubenswrapper[4743]: I1122 08:41:39.986910 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="e5fac46a-545d-4f30-a7ab-8f5e713e934d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.144523 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:40 crc kubenswrapper[4743]: E1122 08:41:40.165535 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Nov 22 08:41:40 crc kubenswrapper[4743]: E1122 08:41:40.165739 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkqzm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-92fnd_openstack(142d1e8a-9aac-4c34-9301-1e069919fe82): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:41:40 crc kubenswrapper[4743]: E1122 08:41:40.170106 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-92fnd" podUID="142d1e8a-9aac-4c34-9301-1e069919fe82" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.267346 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9rhv\" (UniqueName: \"kubernetes.io/projected/1dc42754-339b-434e-a3f2-5bd0999a8fb8-kube-api-access-k9rhv\") pod \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.267435 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run\") pod \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.267516 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-additional-scripts\") pod \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.267538 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-scripts\") pod \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.267558 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run" (OuterVolumeSpecName: "var-run") pod "1dc42754-339b-434e-a3f2-5bd0999a8fb8" (UID: "1dc42754-339b-434e-a3f2-5bd0999a8fb8"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.267570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run-ovn\") pod \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.267676 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1dc42754-339b-434e-a3f2-5bd0999a8fb8" (UID: "1dc42754-339b-434e-a3f2-5bd0999a8fb8"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.267748 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-log-ovn\") pod \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\" (UID: \"1dc42754-339b-434e-a3f2-5bd0999a8fb8\") " Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.267819 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1dc42754-339b-434e-a3f2-5bd0999a8fb8" (UID: "1dc42754-339b-434e-a3f2-5bd0999a8fb8"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.268153 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.268179 4743 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.268192 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1dc42754-339b-434e-a3f2-5bd0999a8fb8-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.268563 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1dc42754-339b-434e-a3f2-5bd0999a8fb8" (UID: "1dc42754-339b-434e-a3f2-5bd0999a8fb8"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.268807 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-scripts" (OuterVolumeSpecName: "scripts") pod "1dc42754-339b-434e-a3f2-5bd0999a8fb8" (UID: "1dc42754-339b-434e-a3f2-5bd0999a8fb8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.273103 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dc42754-339b-434e-a3f2-5bd0999a8fb8-kube-api-access-k9rhv" (OuterVolumeSpecName: "kube-api-access-k9rhv") pod "1dc42754-339b-434e-a3f2-5bd0999a8fb8" (UID: "1dc42754-339b-434e-a3f2-5bd0999a8fb8"). InnerVolumeSpecName "kube-api-access-k9rhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.350790 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.369715 4743 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.369765 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1dc42754-339b-434e-a3f2-5bd0999a8fb8-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.369779 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9rhv\" (UniqueName: \"kubernetes.io/projected/1dc42754-339b-434e-a3f2-5bd0999a8fb8-kube-api-access-k9rhv\") on node \"crc\" DevicePath \"\"" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.573534 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.710293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"0f90ac6c9f8ede09876ab45a6508579f9dab5b53603c758d13a7dfc4e43bef69"} Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.713095 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-6t9hh-config-z45qv" Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.713097 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh-config-z45qv" event={"ID":"1dc42754-339b-434e-a3f2-5bd0999a8fb8","Type":"ContainerDied","Data":"957b0f12de2523b9a180ab2edd384d11485560167918650ed332ad4273023fe4"} Nov 22 08:41:40 crc kubenswrapper[4743]: I1122 08:41:40.713145 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="957b0f12de2523b9a180ab2edd384d11485560167918650ed332ad4273023fe4" Nov 22 08:41:40 crc kubenswrapper[4743]: E1122 08:41:40.715496 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-92fnd" podUID="142d1e8a-9aac-4c34-9301-1e069919fe82" Nov 22 08:41:41 crc kubenswrapper[4743]: I1122 08:41:41.228825 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6t9hh-config-z45qv"] Nov 22 08:41:41 crc kubenswrapper[4743]: I1122 08:41:41.235704 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6t9hh-config-z45qv"] Nov 22 08:41:43 crc kubenswrapper[4743]: I1122 08:41:43.163042 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dc42754-339b-434e-a3f2-5bd0999a8fb8" path="/var/lib/kubelet/pods/1dc42754-339b-434e-a3f2-5bd0999a8fb8/volumes" Nov 22 08:41:44 crc kubenswrapper[4743]: I1122 08:41:44.758813 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded"} Nov 22 08:41:44 crc kubenswrapper[4743]: I1122 08:41:44.759201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf"} Nov 22 08:41:45 crc kubenswrapper[4743]: I1122 08:41:45.768957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6"} Nov 22 08:41:45 crc kubenswrapper[4743]: I1122 08:41:45.769537 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e"} Nov 22 08:41:48 crc kubenswrapper[4743]: I1122 08:41:48.801215 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b"} Nov 22 08:41:48 crc kubenswrapper[4743]: I1122 08:41:48.802478 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3"} Nov 22 08:41:49 crc kubenswrapper[4743]: I1122 08:41:49.814461 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb"} Nov 22 08:41:49 crc kubenswrapper[4743]: I1122 08:41:49.814864 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4"} Nov 22 08:41:49 crc kubenswrapper[4743]: I1122 08:41:49.986676 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.268559 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-svwz4"] Nov 22 08:41:50 crc kubenswrapper[4743]: E1122 08:41:50.268936 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dc42754-339b-434e-a3f2-5bd0999a8fb8" containerName="ovn-config" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.268951 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dc42754-339b-434e-a3f2-5bd0999a8fb8" containerName="ovn-config" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.269088 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dc42754-339b-434e-a3f2-5bd0999a8fb8" containerName="ovn-config" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.269701 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-svwz4" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.287102 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-svwz4"] Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.352677 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.372436 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-96rh7"] Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.372641 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22dkh\" (UniqueName: \"kubernetes.io/projected/3633999d-a3b2-483d-9ca9-601350b07e59-kube-api-access-22dkh\") pod \"barbican-db-create-svwz4\" (UID: \"3633999d-a3b2-483d-9ca9-601350b07e59\") " pod="openstack/barbican-db-create-svwz4" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.372850 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3633999d-a3b2-483d-9ca9-601350b07e59-operator-scripts\") pod \"barbican-db-create-svwz4\" (UID: \"3633999d-a3b2-483d-9ca9-601350b07e59\") " pod="openstack/barbican-db-create-svwz4" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.373762 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-96rh7" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.385727 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-96rh7"] Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.425725 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-675b-account-create-t7bnk"] Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.428732 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-675b-account-create-t7bnk" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.435350 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.443637 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-675b-account-create-t7bnk"] Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.477160 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvrv9\" (UniqueName: \"kubernetes.io/projected/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-kube-api-access-nvrv9\") pod \"cinder-db-create-96rh7\" (UID: \"2019c268-c5f1-4eff-aa27-6f26c3f37dfa\") " pod="openstack/cinder-db-create-96rh7" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.477256 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-operator-scripts\") pod \"cinder-db-create-96rh7\" (UID: \"2019c268-c5f1-4eff-aa27-6f26c3f37dfa\") " pod="openstack/cinder-db-create-96rh7" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.477295 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvs5m\" (UniqueName: \"kubernetes.io/projected/1bf0a98f-65ac-4997-a98c-fb20ef181219-kube-api-access-fvs5m\") pod \"barbican-675b-account-create-t7bnk\" (UID: \"1bf0a98f-65ac-4997-a98c-fb20ef181219\") " pod="openstack/barbican-675b-account-create-t7bnk" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.477329 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3633999d-a3b2-483d-9ca9-601350b07e59-operator-scripts\") pod \"barbican-db-create-svwz4\" (UID: \"3633999d-a3b2-483d-9ca9-601350b07e59\") " pod="openstack/barbican-db-create-svwz4" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.477378 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22dkh\" (UniqueName: \"kubernetes.io/projected/3633999d-a3b2-483d-9ca9-601350b07e59-kube-api-access-22dkh\") pod \"barbican-db-create-svwz4\" (UID: \"3633999d-a3b2-483d-9ca9-601350b07e59\") " pod="openstack/barbican-db-create-svwz4" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.477551 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf0a98f-65ac-4997-a98c-fb20ef181219-operator-scripts\") pod \"barbican-675b-account-create-t7bnk\" (UID: \"1bf0a98f-65ac-4997-a98c-fb20ef181219\") " pod="openstack/barbican-675b-account-create-t7bnk" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.479056 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3633999d-a3b2-483d-9ca9-601350b07e59-operator-scripts\") pod \"barbican-db-create-svwz4\" (UID: \"3633999d-a3b2-483d-9ca9-601350b07e59\") " pod="openstack/barbican-db-create-svwz4" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.516419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22dkh\" (UniqueName: \"kubernetes.io/projected/3633999d-a3b2-483d-9ca9-601350b07e59-kube-api-access-22dkh\") pod \"barbican-db-create-svwz4\" (UID: 
\"3633999d-a3b2-483d-9ca9-601350b07e59\") " pod="openstack/barbican-db-create-svwz4" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.521949 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-89a7-account-create-g24xd"] Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.527722 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-89a7-account-create-g24xd" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.529543 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.533108 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-89a7-account-create-g24xd"] Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.579901 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf0a98f-65ac-4997-a98c-fb20ef181219-operator-scripts\") pod \"barbican-675b-account-create-t7bnk\" (UID: \"1bf0a98f-65ac-4997-a98c-fb20ef181219\") " pod="openstack/barbican-675b-account-create-t7bnk" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.578636 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf0a98f-65ac-4997-a98c-fb20ef181219-operator-scripts\") pod \"barbican-675b-account-create-t7bnk\" (UID: \"1bf0a98f-65ac-4997-a98c-fb20ef181219\") " pod="openstack/barbican-675b-account-create-t7bnk" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.580259 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvrv9\" (UniqueName: \"kubernetes.io/projected/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-kube-api-access-nvrv9\") pod \"cinder-db-create-96rh7\" (UID: \"2019c268-c5f1-4eff-aa27-6f26c3f37dfa\") " pod="openstack/cinder-db-create-96rh7" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.580364 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sltfk\" (UniqueName: \"kubernetes.io/projected/c685a13e-8100-43c1-a0c4-417a12135281-kube-api-access-sltfk\") pod \"cinder-89a7-account-create-g24xd\" (UID: \"c685a13e-8100-43c1-a0c4-417a12135281\") " pod="openstack/cinder-89a7-account-create-g24xd" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.580483 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-operator-scripts\") pod \"cinder-db-create-96rh7\" (UID: \"2019c268-c5f1-4eff-aa27-6f26c3f37dfa\") " pod="openstack/cinder-db-create-96rh7" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.580610 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvs5m\" (UniqueName: \"kubernetes.io/projected/1bf0a98f-65ac-4997-a98c-fb20ef181219-kube-api-access-fvs5m\") pod \"barbican-675b-account-create-t7bnk\" (UID: \"1bf0a98f-65ac-4997-a98c-fb20ef181219\") " pod="openstack/barbican-675b-account-create-t7bnk" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.580788 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c685a13e-8100-43c1-a0c4-417a12135281-operator-scripts\") pod \"cinder-89a7-account-create-g24xd\" (UID: \"c685a13e-8100-43c1-a0c4-417a12135281\") 
" pod="openstack/cinder-89a7-account-create-g24xd" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.581025 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-operator-scripts\") pod \"cinder-db-create-96rh7\" (UID: \"2019c268-c5f1-4eff-aa27-6f26c3f37dfa\") " pod="openstack/cinder-db-create-96rh7" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.586357 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-svwz4" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.597559 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvrv9\" (UniqueName: \"kubernetes.io/projected/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-kube-api-access-nvrv9\") pod \"cinder-db-create-96rh7\" (UID: \"2019c268-c5f1-4eff-aa27-6f26c3f37dfa\") " pod="openstack/cinder-db-create-96rh7" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.597978 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvs5m\" (UniqueName: \"kubernetes.io/projected/1bf0a98f-65ac-4997-a98c-fb20ef181219-kube-api-access-fvs5m\") pod \"barbican-675b-account-create-t7bnk\" (UID: \"1bf0a98f-65ac-4997-a98c-fb20ef181219\") " pod="openstack/barbican-675b-account-create-t7bnk" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.662125 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fs8kc"] Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.663729 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fs8kc" Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.669553 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jqcdm"] Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.671152 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.675511 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.676013 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.676195 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.678769 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vtjvr"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.681961 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c685a13e-8100-43c1-a0c4-417a12135281-operator-scripts\") pod \"cinder-89a7-account-create-g24xd\" (UID: \"c685a13e-8100-43c1-a0c4-417a12135281\") " pod="openstack/cinder-89a7-account-create-g24xd"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.682039 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sltfk\" (UniqueName: \"kubernetes.io/projected/c685a13e-8100-43c1-a0c4-417a12135281-kube-api-access-sltfk\") pod \"cinder-89a7-account-create-g24xd\" (UID: \"c685a13e-8100-43c1-a0c4-417a12135281\") " pod="openstack/cinder-89a7-account-create-g24xd"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.682861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c685a13e-8100-43c1-a0c4-417a12135281-operator-scripts\") pod \"cinder-89a7-account-create-g24xd\" (UID: \"c685a13e-8100-43c1-a0c4-417a12135281\") " pod="openstack/cinder-89a7-account-create-g24xd"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.688837 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fs8kc"]
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.689928 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-96rh7"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.695735 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jqcdm"]
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.715865 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sltfk\" (UniqueName: \"kubernetes.io/projected/c685a13e-8100-43c1-a0c4-417a12135281-kube-api-access-sltfk\") pod \"cinder-89a7-account-create-g24xd\" (UID: \"c685a13e-8100-43c1-a0c4-417a12135281\") " pod="openstack/cinder-89a7-account-create-g24xd"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.749013 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-675b-account-create-t7bnk"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.783786 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njj2w\" (UniqueName: \"kubernetes.io/projected/31a314de-cc63-4bd3-9abc-aaa38391e873-kube-api-access-njj2w\") pod \"neutron-db-create-fs8kc\" (UID: \"31a314de-cc63-4bd3-9abc-aaa38391e873\") " pod="openstack/neutron-db-create-fs8kc"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.783862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9fzr\" (UniqueName: \"kubernetes.io/projected/84edae0e-41a9-42b0-a1bc-1a303dc92946-kube-api-access-r9fzr\") pod \"keystone-db-sync-jqcdm\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") " pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.783933 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-combined-ca-bundle\") pod \"keystone-db-sync-jqcdm\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") " pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.783979 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a314de-cc63-4bd3-9abc-aaa38391e873-operator-scripts\") pod \"neutron-db-create-fs8kc\" (UID: \"31a314de-cc63-4bd3-9abc-aaa38391e873\") " pod="openstack/neutron-db-create-fs8kc"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.784066 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-config-data\") pod \"keystone-db-sync-jqcdm\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") " pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.795801 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8dc4-account-create-f65gm"]
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.803153 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8dc4-account-create-f65gm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.805961 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.817201 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8dc4-account-create-f65gm"]
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.868781 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-89a7-account-create-g24xd"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.886174 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-config-data\") pod \"keystone-db-sync-jqcdm\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") " pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.886668 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njj2w\" (UniqueName: \"kubernetes.io/projected/31a314de-cc63-4bd3-9abc-aaa38391e873-kube-api-access-njj2w\") pod \"neutron-db-create-fs8kc\" (UID: \"31a314de-cc63-4bd3-9abc-aaa38391e873\") " pod="openstack/neutron-db-create-fs8kc"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.886702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9fzr\" (UniqueName: \"kubernetes.io/projected/84edae0e-41a9-42b0-a1bc-1a303dc92946-kube-api-access-r9fzr\") pod \"keystone-db-sync-jqcdm\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") " pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.886745 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86aa9353-ef38-45ec-8e1f-12f3ec108756-operator-scripts\") pod \"neutron-8dc4-account-create-f65gm\" (UID: \"86aa9353-ef38-45ec-8e1f-12f3ec108756\") " pod="openstack/neutron-8dc4-account-create-f65gm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.886789 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-combined-ca-bundle\") pod \"keystone-db-sync-jqcdm\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") " pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.886830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a314de-cc63-4bd3-9abc-aaa38391e873-operator-scripts\") pod \"neutron-db-create-fs8kc\" (UID: \"31a314de-cc63-4bd3-9abc-aaa38391e873\") " pod="openstack/neutron-db-create-fs8kc"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.886864 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf962\" (UniqueName: \"kubernetes.io/projected/86aa9353-ef38-45ec-8e1f-12f3ec108756-kube-api-access-jf962\") pod \"neutron-8dc4-account-create-f65gm\" (UID: \"86aa9353-ef38-45ec-8e1f-12f3ec108756\") " pod="openstack/neutron-8dc4-account-create-f65gm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.899471 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-config-data\") pod \"keystone-db-sync-jqcdm\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") " pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.899965 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a314de-cc63-4bd3-9abc-aaa38391e873-operator-scripts\") pod \"neutron-db-create-fs8kc\" (UID: \"31a314de-cc63-4bd3-9abc-aaa38391e873\") " pod="openstack/neutron-db-create-fs8kc"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.907439 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njj2w\" (UniqueName: \"kubernetes.io/projected/31a314de-cc63-4bd3-9abc-aaa38391e873-kube-api-access-njj2w\") pod \"neutron-db-create-fs8kc\" (UID: \"31a314de-cc63-4bd3-9abc-aaa38391e873\") " pod="openstack/neutron-db-create-fs8kc"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.981884 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fs8kc"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.989524 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf962\" (UniqueName: \"kubernetes.io/projected/86aa9353-ef38-45ec-8e1f-12f3ec108756-kube-api-access-jf962\") pod \"neutron-8dc4-account-create-f65gm\" (UID: \"86aa9353-ef38-45ec-8e1f-12f3ec108756\") " pod="openstack/neutron-8dc4-account-create-f65gm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.989680 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86aa9353-ef38-45ec-8e1f-12f3ec108756-operator-scripts\") pod \"neutron-8dc4-account-create-f65gm\" (UID: \"86aa9353-ef38-45ec-8e1f-12f3ec108756\") " pod="openstack/neutron-8dc4-account-create-f65gm"
Nov 22 08:41:50 crc kubenswrapper[4743]: I1122 08:41:50.990448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86aa9353-ef38-45ec-8e1f-12f3ec108756-operator-scripts\") pod \"neutron-8dc4-account-create-f65gm\" (UID: \"86aa9353-ef38-45ec-8e1f-12f3ec108756\") " pod="openstack/neutron-8dc4-account-create-f65gm"
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.012313 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf962\" (UniqueName: \"kubernetes.io/projected/86aa9353-ef38-45ec-8e1f-12f3ec108756-kube-api-access-jf962\") pod \"neutron-8dc4-account-create-f65gm\" (UID: \"86aa9353-ef38-45ec-8e1f-12f3ec108756\") " pod="openstack/neutron-8dc4-account-create-f65gm"
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.029831 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-96rh7"]
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.091339 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-combined-ca-bundle\") pod \"keystone-db-sync-jqcdm\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") " pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.097998 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9fzr\" (UniqueName: \"kubernetes.io/projected/84edae0e-41a9-42b0-a1bc-1a303dc92946-kube-api-access-r9fzr\") pod \"keystone-db-sync-jqcdm\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") " pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.123644 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8dc4-account-create-f65gm"
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.288413 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.297301 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-svwz4"]
Nov 22 08:41:51 crc kubenswrapper[4743]: W1122 08:41:51.305259 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3633999d_a3b2_483d_9ca9_601350b07e59.slice/crio-348db1ac7b59cab02559c0e4cb6f96f9c5ae3c09d7b9a6b4f500b086c12e0fa6 WatchSource:0}: Error finding container 348db1ac7b59cab02559c0e4cb6f96f9c5ae3c09d7b9a6b4f500b086c12e0fa6: Status 404 returned error can't find the container with id 348db1ac7b59cab02559c0e4cb6f96f9c5ae3c09d7b9a6b4f500b086c12e0fa6
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.441829 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-675b-account-create-t7bnk"]
Nov 22 08:41:51 crc kubenswrapper[4743]: W1122 08:41:51.446284 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bf0a98f_65ac_4997_a98c_fb20ef181219.slice/crio-9ac8ce11e5209604e2ab426bd41788f908b6da8f4fdc902a2cdd9fae606cdc98 WatchSource:0}: Error finding container 9ac8ce11e5209604e2ab426bd41788f908b6da8f4fdc902a2cdd9fae606cdc98: Status 404 returned error can't find the container with id 9ac8ce11e5209604e2ab426bd41788f908b6da8f4fdc902a2cdd9fae606cdc98
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.520614 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-89a7-account-create-g24xd"]
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.609318 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fs8kc"]
Nov 22 08:41:51 crc kubenswrapper[4743]: W1122 08:41:51.614506 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31a314de_cc63_4bd3_9abc_aaa38391e873.slice/crio-71c5a73851fc51f1bece2b4a4e2892504715ea5115cec542519077fb41c7d013 WatchSource:0}: Error finding container 71c5a73851fc51f1bece2b4a4e2892504715ea5115cec542519077fb41c7d013: Status 404 returned error can't find the container with id 71c5a73851fc51f1bece2b4a4e2892504715ea5115cec542519077fb41c7d013
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.698749 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8dc4-account-create-f65gm"]
Nov 22 08:41:51 crc kubenswrapper[4743]: W1122 08:41:51.708024 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86aa9353_ef38_45ec_8e1f_12f3ec108756.slice/crio-545697c73a4b20525b9b821b22aee3972a249447e2254b00b84662df18e29cdf WatchSource:0}: Error finding container 545697c73a4b20525b9b821b22aee3972a249447e2254b00b84662df18e29cdf: Status 404 returned error can't find the container with id 545697c73a4b20525b9b821b22aee3972a249447e2254b00b84662df18e29cdf
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.796961 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jqcdm"]
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.837508 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-svwz4" event={"ID":"3633999d-a3b2-483d-9ca9-601350b07e59","Type":"ContainerStarted","Data":"348db1ac7b59cab02559c0e4cb6f96f9c5ae3c09d7b9a6b4f500b086c12e0fa6"}
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.845433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-89a7-account-create-g24xd" event={"ID":"c685a13e-8100-43c1-a0c4-417a12135281","Type":"ContainerStarted","Data":"5b4bcd450ab8ffbdc6c0d9ed1a653e83b1d2f2c2be169e210978503dea75414b"}
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.846489 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jqcdm" event={"ID":"84edae0e-41a9-42b0-a1bc-1a303dc92946","Type":"ContainerStarted","Data":"3ef1f203e118abb5277bf1b1ba2e037fa63aa9715949b9017c87034cbdcaa0f1"}
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.848401 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-96rh7" event={"ID":"2019c268-c5f1-4eff-aa27-6f26c3f37dfa","Type":"ContainerStarted","Data":"522839b5fdebb84c1fea1d0e014e5145ae5940ceb284fa8c95f021daffa542f0"}
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.848435 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-96rh7" event={"ID":"2019c268-c5f1-4eff-aa27-6f26c3f37dfa","Type":"ContainerStarted","Data":"bdb2664e141f83b0ded98c3c0372d90678144c16ed55417115e04ff5e8f3f2ba"}
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.849302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8dc4-account-create-f65gm" event={"ID":"86aa9353-ef38-45ec-8e1f-12f3ec108756","Type":"ContainerStarted","Data":"545697c73a4b20525b9b821b22aee3972a249447e2254b00b84662df18e29cdf"}
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.850322 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fs8kc" event={"ID":"31a314de-cc63-4bd3-9abc-aaa38391e873","Type":"ContainerStarted","Data":"71c5a73851fc51f1bece2b4a4e2892504715ea5115cec542519077fb41c7d013"}
Nov 22 08:41:51 crc kubenswrapper[4743]: I1122 08:41:51.851078 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-675b-account-create-t7bnk" event={"ID":"1bf0a98f-65ac-4997-a98c-fb20ef181219","Type":"ContainerStarted","Data":"9ac8ce11e5209604e2ab426bd41788f908b6da8f4fdc902a2cdd9fae606cdc98"}
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.869223 4743 generic.go:334] "Generic (PLEG): container finished" podID="c685a13e-8100-43c1-a0c4-417a12135281" containerID="02c08be1a06f176f363240d80ae5a5c4fa7036c63f2fb50020e992f965215d10" exitCode=0
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.869665 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-89a7-account-create-g24xd" event={"ID":"c685a13e-8100-43c1-a0c4-417a12135281","Type":"ContainerDied","Data":"02c08be1a06f176f363240d80ae5a5c4fa7036c63f2fb50020e992f965215d10"}
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.872883 4743 generic.go:334] "Generic (PLEG): container finished" podID="2019c268-c5f1-4eff-aa27-6f26c3f37dfa" containerID="522839b5fdebb84c1fea1d0e014e5145ae5940ceb284fa8c95f021daffa542f0" exitCode=0
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.872948 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-96rh7" event={"ID":"2019c268-c5f1-4eff-aa27-6f26c3f37dfa","Type":"ContainerDied","Data":"522839b5fdebb84c1fea1d0e014e5145ae5940ceb284fa8c95f021daffa542f0"}
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.880359 4743 generic.go:334] "Generic (PLEG): container finished" podID="86aa9353-ef38-45ec-8e1f-12f3ec108756" containerID="9634df27808c5b6900e16e550218eae167007f2f968cbec651ad9d5729aaf28a" exitCode=0
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.880498 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8dc4-account-create-f65gm" event={"ID":"86aa9353-ef38-45ec-8e1f-12f3ec108756","Type":"ContainerDied","Data":"9634df27808c5b6900e16e550218eae167007f2f968cbec651ad9d5729aaf28a"}
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.898964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84"}
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.903345 4743 generic.go:334] "Generic (PLEG): container finished" podID="31a314de-cc63-4bd3-9abc-aaa38391e873" containerID="bfd31ce4d61a607de637c6db8af6011fdd9c45b14e183e2552977504481ffd47" exitCode=0
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.903454 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fs8kc" event={"ID":"31a314de-cc63-4bd3-9abc-aaa38391e873","Type":"ContainerDied","Data":"bfd31ce4d61a607de637c6db8af6011fdd9c45b14e183e2552977504481ffd47"}
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.904837 4743 generic.go:334] "Generic (PLEG): container finished" podID="1bf0a98f-65ac-4997-a98c-fb20ef181219" containerID="402d9421e3dbd2b5e063a6bf98415c17b6ebaa5cf0bc4fe18813a071b338fe93" exitCode=0
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.904902 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-675b-account-create-t7bnk" event={"ID":"1bf0a98f-65ac-4997-a98c-fb20ef181219","Type":"ContainerDied","Data":"402d9421e3dbd2b5e063a6bf98415c17b6ebaa5cf0bc4fe18813a071b338fe93"}
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.911086 4743 generic.go:334] "Generic (PLEG): container finished" podID="3633999d-a3b2-483d-9ca9-601350b07e59" containerID="d65252f2cbf9207547c52ed982045e996c05e604dc4138895706f868b51d4c6c" exitCode=0
Nov 22 08:41:52 crc kubenswrapper[4743]: I1122 08:41:52.911145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-svwz4" event={"ID":"3633999d-a3b2-483d-9ca9-601350b07e59","Type":"ContainerDied","Data":"d65252f2cbf9207547c52ed982045e996c05e604dc4138895706f868b51d4c6c"}
Nov 22 08:41:53 crc kubenswrapper[4743]: I1122 08:41:53.928782 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291"}
Nov 22 08:41:53 crc kubenswrapper[4743]: I1122 08:41:53.929143 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776"}
Nov 22 08:41:53 crc kubenswrapper[4743]: I1122 08:41:53.929163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b"}
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.303182 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-675b-account-create-t7bnk"
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.396327 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf0a98f-65ac-4997-a98c-fb20ef181219-operator-scripts\") pod \"1bf0a98f-65ac-4997-a98c-fb20ef181219\" (UID: \"1bf0a98f-65ac-4997-a98c-fb20ef181219\") "
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.396382 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvs5m\" (UniqueName: \"kubernetes.io/projected/1bf0a98f-65ac-4997-a98c-fb20ef181219-kube-api-access-fvs5m\") pod \"1bf0a98f-65ac-4997-a98c-fb20ef181219\" (UID: \"1bf0a98f-65ac-4997-a98c-fb20ef181219\") "
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.397164 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf0a98f-65ac-4997-a98c-fb20ef181219-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1bf0a98f-65ac-4997-a98c-fb20ef181219" (UID: "1bf0a98f-65ac-4997-a98c-fb20ef181219"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.403338 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf0a98f-65ac-4997-a98c-fb20ef181219-kube-api-access-fvs5m" (OuterVolumeSpecName: "kube-api-access-fvs5m") pod "1bf0a98f-65ac-4997-a98c-fb20ef181219" (UID: "1bf0a98f-65ac-4997-a98c-fb20ef181219"). InnerVolumeSpecName "kube-api-access-fvs5m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.498080 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bf0a98f-65ac-4997-a98c-fb20ef181219-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.498413 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvs5m\" (UniqueName: \"kubernetes.io/projected/1bf0a98f-65ac-4997-a98c-fb20ef181219-kube-api-access-fvs5m\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.940461 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-675b-account-create-t7bnk"
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.940463 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-675b-account-create-t7bnk" event={"ID":"1bf0a98f-65ac-4997-a98c-fb20ef181219","Type":"ContainerDied","Data":"9ac8ce11e5209604e2ab426bd41788f908b6da8f4fdc902a2cdd9fae606cdc98"}
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.940511 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ac8ce11e5209604e2ab426bd41788f908b6da8f4fdc902a2cdd9fae606cdc98"
Nov 22 08:41:54 crc kubenswrapper[4743]: I1122 08:41:54.948526 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f"}
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.258751 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fs8kc"
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.312396 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8dc4-account-create-f65gm"
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.321127 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-svwz4"
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.343363 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-96rh7"
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.343971 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a314de-cc63-4bd3-9abc-aaa38391e873-operator-scripts\") pod \"31a314de-cc63-4bd3-9abc-aaa38391e873\" (UID: \"31a314de-cc63-4bd3-9abc-aaa38391e873\") "
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.344133 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njj2w\" (UniqueName: \"kubernetes.io/projected/31a314de-cc63-4bd3-9abc-aaa38391e873-kube-api-access-njj2w\") pod \"31a314de-cc63-4bd3-9abc-aaa38391e873\" (UID: \"31a314de-cc63-4bd3-9abc-aaa38391e873\") "
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.345393 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31a314de-cc63-4bd3-9abc-aaa38391e873-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31a314de-cc63-4bd3-9abc-aaa38391e873" (UID: "31a314de-cc63-4bd3-9abc-aaa38391e873"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.355838 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a314de-cc63-4bd3-9abc-aaa38391e873-kube-api-access-njj2w" (OuterVolumeSpecName: "kube-api-access-njj2w") pod "31a314de-cc63-4bd3-9abc-aaa38391e873" (UID: "31a314de-cc63-4bd3-9abc-aaa38391e873"). InnerVolumeSpecName "kube-api-access-njj2w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.356177 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-89a7-account-create-g24xd"
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.445189 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3633999d-a3b2-483d-9ca9-601350b07e59-operator-scripts\") pod \"3633999d-a3b2-483d-9ca9-601350b07e59\" (UID: \"3633999d-a3b2-483d-9ca9-601350b07e59\") "
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.445674 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3633999d-a3b2-483d-9ca9-601350b07e59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3633999d-a3b2-483d-9ca9-601350b07e59" (UID: "3633999d-a3b2-483d-9ca9-601350b07e59"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.446019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2019c268-c5f1-4eff-aa27-6f26c3f37dfa" (UID: "2019c268-c5f1-4eff-aa27-6f26c3f37dfa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.445296 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-operator-scripts\") pod \"2019c268-c5f1-4eff-aa27-6f26c3f37dfa\" (UID: \"2019c268-c5f1-4eff-aa27-6f26c3f37dfa\") "
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.446189 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c685a13e-8100-43c1-a0c4-417a12135281-operator-scripts\") pod \"c685a13e-8100-43c1-a0c4-417a12135281\" (UID: \"c685a13e-8100-43c1-a0c4-417a12135281\") "
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.446970 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c685a13e-8100-43c1-a0c4-417a12135281-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c685a13e-8100-43c1-a0c4-417a12135281" (UID: "c685a13e-8100-43c1-a0c4-417a12135281"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.447637 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86aa9353-ef38-45ec-8e1f-12f3ec108756-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86aa9353-ef38-45ec-8e1f-12f3ec108756" (UID: "86aa9353-ef38-45ec-8e1f-12f3ec108756"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.447689 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86aa9353-ef38-45ec-8e1f-12f3ec108756-operator-scripts\") pod \"86aa9353-ef38-45ec-8e1f-12f3ec108756\" (UID: \"86aa9353-ef38-45ec-8e1f-12f3ec108756\") "
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.447788 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvrv9\" (UniqueName: \"kubernetes.io/projected/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-kube-api-access-nvrv9\") pod \"2019c268-c5f1-4eff-aa27-6f26c3f37dfa\" (UID: \"2019c268-c5f1-4eff-aa27-6f26c3f37dfa\") "
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.448369 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf962\" (UniqueName: \"kubernetes.io/projected/86aa9353-ef38-45ec-8e1f-12f3ec108756-kube-api-access-jf962\") pod \"86aa9353-ef38-45ec-8e1f-12f3ec108756\" (UID: \"86aa9353-ef38-45ec-8e1f-12f3ec108756\") "
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.448793 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sltfk\" (UniqueName: \"kubernetes.io/projected/c685a13e-8100-43c1-a0c4-417a12135281-kube-api-access-sltfk\") pod \"c685a13e-8100-43c1-a0c4-417a12135281\" (UID: \"c685a13e-8100-43c1-a0c4-417a12135281\") "
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.448848 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22dkh\" (UniqueName: \"kubernetes.io/projected/3633999d-a3b2-483d-9ca9-601350b07e59-kube-api-access-22dkh\") pod \"3633999d-a3b2-483d-9ca9-601350b07e59\" (UID: \"3633999d-a3b2-483d-9ca9-601350b07e59\") "
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.449499 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3633999d-a3b2-483d-9ca9-601350b07e59-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.449523 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.449538 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c685a13e-8100-43c1-a0c4-417a12135281-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.449551 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njj2w\" (UniqueName: \"kubernetes.io/projected/31a314de-cc63-4bd3-9abc-aaa38391e873-kube-api-access-njj2w\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.449563 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86aa9353-ef38-45ec-8e1f-12f3ec108756-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.449728 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31a314de-cc63-4bd3-9abc-aaa38391e873-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.450773 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-kube-api-access-nvrv9" (OuterVolumeSpecName: "kube-api-access-nvrv9") pod "2019c268-c5f1-4eff-aa27-6f26c3f37dfa" (UID: "2019c268-c5f1-4eff-aa27-6f26c3f37dfa"). InnerVolumeSpecName "kube-api-access-nvrv9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.452649 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86aa9353-ef38-45ec-8e1f-12f3ec108756-kube-api-access-jf962" (OuterVolumeSpecName: "kube-api-access-jf962") pod "86aa9353-ef38-45ec-8e1f-12f3ec108756" (UID: "86aa9353-ef38-45ec-8e1f-12f3ec108756"). InnerVolumeSpecName "kube-api-access-jf962". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.453146 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c685a13e-8100-43c1-a0c4-417a12135281-kube-api-access-sltfk" (OuterVolumeSpecName: "kube-api-access-sltfk") pod "c685a13e-8100-43c1-a0c4-417a12135281" (UID: "c685a13e-8100-43c1-a0c4-417a12135281"). InnerVolumeSpecName "kube-api-access-sltfk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.453246 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3633999d-a3b2-483d-9ca9-601350b07e59-kube-api-access-22dkh" (OuterVolumeSpecName: "kube-api-access-22dkh") pod "3633999d-a3b2-483d-9ca9-601350b07e59" (UID: "3633999d-a3b2-483d-9ca9-601350b07e59"). InnerVolumeSpecName "kube-api-access-22dkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.551836 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sltfk\" (UniqueName: \"kubernetes.io/projected/c685a13e-8100-43c1-a0c4-417a12135281-kube-api-access-sltfk\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.551878 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22dkh\" (UniqueName: \"kubernetes.io/projected/3633999d-a3b2-483d-9ca9-601350b07e59-kube-api-access-22dkh\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.551892 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvrv9\" (UniqueName: \"kubernetes.io/projected/2019c268-c5f1-4eff-aa27-6f26c3f37dfa-kube-api-access-nvrv9\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.551905 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf962\" (UniqueName: \"kubernetes.io/projected/86aa9353-ef38-45ec-8e1f-12f3ec108756-kube-api-access-jf962\") on node \"crc\" DevicePath \"\""
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.976556 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-89a7-account-create-g24xd" event={"ID":"c685a13e-8100-43c1-a0c4-417a12135281","Type":"ContainerDied","Data":"5b4bcd450ab8ffbdc6c0d9ed1a653e83b1d2f2c2be169e210978503dea75414b"}
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.976640 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4bcd450ab8ffbdc6c0d9ed1a653e83b1d2f2c2be169e210978503dea75414b"
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.976727 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-89a7-account-create-g24xd"
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.982450 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jqcdm" event={"ID":"84edae0e-41a9-42b0-a1bc-1a303dc92946","Type":"ContainerStarted","Data":"9610dc8f6778c28876a91471c6c7ed36b0828b18a27d7acf62fa3c2dcf65b6ca"}
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.985770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-96rh7" event={"ID":"2019c268-c5f1-4eff-aa27-6f26c3f37dfa","Type":"ContainerDied","Data":"bdb2664e141f83b0ded98c3c0372d90678144c16ed55417115e04ff5e8f3f2ba"}
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.985758 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-96rh7"
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.985840 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdb2664e141f83b0ded98c3c0372d90678144c16ed55417115e04ff5e8f3f2ba"
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.996012 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8dc4-account-create-f65gm" event={"ID":"86aa9353-ef38-45ec-8e1f-12f3ec108756","Type":"ContainerDied","Data":"545697c73a4b20525b9b821b22aee3972a249447e2254b00b84662df18e29cdf"}
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.996049 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="545697c73a4b20525b9b821b22aee3972a249447e2254b00b84662df18e29cdf"
Nov 22 08:41:57 crc kubenswrapper[4743]: I1122 08:41:57.996106 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8dc4-account-create-f65gm"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.011265 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jqcdm" podStartSLOduration=2.731777093 podStartE2EDuration="8.011248885s" podCreationTimestamp="2025-11-22 08:41:50 +0000 UTC" firstStartedPulling="2025-11-22 08:41:51.818696258 +0000 UTC m=+1185.525057310" lastFinishedPulling="2025-11-22 08:41:57.09816805 +0000 UTC m=+1190.804529102" observedRunningTime="2025-11-22 08:41:58.009080902 +0000 UTC m=+1191.715441954" watchObservedRunningTime="2025-11-22 08:41:58.011248885 +0000 UTC m=+1191.717609937"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.020144 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed"}
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.020229 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerStarted","Data":"18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe"}
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.023346 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fs8kc" event={"ID":"31a314de-cc63-4bd3-9abc-aaa38391e873","Type":"ContainerDied","Data":"71c5a73851fc51f1bece2b4a4e2892504715ea5115cec542519077fb41c7d013"}
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.023402 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71c5a73851fc51f1bece2b4a4e2892504715ea5115cec542519077fb41c7d013"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.023365 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fs8kc"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.026507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-92fnd" event={"ID":"142d1e8a-9aac-4c34-9301-1e069919fe82","Type":"ContainerStarted","Data":"6ed030084c63587910484888925d6585aae009c6384bcc1c9a46d6aea22044b6"}
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.029437 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-svwz4" event={"ID":"3633999d-a3b2-483d-9ca9-601350b07e59","Type":"ContainerDied","Data":"348db1ac7b59cab02559c0e4cb6f96f9c5ae3c09d7b9a6b4f500b086c12e0fa6"}
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.029462 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="348db1ac7b59cab02559c0e4cb6f96f9c5ae3c09d7b9a6b4f500b086c12e0fa6"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.029461 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-svwz4"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.056334 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=51.106225692 podStartE2EDuration="1m3.056315725s" podCreationTimestamp="2025-11-22 08:40:55 +0000 UTC" firstStartedPulling="2025-11-22 08:41:40.59111819 +0000 UTC m=+1174.297479242" lastFinishedPulling="2025-11-22 08:41:52.541208223 +0000 UTC m=+1186.247569275" observedRunningTime="2025-11-22 08:41:58.052988209 +0000 UTC m=+1191.759349271" watchObservedRunningTime="2025-11-22 08:41:58.056315725 +0000 UTC m=+1191.762676777"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.080646 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-92fnd" podStartSLOduration=2.748806932 podStartE2EDuration="40.080630387s" podCreationTimestamp="2025-11-22 08:41:18 +0000 UTC" firstStartedPulling="2025-11-22 08:41:19.766661855 +0000 UTC m=+1153.473022907" lastFinishedPulling="2025-11-22 08:41:57.09848531 +0000 UTC m=+1190.804846362" observedRunningTime="2025-11-22 08:41:58.078563127 +0000 UTC m=+1191.784924179" watchObservedRunningTime="2025-11-22 08:41:58.080630387 +0000 UTC m=+1191.786991439"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.375431 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdzh7"]
Nov 22 08:41:58 crc kubenswrapper[4743]: E1122 08:41:58.375881 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a314de-cc63-4bd3-9abc-aaa38391e873" containerName="mariadb-database-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.375900 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a314de-cc63-4bd3-9abc-aaa38391e873" containerName="mariadb-database-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: E1122 08:41:58.375923 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf0a98f-65ac-4997-a98c-fb20ef181219" containerName="mariadb-account-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.375930 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf0a98f-65ac-4997-a98c-fb20ef181219" containerName="mariadb-account-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: E1122 08:41:58.375948 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86aa9353-ef38-45ec-8e1f-12f3ec108756" containerName="mariadb-account-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.375955 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="86aa9353-ef38-45ec-8e1f-12f3ec108756" containerName="mariadb-account-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: E1122 08:41:58.375977 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2019c268-c5f1-4eff-aa27-6f26c3f37dfa" containerName="mariadb-database-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.375984 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2019c268-c5f1-4eff-aa27-6f26c3f37dfa" containerName="mariadb-database-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: E1122 08:41:58.375999 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3633999d-a3b2-483d-9ca9-601350b07e59" containerName="mariadb-database-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.376006 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3633999d-a3b2-483d-9ca9-601350b07e59" containerName="mariadb-database-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: E1122 08:41:58.376022 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c685a13e-8100-43c1-a0c4-417a12135281" containerName="mariadb-account-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.376028 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c685a13e-8100-43c1-a0c4-417a12135281" containerName="mariadb-account-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.376207 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="86aa9353-ef38-45ec-8e1f-12f3ec108756" containerName="mariadb-account-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.376225 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a314de-cc63-4bd3-9abc-aaa38391e873" containerName="mariadb-database-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.376238 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2019c268-c5f1-4eff-aa27-6f26c3f37dfa" containerName="mariadb-database-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.376247 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3633999d-a3b2-483d-9ca9-601350b07e59" containerName="mariadb-database-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.376277 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bf0a98f-65ac-4997-a98c-fb20ef181219" containerName="mariadb-account-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.376300 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c685a13e-8100-43c1-a0c4-417a12135281" containerName="mariadb-account-create"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.377445 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.382993 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.428972 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdzh7"]
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.466981 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.467025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55qnp\" (UniqueName: \"kubernetes.io/projected/772ec858-ecf4-4c00-82ab-c7e0ea020070-kube-api-access-55qnp\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.467059 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.467141 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.467158 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.467201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-config\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.568945 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-config\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.569004 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.569023 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55qnp\" (UniqueName: \"kubernetes.io/projected/772ec858-ecf4-4c00-82ab-c7e0ea020070-kube-api-access-55qnp\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.569057 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.569167 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.569192 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.569984 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-config\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.570213 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.570266 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.570284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.570607 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.597872 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55qnp\" (UniqueName: \"kubernetes.io/projected/772ec858-ecf4-4c00-82ab-c7e0ea020070-kube-api-access-55qnp\") pod \"dnsmasq-dns-5c79d794d7-pdzh7\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:58 crc kubenswrapper[4743]: I1122 08:41:58.699552 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:41:59 crc kubenswrapper[4743]: W1122 08:41:59.171047 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod772ec858_ecf4_4c00_82ab_c7e0ea020070.slice/crio-add7f9c1510ecf39b5dce654e0c38d9a87ed31736892abecf5720f6110eda9e3 WatchSource:0}: Error finding container add7f9c1510ecf39b5dce654e0c38d9a87ed31736892abecf5720f6110eda9e3: Status 404 returned error can't find the container with id add7f9c1510ecf39b5dce654e0c38d9a87ed31736892abecf5720f6110eda9e3
Nov 22 08:41:59 crc kubenswrapper[4743]: I1122 08:41:59.175915 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdzh7"]
Nov 22 08:42:00 crc kubenswrapper[4743]: I1122 08:42:00.047117 4743 generic.go:334] "Generic (PLEG): container finished" podID="772ec858-ecf4-4c00-82ab-c7e0ea020070" containerID="62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a" exitCode=0
Nov 22 08:42:00 crc kubenswrapper[4743]: I1122 08:42:00.047157 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7" event={"ID":"772ec858-ecf4-4c00-82ab-c7e0ea020070","Type":"ContainerDied","Data":"62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a"}
Nov 22 08:42:00 crc kubenswrapper[4743]: I1122 08:42:00.047183 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7" event={"ID":"772ec858-ecf4-4c00-82ab-c7e0ea020070","Type":"ContainerStarted","Data":"add7f9c1510ecf39b5dce654e0c38d9a87ed31736892abecf5720f6110eda9e3"}
Nov 22 08:42:01 crc kubenswrapper[4743]: I1122 08:42:01.058464 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7" event={"ID":"772ec858-ecf4-4c00-82ab-c7e0ea020070","Type":"ContainerStarted","Data":"6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c"}
Nov 22 08:42:01 crc kubenswrapper[4743]: I1122 08:42:01.058914 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7"
Nov 22 08:42:01 crc kubenswrapper[4743]: I1122 08:42:01.080437 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7" podStartSLOduration=3.080415726 podStartE2EDuration="3.080415726s" podCreationTimestamp="2025-11-22 08:41:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:01.07813774 +0000 UTC m=+1194.784498792" watchObservedRunningTime="2025-11-22 08:42:01.080415726 +0000 UTC m=+1194.786776778"
Nov 22 08:42:01 crc kubenswrapper[4743]: I1122 08:42:01.241322 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 08:42:01 crc kubenswrapper[4743]: I1122 08:42:01.241671 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 08:42:02 crc kubenswrapper[4743]: I1122 08:42:02.067989 4743 generic.go:334] "Generic (PLEG): container finished" podID="84edae0e-41a9-42b0-a1bc-1a303dc92946" containerID="9610dc8f6778c28876a91471c6c7ed36b0828b18a27d7acf62fa3c2dcf65b6ca" exitCode=0
Nov 22 08:42:02 crc kubenswrapper[4743]: I1122 08:42:02.068039 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jqcdm" event={"ID":"84edae0e-41a9-42b0-a1bc-1a303dc92946","Type":"ContainerDied","Data":"9610dc8f6778c28876a91471c6c7ed36b0828b18a27d7acf62fa3c2dcf65b6ca"}
Nov 22 08:42:03 crc kubenswrapper[4743]: I1122 08:42:03.378170 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:42:03 crc kubenswrapper[4743]: I1122 08:42:03.454765 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-config-data\") pod \"84edae0e-41a9-42b0-a1bc-1a303dc92946\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") "
Nov 22 08:42:03 crc kubenswrapper[4743]: I1122 08:42:03.454840 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9fzr\" (UniqueName: \"kubernetes.io/projected/84edae0e-41a9-42b0-a1bc-1a303dc92946-kube-api-access-r9fzr\") pod \"84edae0e-41a9-42b0-a1bc-1a303dc92946\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") "
Nov 22 08:42:03 crc kubenswrapper[4743]: I1122 08:42:03.455197 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-combined-ca-bundle\") pod \"84edae0e-41a9-42b0-a1bc-1a303dc92946\" (UID: \"84edae0e-41a9-42b0-a1bc-1a303dc92946\") "
Nov 22 08:42:03 crc kubenswrapper[4743]: I1122 08:42:03.462723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84edae0e-41a9-42b0-a1bc-1a303dc92946-kube-api-access-r9fzr" (OuterVolumeSpecName: "kube-api-access-r9fzr") pod "84edae0e-41a9-42b0-a1bc-1a303dc92946" (UID: "84edae0e-41a9-42b0-a1bc-1a303dc92946"). InnerVolumeSpecName "kube-api-access-r9fzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:42:03 crc kubenswrapper[4743]: I1122 08:42:03.481075 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84edae0e-41a9-42b0-a1bc-1a303dc92946" (UID: "84edae0e-41a9-42b0-a1bc-1a303dc92946"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:42:03 crc kubenswrapper[4743]: I1122 08:42:03.505918 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-config-data" (OuterVolumeSpecName: "config-data") pod "84edae0e-41a9-42b0-a1bc-1a303dc92946" (UID: "84edae0e-41a9-42b0-a1bc-1a303dc92946"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:42:03 crc kubenswrapper[4743]: I1122 08:42:03.557745 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:42:03 crc kubenswrapper[4743]: I1122 08:42:03.558228 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9fzr\" (UniqueName: \"kubernetes.io/projected/84edae0e-41a9-42b0-a1bc-1a303dc92946-kube-api-access-r9fzr\") on node \"crc\" DevicePath \"\""
Nov 22 08:42:03 crc kubenswrapper[4743]: I1122 08:42:03.558248 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84edae0e-41a9-42b0-a1bc-1a303dc92946-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.092480 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jqcdm" event={"ID":"84edae0e-41a9-42b0-a1bc-1a303dc92946","Type":"ContainerDied","Data":"3ef1f203e118abb5277bf1b1ba2e037fa63aa9715949b9017c87034cbdcaa0f1"}
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.092525 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ef1f203e118abb5277bf1b1ba2e037fa63aa9715949b9017c87034cbdcaa0f1"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.092527 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jqcdm"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.321413 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7cfkq"]
Nov 22 08:42:04 crc kubenswrapper[4743]: E1122 08:42:04.332055 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84edae0e-41a9-42b0-a1bc-1a303dc92946" containerName="keystone-db-sync"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.332095 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84edae0e-41a9-42b0-a1bc-1a303dc92946" containerName="keystone-db-sync"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.332308 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="84edae0e-41a9-42b0-a1bc-1a303dc92946" containerName="keystone-db-sync"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.332864 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7cfkq"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.336199 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7cfkq"]
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.340522 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.340739 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.341080 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.341245 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.341465 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vtjvr"
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.377159 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdzh7"]
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.377472 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7" podUID="772ec858-ecf4-4c00-82ab-c7e0ea020070" containerName="dnsmasq-dns" containerID="cri-o://6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c" gracePeriod=10
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.402327 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b868669f-lg7q2"]
Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.403645 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.421117 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-lg7q2"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.474545 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.474981 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-combined-ca-bundle\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.475057 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmtgq\" (UniqueName: \"kubernetes.io/projected/9b921a24-4b07-4d33-ab26-3dd171297e24-kube-api-access-nmtgq\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.475094 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-svc\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.475171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-config\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.475242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-credential-keys\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.475274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-scripts\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.475296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-config-data\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.475333 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnjsn\" (UniqueName: \"kubernetes.io/projected/57220368-3097-46da-8f24-783a8e80327e-kube-api-access-xnjsn\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.475395 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.475454 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.475501 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-fernet-keys\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.485436 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-sj8hg"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.487324 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.490020 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.490175 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.490328 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2bw4c" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.498599 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-sj8hg"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.576514 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.576566 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-combined-ca-bundle\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.580747 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-combined-ca-bundle\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.577698 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-nb\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.580830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmtgq\" (UniqueName: \"kubernetes.io/projected/9b921a24-4b07-4d33-ab26-3dd171297e24-kube-api-access-nmtgq\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.580859 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-svc\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581007 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-config\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-config-data\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-credential-keys\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-scripts\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-scripts\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581200 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-config-data\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-db-sync-config-data\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581246 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnjsn\" (UniqueName: \"kubernetes.io/projected/57220368-3097-46da-8f24-783a8e80327e-kube-api-access-xnjsn\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581308 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98hsz\" (UniqueName: \"kubernetes.io/projected/a87658ca-ad68-4136-82dd-14201100b4ea-kube-api-access-98hsz\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581355 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/a87658ca-ad68-4136-82dd-14201100b4ea-etc-machine-id\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581393 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-fernet-keys\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.581569 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-svc\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.583410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-config\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.586651 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-sb\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.586889 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-swift-storage-0\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.588247 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-fernet-keys\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.592832 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-combined-ca-bundle\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.595196 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-credential-keys\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " 
pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.600433 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-m9jrr"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.601554 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.603846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-scripts\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.605184 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-config-data\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.608659 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmtgq\" (UniqueName: \"kubernetes.io/projected/9b921a24-4b07-4d33-ab26-3dd171297e24-kube-api-access-nmtgq\") pod \"dnsmasq-dns-5b868669f-lg7q2\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.608894 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8gdtg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.608906 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.613428 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnjsn\" (UniqueName: \"kubernetes.io/projected/57220368-3097-46da-8f24-783a8e80327e-kube-api-access-xnjsn\") pod \"keystone-bootstrap-7cfkq\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.618718 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-m9jrr"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.654858 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.675868 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-n2d6p"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.677422 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.679392 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-825fq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.679561 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.679981 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.682990 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-config-data\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.683047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-scripts\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.683081 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86k8q\" (UniqueName: \"kubernetes.io/projected/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-kube-api-access-86k8q\") pod \"barbican-db-sync-m9jrr\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.683104 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-db-sync-config-data\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.683159 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98hsz\" (UniqueName: \"kubernetes.io/projected/a87658ca-ad68-4136-82dd-14201100b4ea-kube-api-access-98hsz\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.683183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a87658ca-ad68-4136-82dd-14201100b4ea-etc-machine-id\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.683210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-db-sync-config-data\") pod \"barbican-db-sync-m9jrr\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.683281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-combined-ca-bundle\") pod \"cinder-db-sync-sj8hg\" (UID: 
\"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.683330 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-combined-ca-bundle\") pod \"barbican-db-sync-m9jrr\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.683634 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-22g48"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.684463 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a87658ca-ad68-4136-82dd-14201100b4ea-etc-machine-id\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.685158 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.688860 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mgkwl" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.689104 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.691652 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-n2d6p"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.695236 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.697346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-db-sync-config-data\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.700069 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-config-data\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.703214 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-combined-ca-bundle\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.703219 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-lg7q2"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.703884 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.710563 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-22g48"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.712012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-scripts\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.731792 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98hsz\" (UniqueName: \"kubernetes.io/projected/a87658ca-ad68-4136-82dd-14201100b4ea-kube-api-access-98hsz\") pod \"cinder-db-sync-sj8hg\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.743591 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.745843 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.754131 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.754620 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.760452 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.766683 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-jkvtq"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.783626 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-jkvtq"] Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.783721 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-db-sync-config-data\") pod \"barbican-db-sync-m9jrr\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786450 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-config-data\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-scripts\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786516 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5a99c13-319a-4df1-8061-8bb20463cd73-logs\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-config\") pod \"neutron-db-sync-22g48\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786700 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sllsw\" (UniqueName: \"kubernetes.io/projected/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-kube-api-access-sllsw\") pod \"neutron-db-sync-22g48\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786775 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-combined-ca-bundle\") pod \"barbican-db-sync-m9jrr\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786805 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-combined-ca-bundle\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786832 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-combined-ca-bundle\") pod \"neutron-db-sync-22g48\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " 
pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786852 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbc4s\" (UniqueName: \"kubernetes.io/projected/c5a99c13-319a-4df1-8061-8bb20463cd73-kube-api-access-tbc4s\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.786945 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86k8q\" (UniqueName: \"kubernetes.io/projected/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-kube-api-access-86k8q\") pod \"barbican-db-sync-m9jrr\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.792445 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-db-sync-config-data\") pod \"barbican-db-sync-m9jrr\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.796602 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-combined-ca-bundle\") pod \"barbican-db-sync-m9jrr\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.819363 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86k8q\" (UniqueName: \"kubernetes.io/projected/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-kube-api-access-86k8q\") pod \"barbican-db-sync-m9jrr\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.886890 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.893872 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-scripts\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.893929 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-config-data\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.893959 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cmpx\" (UniqueName: \"kubernetes.io/projected/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-kube-api-access-2cmpx\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894027 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-scripts\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-config\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894077 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5a99c13-319a-4df1-8061-8bb20463cd73-logs\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894144 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-config\") pod \"neutron-db-sync-22g48\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894170 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sllsw\" (UniqueName: \"kubernetes.io/projected/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-kube-api-access-sllsw\") pod \"neutron-db-sync-22g48\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-combined-ca-bundle\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894310 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-config-data\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894374 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-combined-ca-bundle\") pod \"neutron-db-sync-22g48\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbc4s\" (UniqueName: \"kubernetes.io/projected/c5a99c13-319a-4df1-8061-8bb20463cd73-kube-api-access-tbc4s\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894428 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894467 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894503 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-log-httpd\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894565 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894622 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894676 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-run-httpd\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndxx9\" (UniqueName: \"kubernetes.io/projected/661914bd-2b43-425b-837a-8c4104173ef4-kube-api-access-ndxx9\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.894827 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-svc\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.896769 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5a99c13-319a-4df1-8061-8bb20463cd73-logs\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.902366 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-config-data\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.908423 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-scripts\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.913039 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-combined-ca-bundle\") pod \"neutron-db-sync-22g48\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.913985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-config\") pod \"neutron-db-sync-22g48\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.916681 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sllsw\" (UniqueName: \"kubernetes.io/projected/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-kube-api-access-sllsw\") pod \"neutron-db-sync-22g48\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.923493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-combined-ca-bundle\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 
08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.927488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbc4s\" (UniqueName: \"kubernetes.io/projected/c5a99c13-319a-4df1-8061-8bb20463cd73-kube-api-access-tbc4s\") pod \"placement-db-sync-n2d6p\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.996435 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-config-data\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.996543 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.996586 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.996610 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-log-httpd\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.996644 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.996672 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.997106 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.997139 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-run-httpd\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.997194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndxx9\" (UniqueName: 
\"kubernetes.io/projected/661914bd-2b43-425b-837a-8c4104173ef4-kube-api-access-ndxx9\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.997347 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-svc\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.997393 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-scripts\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.997437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cmpx\" (UniqueName: \"kubernetes.io/projected/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-kube-api-access-2cmpx\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.997482 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-config\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:04 crc kubenswrapper[4743]: I1122 08:42:04.998251 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-log-httpd\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:04.999008 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-svc\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:04.999357 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-nb\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:04.999820 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-run-httpd\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.000689 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-swift-storage-0\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.003140 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-config-data\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.003402 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-sb\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.003846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-config\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.005459 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.006789 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.014511 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.016686 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndxx9\" (UniqueName: \"kubernetes.io/projected/661914bd-2b43-425b-837a-8c4104173ef4-kube-api-access-ndxx9\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.018135 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-scripts\") pod \"ceilometer-0\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " pod="openstack/ceilometer-0" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.021064 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cmpx\" (UniqueName: \"kubernetes.io/projected/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-kube-api-access-2cmpx\") pod \"dnsmasq-dns-cf78879c9-jkvtq\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.101934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55qnp\" (UniqueName: \"kubernetes.io/projected/772ec858-ecf4-4c00-82ab-c7e0ea020070-kube-api-access-55qnp\") pod \"772ec858-ecf4-4c00-82ab-c7e0ea020070\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.102014 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-sb\") pod \"772ec858-ecf4-4c00-82ab-c7e0ea020070\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.102036 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-swift-storage-0\") pod \"772ec858-ecf4-4c00-82ab-c7e0ea020070\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.102084 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-config\") pod \"772ec858-ecf4-4c00-82ab-c7e0ea020070\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.102148 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-nb\") pod \"772ec858-ecf4-4c00-82ab-c7e0ea020070\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.102333 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-svc\") pod \"772ec858-ecf4-4c00-82ab-c7e0ea020070\" (UID: \"772ec858-ecf4-4c00-82ab-c7e0ea020070\") " Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.103006 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.112660 4743 generic.go:334] "Generic (PLEG): container finished" podID="772ec858-ecf4-4c00-82ab-c7e0ea020070" containerID="6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c" exitCode=0 Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.112717 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7" event={"ID":"772ec858-ecf4-4c00-82ab-c7e0ea020070","Type":"ContainerDied","Data":"6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c"} Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.112744 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.112771 4743 scope.go:117] "RemoveContainer" containerID="6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.112756 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-pdzh7" event={"ID":"772ec858-ecf4-4c00-82ab-c7e0ea020070","Type":"ContainerDied","Data":"add7f9c1510ecf39b5dce654e0c38d9a87ed31736892abecf5720f6110eda9e3"} Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.113898 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772ec858-ecf4-4c00-82ab-c7e0ea020070-kube-api-access-55qnp" (OuterVolumeSpecName: "kube-api-access-55qnp") pod "772ec858-ecf4-4c00-82ab-c7e0ea020070" (UID: "772ec858-ecf4-4c00-82ab-c7e0ea020070"). InnerVolumeSpecName "kube-api-access-55qnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.125535 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.146397 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.161856 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.168596 4743 scope.go:117] "RemoveContainer" containerID="62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.169295 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.180995 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "772ec858-ecf4-4c00-82ab-c7e0ea020070" (UID: "772ec858-ecf4-4c00-82ab-c7e0ea020070"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.200252 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "772ec858-ecf4-4c00-82ab-c7e0ea020070" (UID: "772ec858-ecf4-4c00-82ab-c7e0ea020070"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.210848 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.210885 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.210894 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55qnp\" (UniqueName: \"kubernetes.io/projected/772ec858-ecf4-4c00-82ab-c7e0ea020070-kube-api-access-55qnp\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.215066 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "772ec858-ecf4-4c00-82ab-c7e0ea020070" (UID: "772ec858-ecf4-4c00-82ab-c7e0ea020070"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.222604 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-config" (OuterVolumeSpecName: "config") pod "772ec858-ecf4-4c00-82ab-c7e0ea020070" (UID: "772ec858-ecf4-4c00-82ab-c7e0ea020070"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.234834 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "772ec858-ecf4-4c00-82ab-c7e0ea020070" (UID: "772ec858-ecf4-4c00-82ab-c7e0ea020070"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.312349 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.312775 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.312785 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/772ec858-ecf4-4c00-82ab-c7e0ea020070-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.333777 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7cfkq"] Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.341392 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-lg7q2"] Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.343874 4743 scope.go:117] "RemoveContainer" containerID="6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c" Nov 22 08:42:05 crc kubenswrapper[4743]: E1122 08:42:05.344895 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c\": container with ID starting with 6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c not found: ID does not exist" containerID="6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.344930 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c"} err="failed to get container status \"6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c\": rpc error: code = NotFound desc = could not find container \"6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c\": container with ID starting with 6113d796053310371d5d862dca1b6b366fbfd98fd951e76b7271942356bb8a8c not found: ID does not exist" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.344958 4743 scope.go:117] "RemoveContainer" containerID="62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a" Nov 22 08:42:05 crc kubenswrapper[4743]: E1122 08:42:05.346790 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a\": container with ID starting with 62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a not found: ID does not exist" containerID="62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a" Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.346839 4743 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a"} err="failed to get container status \"62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a\": rpc error: code = NotFound desc = could not find container \"62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a\": container with ID starting with 62e36151ffdbf63fb3bff981559d21c6133d97963578d456e1e8d5b307c39a0a not found: ID does not exist" Nov 22 08:42:05 crc kubenswrapper[4743]: W1122 08:42:05.347545 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57220368_3097_46da_8f24_783a8e80327e.slice/crio-0372fc10f436e238097cabdfce414ed349d0c8176301a5fef76c8fb8cf96abc9 WatchSource:0}: Error finding container 0372fc10f436e238097cabdfce414ed349d0c8176301a5fef76c8fb8cf96abc9: Status 404 returned error can't find the container with id 0372fc10f436e238097cabdfce414ed349d0c8176301a5fef76c8fb8cf96abc9 Nov 22 08:42:05 crc kubenswrapper[4743]: W1122 08:42:05.359779 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b921a24_4b07_4d33_ab26_3dd171297e24.slice/crio-4095995340ff11590ce6b1a46a4bd70306a39c7bd80699970cf2a3d46d9e4a37 WatchSource:0}: Error finding container 4095995340ff11590ce6b1a46a4bd70306a39c7bd80699970cf2a3d46d9e4a37: Status 404 returned error can't find the container with id 4095995340ff11590ce6b1a46a4bd70306a39c7bd80699970cf2a3d46d9e4a37 Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.485346 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-sj8hg"] Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.516228 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdzh7"] Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.533383 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-pdzh7"] Nov 22 08:42:05 crc kubenswrapper[4743]: W1122 08:42:05.624880 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda87658ca_ad68_4136_82dd_14201100b4ea.slice/crio-ceac13374c9a82fa11f67e9670abf103a4bd8acb26242db71aad86b0b0b24720 WatchSource:0}: Error finding container ceac13374c9a82fa11f67e9670abf103a4bd8acb26242db71aad86b0b0b24720: Status 404 returned error can't find the container with id ceac13374c9a82fa11f67e9670abf103a4bd8acb26242db71aad86b0b0b24720 Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.702390 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-m9jrr"] Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.834545 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-n2d6p"] Nov 22 08:42:05 crc kubenswrapper[4743]: I1122 08:42:05.996750 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.004142 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-22g48"] Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.131932 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-22g48" event={"ID":"95f6e846-532f-419c-bd4a-7d2e7eb41a2c","Type":"ContainerStarted","Data":"b49c5a68e2654efe006e6c7f2fc4e4b07ce75a46d076a1b8de9c355f13091294"} Nov 22 08:42:06 crc 
kubenswrapper[4743]: I1122 08:42:06.135848 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661914bd-2b43-425b-837a-8c4104173ef4","Type":"ContainerStarted","Data":"13981e383c530b2c42163461a9cfbeb9cda55f404c0daf4b27ab57e9d61a0e88"} Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.144709 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sj8hg" event={"ID":"a87658ca-ad68-4136-82dd-14201100b4ea","Type":"ContainerStarted","Data":"ceac13374c9a82fa11f67e9670abf103a4bd8acb26242db71aad86b0b0b24720"} Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.146533 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7cfkq" event={"ID":"57220368-3097-46da-8f24-783a8e80327e","Type":"ContainerStarted","Data":"7c16c6e79f2fba2bf718be7b393d3ab67b2c838e9819d990799299a5fe9b826a"} Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.146591 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7cfkq" event={"ID":"57220368-3097-46da-8f24-783a8e80327e","Type":"ContainerStarted","Data":"0372fc10f436e238097cabdfce414ed349d0c8176301a5fef76c8fb8cf96abc9"} Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.160292 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n2d6p" event={"ID":"c5a99c13-319a-4df1-8061-8bb20463cd73","Type":"ContainerStarted","Data":"50f89b58cd2780bf4ddf2de8c19039079de19896e31192e383acd335fa63949e"} Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.162645 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m9jrr" event={"ID":"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8","Type":"ContainerStarted","Data":"20c95cacde332b673bf3e87e0639c008dbb8696fac411165237e187da0a0f8b7"} Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.163981 4743 generic.go:334] "Generic (PLEG): container finished" podID="9b921a24-4b07-4d33-ab26-3dd171297e24" containerID="6990345a2a39ef4de8abd6f635b3ef0fc6158c413d9fc91ed95e2f67fce6d452" exitCode=0 Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.164023 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-lg7q2" event={"ID":"9b921a24-4b07-4d33-ab26-3dd171297e24","Type":"ContainerDied","Data":"6990345a2a39ef4de8abd6f635b3ef0fc6158c413d9fc91ed95e2f67fce6d452"} Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.164052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-lg7q2" event={"ID":"9b921a24-4b07-4d33-ab26-3dd171297e24","Type":"ContainerStarted","Data":"4095995340ff11590ce6b1a46a4bd70306a39c7bd80699970cf2a3d46d9e4a37"} Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.169074 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-jkvtq"] Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.179739 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7cfkq" podStartSLOduration=2.179719381 podStartE2EDuration="2.179719381s" podCreationTimestamp="2025-11-22 08:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:06.171419802 +0000 UTC m=+1199.877780854" watchObservedRunningTime="2025-11-22 08:42:06.179719381 +0000 UTC m=+1199.886080433" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.416960 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.617567 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.752628 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-nb\") pod \"9b921a24-4b07-4d33-ab26-3dd171297e24\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.752690 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-swift-storage-0\") pod \"9b921a24-4b07-4d33-ab26-3dd171297e24\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.752747 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-svc\") pod \"9b921a24-4b07-4d33-ab26-3dd171297e24\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.752795 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-config\") pod \"9b921a24-4b07-4d33-ab26-3dd171297e24\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.752841 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmtgq\" (UniqueName: \"kubernetes.io/projected/9b921a24-4b07-4d33-ab26-3dd171297e24-kube-api-access-nmtgq\") pod \"9b921a24-4b07-4d33-ab26-3dd171297e24\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.752928 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-sb\") pod \"9b921a24-4b07-4d33-ab26-3dd171297e24\" (UID: \"9b921a24-4b07-4d33-ab26-3dd171297e24\") " Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.761262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b921a24-4b07-4d33-ab26-3dd171297e24-kube-api-access-nmtgq" (OuterVolumeSpecName: "kube-api-access-nmtgq") pod "9b921a24-4b07-4d33-ab26-3dd171297e24" (UID: "9b921a24-4b07-4d33-ab26-3dd171297e24"). InnerVolumeSpecName "kube-api-access-nmtgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.790718 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9b921a24-4b07-4d33-ab26-3dd171297e24" (UID: "9b921a24-4b07-4d33-ab26-3dd171297e24"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.798255 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-config" (OuterVolumeSpecName: "config") pod "9b921a24-4b07-4d33-ab26-3dd171297e24" (UID: "9b921a24-4b07-4d33-ab26-3dd171297e24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.801466 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9b921a24-4b07-4d33-ab26-3dd171297e24" (UID: "9b921a24-4b07-4d33-ab26-3dd171297e24"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.803261 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9b921a24-4b07-4d33-ab26-3dd171297e24" (UID: "9b921a24-4b07-4d33-ab26-3dd171297e24"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.804177 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9b921a24-4b07-4d33-ab26-3dd171297e24" (UID: "9b921a24-4b07-4d33-ab26-3dd171297e24"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.854548 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.854622 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.854634 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmtgq\" (UniqueName: \"kubernetes.io/projected/9b921a24-4b07-4d33-ab26-3dd171297e24-kube-api-access-nmtgq\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.854645 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.854653 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:06 crc kubenswrapper[4743]: I1122 08:42:06.854662 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9b921a24-4b07-4d33-ab26-3dd171297e24-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.174361 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" containerID="168effbbeb7e147054c39cabb5636edda277ef8d9ff9db122b31d231ddda70e4" exitCode=0 Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.176176 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="772ec858-ecf4-4c00-82ab-c7e0ea020070" path="/var/lib/kubelet/pods/772ec858-ecf4-4c00-82ab-c7e0ea020070/volumes" Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.176943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" event={"ID":"9ccd8a3d-0d1b-4334-9808-d636f5c16e42","Type":"ContainerDied","Data":"168effbbeb7e147054c39cabb5636edda277ef8d9ff9db122b31d231ddda70e4"} Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.176972 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" event={"ID":"9ccd8a3d-0d1b-4334-9808-d636f5c16e42","Type":"ContainerStarted","Data":"340ec3b21dec3b3593bf13421c2c23a326cd140b7c27ad5d9ac828cf097b1542"} Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.185559 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b868669f-lg7q2" event={"ID":"9b921a24-4b07-4d33-ab26-3dd171297e24","Type":"ContainerDied","Data":"4095995340ff11590ce6b1a46a4bd70306a39c7bd80699970cf2a3d46d9e4a37"} Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.185598 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b868669f-lg7q2" Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.185631 4743 scope.go:117] "RemoveContainer" containerID="6990345a2a39ef4de8abd6f635b3ef0fc6158c413d9fc91ed95e2f67fce6d452" Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.211907 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-22g48" event={"ID":"95f6e846-532f-419c-bd4a-7d2e7eb41a2c","Type":"ContainerStarted","Data":"d86df506e2e6495d0d0573fb95792122de57a18bc48b48390f2fddae46e7a46f"} Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.328048 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-22g48" podStartSLOduration=3.328024821 podStartE2EDuration="3.328024821s" podCreationTimestamp="2025-11-22 08:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:07.293957548 +0000 UTC m=+1201.000318600" watchObservedRunningTime="2025-11-22 08:42:07.328024821 +0000 UTC m=+1201.034385873" Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.375831 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-lg7q2"] Nov 22 08:42:07 crc kubenswrapper[4743]: I1122 08:42:07.382893 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b868669f-lg7q2"] Nov 22 08:42:08 crc kubenswrapper[4743]: I1122 08:42:08.225477 4743 generic.go:334] "Generic (PLEG): container finished" podID="142d1e8a-9aac-4c34-9301-1e069919fe82" containerID="6ed030084c63587910484888925d6585aae009c6384bcc1c9a46d6aea22044b6" exitCode=0 Nov 22 08:42:08 crc kubenswrapper[4743]: I1122 08:42:08.225809 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-92fnd" event={"ID":"142d1e8a-9aac-4c34-9301-1e069919fe82","Type":"ContainerDied","Data":"6ed030084c63587910484888925d6585aae009c6384bcc1c9a46d6aea22044b6"} Nov 22 08:42:08 crc kubenswrapper[4743]: I1122 08:42:08.234875 4743 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" event={"ID":"9ccd8a3d-0d1b-4334-9808-d636f5c16e42","Type":"ContainerStarted","Data":"64c8190fa95fba7afe24cc791b5c24ca88455d0f1a977faa91d5054beca8df4c"} Nov 22 08:42:08 crc kubenswrapper[4743]: I1122 08:42:08.290705 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" podStartSLOduration=4.290687526 podStartE2EDuration="4.290687526s" podCreationTimestamp="2025-11-22 08:42:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:08.278626388 +0000 UTC m=+1201.984987440" watchObservedRunningTime="2025-11-22 08:42:08.290687526 +0000 UTC m=+1201.997048578" Nov 22 08:42:09 crc kubenswrapper[4743]: I1122 08:42:09.164460 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b921a24-4b07-4d33-ab26-3dd171297e24" path="/var/lib/kubelet/pods/9b921a24-4b07-4d33-ab26-3dd171297e24/volumes" Nov 22 08:42:09 crc kubenswrapper[4743]: I1122 08:42:09.248237 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.258828 4743 generic.go:334] "Generic (PLEG): container finished" podID="57220368-3097-46da-8f24-783a8e80327e" containerID="7c16c6e79f2fba2bf718be7b393d3ab67b2c838e9819d990799299a5fe9b826a" exitCode=0 Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.258920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7cfkq" event={"ID":"57220368-3097-46da-8f24-783a8e80327e","Type":"ContainerDied","Data":"7c16c6e79f2fba2bf718be7b393d3ab67b2c838e9819d990799299a5fe9b826a"} Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.456007 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-92fnd" Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.527285 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-config-data\") pod \"142d1e8a-9aac-4c34-9301-1e069919fe82\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.527336 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkqzm\" (UniqueName: \"kubernetes.io/projected/142d1e8a-9aac-4c34-9301-1e069919fe82-kube-api-access-fkqzm\") pod \"142d1e8a-9aac-4c34-9301-1e069919fe82\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.527386 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-combined-ca-bundle\") pod \"142d1e8a-9aac-4c34-9301-1e069919fe82\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.527469 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-db-sync-config-data\") pod \"142d1e8a-9aac-4c34-9301-1e069919fe82\" (UID: \"142d1e8a-9aac-4c34-9301-1e069919fe82\") " Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.532735 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/142d1e8a-9aac-4c34-9301-1e069919fe82-kube-api-access-fkqzm" (OuterVolumeSpecName: "kube-api-access-fkqzm") pod "142d1e8a-9aac-4c34-9301-1e069919fe82" (UID: "142d1e8a-9aac-4c34-9301-1e069919fe82"). InnerVolumeSpecName "kube-api-access-fkqzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.537424 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "142d1e8a-9aac-4c34-9301-1e069919fe82" (UID: "142d1e8a-9aac-4c34-9301-1e069919fe82"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.585596 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "142d1e8a-9aac-4c34-9301-1e069919fe82" (UID: "142d1e8a-9aac-4c34-9301-1e069919fe82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.626910 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-config-data" (OuterVolumeSpecName: "config-data") pod "142d1e8a-9aac-4c34-9301-1e069919fe82" (UID: "142d1e8a-9aac-4c34-9301-1e069919fe82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.629039 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.629073 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkqzm\" (UniqueName: \"kubernetes.io/projected/142d1e8a-9aac-4c34-9301-1e069919fe82-kube-api-access-fkqzm\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.629090 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:10 crc kubenswrapper[4743]: I1122 08:42:10.629103 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/142d1e8a-9aac-4c34-9301-1e069919fe82-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.277061 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-92fnd" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.278022 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-92fnd" event={"ID":"142d1e8a-9aac-4c34-9301-1e069919fe82","Type":"ContainerDied","Data":"ef71d6ca813760641bc61f2304845f56771f83ac11216487448131c8b61e61bc"} Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.278068 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef71d6ca813760641bc61f2304845f56771f83ac11216487448131c8b61e61bc" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.878605 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-jkvtq"] Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.879025 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" podUID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" containerName="dnsmasq-dns" containerID="cri-o://64c8190fa95fba7afe24cc791b5c24ca88455d0f1a977faa91d5054beca8df4c" gracePeriod=10 Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.918411 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l6b4p"] Nov 22 08:42:11 crc kubenswrapper[4743]: E1122 08:42:11.918930 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b921a24-4b07-4d33-ab26-3dd171297e24" containerName="init" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.918954 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b921a24-4b07-4d33-ab26-3dd171297e24" containerName="init" Nov 22 08:42:11 crc kubenswrapper[4743]: E1122 08:42:11.918973 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772ec858-ecf4-4c00-82ab-c7e0ea020070" containerName="init" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.918980 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="772ec858-ecf4-4c00-82ab-c7e0ea020070" containerName="init" Nov 22 08:42:11 crc kubenswrapper[4743]: E1122 08:42:11.919007 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="142d1e8a-9aac-4c34-9301-1e069919fe82" containerName="glance-db-sync" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.919018 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="142d1e8a-9aac-4c34-9301-1e069919fe82" containerName="glance-db-sync" Nov 22 08:42:11 crc kubenswrapper[4743]: E1122 08:42:11.919038 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772ec858-ecf4-4c00-82ab-c7e0ea020070" containerName="dnsmasq-dns" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.919043 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="772ec858-ecf4-4c00-82ab-c7e0ea020070" containerName="dnsmasq-dns" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.919235 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="772ec858-ecf4-4c00-82ab-c7e0ea020070" containerName="dnsmasq-dns" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.919263 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="142d1e8a-9aac-4c34-9301-1e069919fe82" containerName="glance-db-sync" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.919274 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b921a24-4b07-4d33-ab26-3dd171297e24" containerName="init" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.920594 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:11 crc kubenswrapper[4743]: I1122 08:42:11.938843 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l6b4p"] Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.063932 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.063992 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.064013 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-config\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.064039 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.064062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg5nf\" (UniqueName: \"kubernetes.io/projected/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-kube-api-access-qg5nf\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 
08:42:12.064100 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.166013 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.166361 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.166381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-config\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.166405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.166426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg5nf\" (UniqueName: \"kubernetes.io/projected/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-kube-api-access-qg5nf\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.166451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.167278 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.167851 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.168459 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-config\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.168517 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.168728 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.186509 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg5nf\" (UniqueName: \"kubernetes.io/projected/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-kube-api-access-qg5nf\") pod \"dnsmasq-dns-56df8fb6b7-l6b4p\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.267694 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.286637 4743 generic.go:334] "Generic (PLEG): container finished" podID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" containerID="64c8190fa95fba7afe24cc791b5c24ca88455d0f1a977faa91d5054beca8df4c" exitCode=0 Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.286690 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" event={"ID":"9ccd8a3d-0d1b-4334-9808-d636f5c16e42","Type":"ContainerDied","Data":"64c8190fa95fba7afe24cc791b5c24ca88455d0f1a977faa91d5054beca8df4c"} Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.873206 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.875107 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.878194 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.878194 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5qcvv" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.880793 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.893687 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.980022 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-config-data\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.980095 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.980197 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-scripts\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.980226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.980279 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-logs\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.980319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:12 crc kubenswrapper[4743]: I1122 08:42:12.980350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sls6s\" (UniqueName: \"kubernetes.io/projected/e8444058-6228-4dd3-8b5b-25682eae00db-kube-api-access-sls6s\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " 
pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.028735 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.030522 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.034358 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.038568 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082226 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-scripts\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082281 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082345 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-logs\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082411 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082439 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082464 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sls6s\" (UniqueName: \"kubernetes.io/projected/e8444058-6228-4dd3-8b5b-25682eae00db-kube-api-access-sls6s\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082489 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082594 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-config-data\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082634 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082721 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082762 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qjxj\" (UniqueName: \"kubernetes.io/projected/d3a08f25-001e-4124-a726-792cca7ced2b-kube-api-access-2qjxj\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.082941 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.083352 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-logs\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.083774 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.087141 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.096271 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-scripts\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.097178 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-config-data\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.122210 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sls6s\" (UniqueName: \"kubernetes.io/projected/e8444058-6228-4dd3-8b5b-25682eae00db-kube-api-access-sls6s\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.143557 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.184098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.184154 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.184187 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qjxj\" (UniqueName: \"kubernetes.io/projected/d3a08f25-001e-4124-a726-792cca7ced2b-kube-api-access-2qjxj\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.184243 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.184276 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.184299 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.184336 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.184930 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.189657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-logs\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.189955 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.192357 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.192451 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.193378 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.199517 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.204474 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qjxj\" (UniqueName: \"kubernetes.io/projected/d3a08f25-001e-4124-a726-792cca7ced2b-kube-api-access-2qjxj\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.213091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:13 crc kubenswrapper[4743]: I1122 08:42:13.349288 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:14 crc kubenswrapper[4743]: I1122 08:42:14.606770 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:42:14 crc kubenswrapper[4743]: I1122 08:42:14.670450 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:42:17 crc kubenswrapper[4743]: E1122 08:42:17.951157 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Nov 22 08:42:17 crc kubenswrapper[4743]: E1122 08:42:17.951911 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d5h57bh546h684h5d6hf7h5b5h57bh99h68fh65dh56h5d8h8ch669h5c4hb4h65dh59fh67ch8hdch9hf5h95h5dh5f8h78h5ddh65bh587h9cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndxx9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(661914bd-2b43-425b-837a-8c4104173ef4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:42:20 crc kubenswrapper[4743]: I1122 08:42:20.170814 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" podUID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.051780 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.058896 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.103801 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-fernet-keys\") pod \"57220368-3097-46da-8f24-783a8e80327e\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.103858 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-sb\") pod \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.103895 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cmpx\" (UniqueName: \"kubernetes.io/projected/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-kube-api-access-2cmpx\") pod \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.103953 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-nb\") pod \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.105144 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnjsn\" (UniqueName: \"kubernetes.io/projected/57220368-3097-46da-8f24-783a8e80327e-kube-api-access-xnjsn\") pod \"57220368-3097-46da-8f24-783a8e80327e\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.105184 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-scripts\") pod \"57220368-3097-46da-8f24-783a8e80327e\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.105230 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-swift-storage-0\") pod \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.105264 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-config-data\") pod \"57220368-3097-46da-8f24-783a8e80327e\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.105322 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-credential-keys\") pod \"57220368-3097-46da-8f24-783a8e80327e\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.105342 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-combined-ca-bundle\") pod \"57220368-3097-46da-8f24-783a8e80327e\" (UID: \"57220368-3097-46da-8f24-783a8e80327e\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.105491 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-svc\") pod \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.105526 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-config\") pod \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\" (UID: \"9ccd8a3d-0d1b-4334-9808-d636f5c16e42\") " Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.126212 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-scripts" (OuterVolumeSpecName: "scripts") pod "57220368-3097-46da-8f24-783a8e80327e" (UID: "57220368-3097-46da-8f24-783a8e80327e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.137569 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-kube-api-access-2cmpx" (OuterVolumeSpecName: "kube-api-access-2cmpx") pod "9ccd8a3d-0d1b-4334-9808-d636f5c16e42" (UID: "9ccd8a3d-0d1b-4334-9808-d636f5c16e42"). InnerVolumeSpecName "kube-api-access-2cmpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.138092 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "57220368-3097-46da-8f24-783a8e80327e" (UID: "57220368-3097-46da-8f24-783a8e80327e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.142826 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "57220368-3097-46da-8f24-783a8e80327e" (UID: "57220368-3097-46da-8f24-783a8e80327e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.147426 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57220368-3097-46da-8f24-783a8e80327e-kube-api-access-xnjsn" (OuterVolumeSpecName: "kube-api-access-xnjsn") pod "57220368-3097-46da-8f24-783a8e80327e" (UID: "57220368-3097-46da-8f24-783a8e80327e"). InnerVolumeSpecName "kube-api-access-xnjsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.158090 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57220368-3097-46da-8f24-783a8e80327e" (UID: "57220368-3097-46da-8f24-783a8e80327e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.171368 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" podUID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.174873 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ccd8a3d-0d1b-4334-9808-d636f5c16e42" (UID: "9ccd8a3d-0d1b-4334-9808-d636f5c16e42"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.176852 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9ccd8a3d-0d1b-4334-9808-d636f5c16e42" (UID: "9ccd8a3d-0d1b-4334-9808-d636f5c16e42"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.177993 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ccd8a3d-0d1b-4334-9808-d636f5c16e42" (UID: "9ccd8a3d-0d1b-4334-9808-d636f5c16e42"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.180229 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-config-data" (OuterVolumeSpecName: "config-data") pod "57220368-3097-46da-8f24-783a8e80327e" (UID: "57220368-3097-46da-8f24-783a8e80327e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.182717 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-config" (OuterVolumeSpecName: "config") pod "9ccd8a3d-0d1b-4334-9808-d636f5c16e42" (UID: "9ccd8a3d-0d1b-4334-9808-d636f5c16e42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.204096 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ccd8a3d-0d1b-4334-9808-d636f5c16e42" (UID: "9ccd8a3d-0d1b-4334-9808-d636f5c16e42"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210060 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210115 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210127 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cmpx\" (UniqueName: \"kubernetes.io/projected/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-kube-api-access-2cmpx\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210141 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210151 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnjsn\" (UniqueName: \"kubernetes.io/projected/57220368-3097-46da-8f24-783a8e80327e-kube-api-access-xnjsn\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210162 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210172 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210183 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210195 
4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210208 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57220368-3097-46da-8f24-783a8e80327e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210221 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.210233 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ccd8a3d-0d1b-4334-9808-d636f5c16e42-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.413707 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7cfkq" event={"ID":"57220368-3097-46da-8f24-783a8e80327e","Type":"ContainerDied","Data":"0372fc10f436e238097cabdfce414ed349d0c8176301a5fef76c8fb8cf96abc9"} Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.413746 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0372fc10f436e238097cabdfce414ed349d0c8176301a5fef76c8fb8cf96abc9" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.413787 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7cfkq" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.416045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" event={"ID":"9ccd8a3d-0d1b-4334-9808-d636f5c16e42","Type":"ContainerDied","Data":"340ec3b21dec3b3593bf13421c2c23a326cd140b7c27ad5d9ac828cf097b1542"} Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.416097 4743 scope.go:117] "RemoveContainer" containerID="64c8190fa95fba7afe24cc791b5c24ca88455d0f1a977faa91d5054beca8df4c" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.416106 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cf78879c9-jkvtq" Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.446226 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-jkvtq"] Nov 22 08:42:25 crc kubenswrapper[4743]: I1122 08:42:25.452473 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cf78879c9-jkvtq"] Nov 22 08:42:25 crc kubenswrapper[4743]: E1122 08:42:25.554966 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Nov 22 08:42:25 crc kubenswrapper[4743]: E1122 08:42:25.555153 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-86k8q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-m9jrr_openstack(6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:42:25 crc kubenswrapper[4743]: E1122 08:42:25.556504 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-m9jrr" podUID="6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.164824 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7cfkq"] Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.172295 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7cfkq"] Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.275114 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sz5kf"] Nov 22 08:42:26 crc kubenswrapper[4743]: E1122 08:42:26.275656 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="57220368-3097-46da-8f24-783a8e80327e" containerName="keystone-bootstrap" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.275752 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="57220368-3097-46da-8f24-783a8e80327e" containerName="keystone-bootstrap" Nov 22 08:42:26 crc kubenswrapper[4743]: E1122 08:42:26.275836 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" containerName="dnsmasq-dns" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.275909 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" containerName="dnsmasq-dns" Nov 22 08:42:26 crc kubenswrapper[4743]: E1122 08:42:26.275985 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" containerName="init" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.276050 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" containerName="init" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.276265 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" containerName="dnsmasq-dns" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.276330 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="57220368-3097-46da-8f24-783a8e80327e" containerName="keystone-bootstrap" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.276951 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.279825 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.280071 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.280185 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.280288 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-vtjvr" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.282386 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.294325 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sz5kf"] Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.329820 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22xq6\" (UniqueName: \"kubernetes.io/projected/2e8c36ca-0f50-4702-ad97-b1956797b4ab-kube-api-access-22xq6\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.329868 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-fernet-keys\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.329932 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-scripts\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.329962 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-config-data\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.329992 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-credential-keys\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.330026 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-combined-ca-bundle\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: E1122 08:42:26.427000 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-m9jrr" podUID="6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.432126 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-scripts\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.432210 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-config-data\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.432254 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-credential-keys\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.432304 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-combined-ca-bundle\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.432650 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22xq6\" 
(UniqueName: \"kubernetes.io/projected/2e8c36ca-0f50-4702-ad97-b1956797b4ab-kube-api-access-22xq6\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.432813 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-fernet-keys\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.438912 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-scripts\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.439472 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-credential-keys\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.440571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-fernet-keys\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.443733 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-config-data\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.452104 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-combined-ca-bundle\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.452915 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22xq6\" (UniqueName: \"kubernetes.io/projected/2e8c36ca-0f50-4702-ad97-b1956797b4ab-kube-api-access-22xq6\") pod \"keystone-bootstrap-sz5kf\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.603755 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:26 crc kubenswrapper[4743]: I1122 08:42:26.734844 4743 scope.go:117] "RemoveContainer" containerID="168effbbeb7e147054c39cabb5636edda277ef8d9ff9db122b31d231ddda70e4" Nov 22 08:42:26 crc kubenswrapper[4743]: E1122 08:42:26.755410 4743 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Nov 22 08:42:26 crc kubenswrapper[4743]: E1122 08:42:26.755616 4743 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98hsz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-sj8hg_openstack(a87658ca-ad68-4136-82dd-14201100b4ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Nov 22 08:42:26 crc kubenswrapper[4743]: E1122 08:42:26.756862 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-sj8hg" 
podUID="a87658ca-ad68-4136-82dd-14201100b4ea" Nov 22 08:42:27 crc kubenswrapper[4743]: I1122 08:42:27.190460 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57220368-3097-46da-8f24-783a8e80327e" path="/var/lib/kubelet/pods/57220368-3097-46da-8f24-783a8e80327e/volumes" Nov 22 08:42:27 crc kubenswrapper[4743]: I1122 08:42:27.192278 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ccd8a3d-0d1b-4334-9808-d636f5c16e42" path="/var/lib/kubelet/pods/9ccd8a3d-0d1b-4334-9808-d636f5c16e42/volumes" Nov 22 08:42:27 crc kubenswrapper[4743]: I1122 08:42:27.207861 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l6b4p"] Nov 22 08:42:27 crc kubenswrapper[4743]: I1122 08:42:27.452866 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661914bd-2b43-425b-837a-8c4104173ef4","Type":"ContainerStarted","Data":"7d9d7148adb064d5322e5ea819f5b41acd751275e9cf1ce1f5f112ec31fb9dcd"} Nov 22 08:42:27 crc kubenswrapper[4743]: I1122 08:42:27.455144 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n2d6p" event={"ID":"c5a99c13-319a-4df1-8061-8bb20463cd73","Type":"ContainerStarted","Data":"849fd71ccf74f0eb1a2e281f40b05417d8904c997955819096f6193efd1ab257"} Nov 22 08:42:27 crc kubenswrapper[4743]: I1122 08:42:27.463414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" event={"ID":"e7744a72-96d3-43bf-89f3-c56ae2a47cdf","Type":"ContainerStarted","Data":"3c486e2934e50b55097a78c1d13a3a8890a5652e0f8cb4233ea0d3f038a7f888"} Nov 22 08:42:27 crc kubenswrapper[4743]: E1122 08:42:27.465341 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-sj8hg" podUID="a87658ca-ad68-4136-82dd-14201100b4ea" Nov 22 08:42:27 crc kubenswrapper[4743]: I1122 08:42:27.479379 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-n2d6p" podStartSLOduration=2.642662838 podStartE2EDuration="23.479364227s" podCreationTimestamp="2025-11-22 08:42:04 +0000 UTC" firstStartedPulling="2025-11-22 08:42:05.856144925 +0000 UTC m=+1199.562505977" lastFinishedPulling="2025-11-22 08:42:26.692846314 +0000 UTC m=+1220.399207366" observedRunningTime="2025-11-22 08:42:27.475008851 +0000 UTC m=+1221.181369903" watchObservedRunningTime="2025-11-22 08:42:27.479364227 +0000 UTC m=+1221.185725279" Nov 22 08:42:27 crc kubenswrapper[4743]: I1122 08:42:27.625208 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sz5kf"] Nov 22 08:42:27 crc kubenswrapper[4743]: W1122 08:42:27.640368 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8444058_6228_4dd3_8b5b_25682eae00db.slice/crio-e6e74885106df00c7cfee5fce58136e0c5be59af52a61b52381b1a877eb74be9 WatchSource:0}: Error finding container e6e74885106df00c7cfee5fce58136e0c5be59af52a61b52381b1a877eb74be9: Status 404 returned error can't find the container with id e6e74885106df00c7cfee5fce58136e0c5be59af52a61b52381b1a877eb74be9 Nov 22 08:42:27 crc kubenswrapper[4743]: I1122 08:42:27.651024 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:42:28 crc kubenswrapper[4743]: 
I1122 08:42:28.336904 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:42:28 crc kubenswrapper[4743]: W1122 08:42:28.350031 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a08f25_001e_4124_a726_792cca7ced2b.slice/crio-0dcebd7318cb9780427ce9f548d969acf37b844fd8fe3ff66a7ad11a2aa71be5 WatchSource:0}: Error finding container 0dcebd7318cb9780427ce9f548d969acf37b844fd8fe3ff66a7ad11a2aa71be5: Status 404 returned error can't find the container with id 0dcebd7318cb9780427ce9f548d969acf37b844fd8fe3ff66a7ad11a2aa71be5 Nov 22 08:42:28 crc kubenswrapper[4743]: I1122 08:42:28.474163 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8444058-6228-4dd3-8b5b-25682eae00db","Type":"ContainerStarted","Data":"7a6fcce36b02703651f0faf02cf2925a1f6fbac5a0367fdb6f520f490c9d7ac0"} Nov 22 08:42:28 crc kubenswrapper[4743]: I1122 08:42:28.474212 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8444058-6228-4dd3-8b5b-25682eae00db","Type":"ContainerStarted","Data":"e6e74885106df00c7cfee5fce58136e0c5be59af52a61b52381b1a877eb74be9"} Nov 22 08:42:28 crc kubenswrapper[4743]: I1122 08:42:28.475921 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d3a08f25-001e-4124-a726-792cca7ced2b","Type":"ContainerStarted","Data":"0dcebd7318cb9780427ce9f548d969acf37b844fd8fe3ff66a7ad11a2aa71be5"} Nov 22 08:42:28 crc kubenswrapper[4743]: I1122 08:42:28.485426 4743 generic.go:334] "Generic (PLEG): container finished" podID="e7744a72-96d3-43bf-89f3-c56ae2a47cdf" containerID="61ff329d47b92a7e6d486e0d734bc9efe3bd93906e103e58e12a058cb93825e8" exitCode=0 Nov 22 08:42:28 crc kubenswrapper[4743]: I1122 08:42:28.485657 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" event={"ID":"e7744a72-96d3-43bf-89f3-c56ae2a47cdf","Type":"ContainerDied","Data":"61ff329d47b92a7e6d486e0d734bc9efe3bd93906e103e58e12a058cb93825e8"} Nov 22 08:42:28 crc kubenswrapper[4743]: I1122 08:42:28.491828 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sz5kf" event={"ID":"2e8c36ca-0f50-4702-ad97-b1956797b4ab","Type":"ContainerStarted","Data":"5b088da74bbd6737792507a6ad04b9744e4130e90691d8f27ec2c054e8c7fc8d"} Nov 22 08:42:28 crc kubenswrapper[4743]: I1122 08:42:28.491878 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sz5kf" event={"ID":"2e8c36ca-0f50-4702-ad97-b1956797b4ab","Type":"ContainerStarted","Data":"229b214f5f014a6d560b0312193d45b385bae48daf36e3a168e7e1e0abba599c"} Nov 22 08:42:28 crc kubenswrapper[4743]: I1122 08:42:28.527675 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sz5kf" podStartSLOduration=2.527653902 podStartE2EDuration="2.527653902s" podCreationTimestamp="2025-11-22 08:42:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:28.524395117 +0000 UTC m=+1222.230756189" watchObservedRunningTime="2025-11-22 08:42:28.527653902 +0000 UTC m=+1222.234014964" Nov 22 08:42:29 crc kubenswrapper[4743]: I1122 08:42:29.508680 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"d3a08f25-001e-4124-a726-792cca7ced2b","Type":"ContainerStarted","Data":"4843acc76f50f57a650fe5e08719a928badcb126df600df940abd19e6c6cf682"} Nov 22 08:42:29 crc kubenswrapper[4743]: I1122 08:42:29.512637 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" event={"ID":"e7744a72-96d3-43bf-89f3-c56ae2a47cdf","Type":"ContainerStarted","Data":"ed1bd93819af5eff5fb7ea42bdeb4f410e196149a63cb232863696e0b6a769c3"} Nov 22 08:42:29 crc kubenswrapper[4743]: I1122 08:42:29.512775 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:29 crc kubenswrapper[4743]: I1122 08:42:29.516728 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e8444058-6228-4dd3-8b5b-25682eae00db" containerName="glance-log" containerID="cri-o://7a6fcce36b02703651f0faf02cf2925a1f6fbac5a0367fdb6f520f490c9d7ac0" gracePeriod=30 Nov 22 08:42:29 crc kubenswrapper[4743]: I1122 08:42:29.517189 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8444058-6228-4dd3-8b5b-25682eae00db","Type":"ContainerStarted","Data":"118ea83c0e3fa2140b6e47b0be03e5270e8d068d58a0ea6858aedf8bea38d363"} Nov 22 08:42:29 crc kubenswrapper[4743]: I1122 08:42:29.517260 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e8444058-6228-4dd3-8b5b-25682eae00db" containerName="glance-httpd" containerID="cri-o://118ea83c0e3fa2140b6e47b0be03e5270e8d068d58a0ea6858aedf8bea38d363" gracePeriod=30 Nov 22 08:42:29 crc kubenswrapper[4743]: I1122 08:42:29.535564 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" podStartSLOduration=18.535539691 podStartE2EDuration="18.535539691s" podCreationTimestamp="2025-11-22 08:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:29.531881006 +0000 UTC m=+1223.238242068" watchObservedRunningTime="2025-11-22 08:42:29.535539691 +0000 UTC m=+1223.241900743" Nov 22 08:42:29 crc kubenswrapper[4743]: I1122 08:42:29.572653 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=18.572635122 podStartE2EDuration="18.572635122s" podCreationTimestamp="2025-11-22 08:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:29.564403654 +0000 UTC m=+1223.270764816" watchObservedRunningTime="2025-11-22 08:42:29.572635122 +0000 UTC m=+1223.278996164" Nov 22 08:42:30 crc kubenswrapper[4743]: I1122 08:42:30.525242 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d3a08f25-001e-4124-a726-792cca7ced2b","Type":"ContainerStarted","Data":"030003d30340e12737bd048e585ac97af4ab119fcaf577264f04dfc26290b0ab"} Nov 22 08:42:30 crc kubenswrapper[4743]: I1122 08:42:30.525345 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d3a08f25-001e-4124-a726-792cca7ced2b" containerName="glance-log" containerID="cri-o://4843acc76f50f57a650fe5e08719a928badcb126df600df940abd19e6c6cf682" gracePeriod=30 Nov 22 08:42:30 crc kubenswrapper[4743]: I1122 
08:42:30.525377 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d3a08f25-001e-4124-a726-792cca7ced2b" containerName="glance-httpd" containerID="cri-o://030003d30340e12737bd048e585ac97af4ab119fcaf577264f04dfc26290b0ab" gracePeriod=30 Nov 22 08:42:30 crc kubenswrapper[4743]: I1122 08:42:30.527839 4743 generic.go:334] "Generic (PLEG): container finished" podID="e8444058-6228-4dd3-8b5b-25682eae00db" containerID="118ea83c0e3fa2140b6e47b0be03e5270e8d068d58a0ea6858aedf8bea38d363" exitCode=0 Nov 22 08:42:30 crc kubenswrapper[4743]: I1122 08:42:30.527864 4743 generic.go:334] "Generic (PLEG): container finished" podID="e8444058-6228-4dd3-8b5b-25682eae00db" containerID="7a6fcce36b02703651f0faf02cf2925a1f6fbac5a0367fdb6f520f490c9d7ac0" exitCode=143 Nov 22 08:42:30 crc kubenswrapper[4743]: I1122 08:42:30.528110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8444058-6228-4dd3-8b5b-25682eae00db","Type":"ContainerDied","Data":"118ea83c0e3fa2140b6e47b0be03e5270e8d068d58a0ea6858aedf8bea38d363"} Nov 22 08:42:30 crc kubenswrapper[4743]: I1122 08:42:30.528155 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8444058-6228-4dd3-8b5b-25682eae00db","Type":"ContainerDied","Data":"7a6fcce36b02703651f0faf02cf2925a1f6fbac5a0367fdb6f520f490c9d7ac0"} Nov 22 08:42:30 crc kubenswrapper[4743]: I1122 08:42:30.551155 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=19.551134273 podStartE2EDuration="19.551134273s" podCreationTimestamp="2025-11-22 08:42:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:30.544984816 +0000 UTC m=+1224.251345888" watchObservedRunningTime="2025-11-22 08:42:30.551134273 +0000 UTC m=+1224.257495325" Nov 22 08:42:31 crc kubenswrapper[4743]: I1122 08:42:31.241688 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:42:31 crc kubenswrapper[4743]: I1122 08:42:31.241800 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:42:31 crc kubenswrapper[4743]: I1122 08:42:31.541744 4743 generic.go:334] "Generic (PLEG): container finished" podID="d3a08f25-001e-4124-a726-792cca7ced2b" containerID="030003d30340e12737bd048e585ac97af4ab119fcaf577264f04dfc26290b0ab" exitCode=0 Nov 22 08:42:31 crc kubenswrapper[4743]: I1122 08:42:31.542035 4743 generic.go:334] "Generic (PLEG): container finished" podID="d3a08f25-001e-4124-a726-792cca7ced2b" containerID="4843acc76f50f57a650fe5e08719a928badcb126df600df940abd19e6c6cf682" exitCode=143 Nov 22 08:42:31 crc kubenswrapper[4743]: I1122 08:42:31.541822 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"d3a08f25-001e-4124-a726-792cca7ced2b","Type":"ContainerDied","Data":"030003d30340e12737bd048e585ac97af4ab119fcaf577264f04dfc26290b0ab"} Nov 22 08:42:31 crc kubenswrapper[4743]: I1122 08:42:31.542069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d3a08f25-001e-4124-a726-792cca7ced2b","Type":"ContainerDied","Data":"4843acc76f50f57a650fe5e08719a928badcb126df600df940abd19e6c6cf682"} Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.553428 4743 generic.go:334] "Generic (PLEG): container finished" podID="2e8c36ca-0f50-4702-ad97-b1956797b4ab" containerID="5b088da74bbd6737792507a6ad04b9744e4130e90691d8f27ec2c054e8c7fc8d" exitCode=0 Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.553513 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sz5kf" event={"ID":"2e8c36ca-0f50-4702-ad97-b1956797b4ab","Type":"ContainerDied","Data":"5b088da74bbd6737792507a6ad04b9744e4130e90691d8f27ec2c054e8c7fc8d"} Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.557664 4743 generic.go:334] "Generic (PLEG): container finished" podID="c5a99c13-319a-4df1-8061-8bb20463cd73" containerID="849fd71ccf74f0eb1a2e281f40b05417d8904c997955819096f6193efd1ab257" exitCode=0 Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.557736 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n2d6p" event={"ID":"c5a99c13-319a-4df1-8061-8bb20463cd73","Type":"ContainerDied","Data":"849fd71ccf74f0eb1a2e281f40b05417d8904c997955819096f6193efd1ab257"} Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.852449 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.899040 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.963798 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-httpd-run\") pod \"d3a08f25-001e-4124-a726-792cca7ced2b\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.963859 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sls6s\" (UniqueName: \"kubernetes.io/projected/e8444058-6228-4dd3-8b5b-25682eae00db-kube-api-access-sls6s\") pod \"e8444058-6228-4dd3-8b5b-25682eae00db\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.963887 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-logs\") pod \"e8444058-6228-4dd3-8b5b-25682eae00db\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.963916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-combined-ca-bundle\") pod \"d3a08f25-001e-4124-a726-792cca7ced2b\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.963950 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-logs\") pod \"d3a08f25-001e-4124-a726-792cca7ced2b\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.964010 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-httpd-run\") pod \"e8444058-6228-4dd3-8b5b-25682eae00db\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.964041 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-config-data\") pod \"d3a08f25-001e-4124-a726-792cca7ced2b\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.964067 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-scripts\") pod \"d3a08f25-001e-4124-a726-792cca7ced2b\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.964098 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e8444058-6228-4dd3-8b5b-25682eae00db\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.964158 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-scripts\") pod \"e8444058-6228-4dd3-8b5b-25682eae00db\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.964196 
4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-config-data\") pod \"e8444058-6228-4dd3-8b5b-25682eae00db\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.964219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-combined-ca-bundle\") pod \"e8444058-6228-4dd3-8b5b-25682eae00db\" (UID: \"e8444058-6228-4dd3-8b5b-25682eae00db\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.964282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qjxj\" (UniqueName: \"kubernetes.io/projected/d3a08f25-001e-4124-a726-792cca7ced2b-kube-api-access-2qjxj\") pod \"d3a08f25-001e-4124-a726-792cca7ced2b\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.964320 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d3a08f25-001e-4124-a726-792cca7ced2b\" (UID: \"d3a08f25-001e-4124-a726-792cca7ced2b\") " Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.965914 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-logs" (OuterVolumeSpecName: "logs") pod "d3a08f25-001e-4124-a726-792cca7ced2b" (UID: "d3a08f25-001e-4124-a726-792cca7ced2b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.965930 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e8444058-6228-4dd3-8b5b-25682eae00db" (UID: "e8444058-6228-4dd3-8b5b-25682eae00db"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.965963 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-logs" (OuterVolumeSpecName: "logs") pod "e8444058-6228-4dd3-8b5b-25682eae00db" (UID: "e8444058-6228-4dd3-8b5b-25682eae00db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.966025 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d3a08f25-001e-4124-a726-792cca7ced2b" (UID: "d3a08f25-001e-4124-a726-792cca7ced2b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.973168 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-scripts" (OuterVolumeSpecName: "scripts") pod "e8444058-6228-4dd3-8b5b-25682eae00db" (UID: "e8444058-6228-4dd3-8b5b-25682eae00db"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.973207 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8444058-6228-4dd3-8b5b-25682eae00db-kube-api-access-sls6s" (OuterVolumeSpecName: "kube-api-access-sls6s") pod "e8444058-6228-4dd3-8b5b-25682eae00db" (UID: "e8444058-6228-4dd3-8b5b-25682eae00db"). InnerVolumeSpecName "kube-api-access-sls6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.973221 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "d3a08f25-001e-4124-a726-792cca7ced2b" (UID: "d3a08f25-001e-4124-a726-792cca7ced2b"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.973283 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a08f25-001e-4124-a726-792cca7ced2b-kube-api-access-2qjxj" (OuterVolumeSpecName: "kube-api-access-2qjxj") pod "d3a08f25-001e-4124-a726-792cca7ced2b" (UID: "d3a08f25-001e-4124-a726-792cca7ced2b"). InnerVolumeSpecName "kube-api-access-2qjxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.974257 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e8444058-6228-4dd3-8b5b-25682eae00db" (UID: "e8444058-6228-4dd3-8b5b-25682eae00db"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.974723 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-scripts" (OuterVolumeSpecName: "scripts") pod "d3a08f25-001e-4124-a726-792cca7ced2b" (UID: "d3a08f25-001e-4124-a726-792cca7ced2b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:32 crc kubenswrapper[4743]: I1122 08:42:32.994916 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8444058-6228-4dd3-8b5b-25682eae00db" (UID: "e8444058-6228-4dd3-8b5b-25682eae00db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.004295 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3a08f25-001e-4124-a726-792cca7ced2b" (UID: "d3a08f25-001e-4124-a726-792cca7ced2b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.024675 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-config-data" (OuterVolumeSpecName: "config-data") pod "d3a08f25-001e-4124-a726-792cca7ced2b" (UID: "d3a08f25-001e-4124-a726-792cca7ced2b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.025567 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-config-data" (OuterVolumeSpecName: "config-data") pod "e8444058-6228-4dd3-8b5b-25682eae00db" (UID: "e8444058-6228-4dd3-8b5b-25682eae00db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066132 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066178 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066191 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8444058-6228-4dd3-8b5b-25682eae00db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066204 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qjxj\" (UniqueName: \"kubernetes.io/projected/d3a08f25-001e-4124-a726-792cca7ced2b-kube-api-access-2qjxj\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066242 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066254 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066264 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sls6s\" (UniqueName: \"kubernetes.io/projected/e8444058-6228-4dd3-8b5b-25682eae00db-kube-api-access-sls6s\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066275 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066284 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066293 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3a08f25-001e-4124-a726-792cca7ced2b-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066303 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8444058-6228-4dd3-8b5b-25682eae00db-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066312 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066321 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3a08f25-001e-4124-a726-792cca7ced2b-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.066335 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.083668 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.083858 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.167669 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.167743 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.570258 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d3a08f25-001e-4124-a726-792cca7ced2b","Type":"ContainerDied","Data":"0dcebd7318cb9780427ce9f548d969acf37b844fd8fe3ff66a7ad11a2aa71be5"} Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.570315 4743 scope.go:117] "RemoveContainer" containerID="030003d30340e12737bd048e585ac97af4ab119fcaf577264f04dfc26290b0ab" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.570275 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.575400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661914bd-2b43-425b-837a-8c4104173ef4","Type":"ContainerStarted","Data":"ca77000a763c249391614d5f690fb6b5a8606358f4a791b769a787874274d6e9"} Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.582851 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.583301 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e8444058-6228-4dd3-8b5b-25682eae00db","Type":"ContainerDied","Data":"e6e74885106df00c7cfee5fce58136e0c5be59af52a61b52381b1a877eb74be9"} Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.607441 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.613466 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.622200 4743 scope.go:117] "RemoveContainer" containerID="4843acc76f50f57a650fe5e08719a928badcb126df600df940abd19e6c6cf682" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.642137 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:42:33 crc kubenswrapper[4743]: E1122 08:42:33.642620 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a08f25-001e-4124-a726-792cca7ced2b" containerName="glance-log" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.642644 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a08f25-001e-4124-a726-792cca7ced2b" containerName="glance-log" Nov 22 08:42:33 crc kubenswrapper[4743]: E1122 08:42:33.642663 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8444058-6228-4dd3-8b5b-25682eae00db" containerName="glance-log" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.642669 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8444058-6228-4dd3-8b5b-25682eae00db" containerName="glance-log" Nov 22 08:42:33 crc kubenswrapper[4743]: E1122 08:42:33.642682 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8444058-6228-4dd3-8b5b-25682eae00db" containerName="glance-httpd" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.642690 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8444058-6228-4dd3-8b5b-25682eae00db" containerName="glance-httpd" Nov 22 08:42:33 crc kubenswrapper[4743]: E1122 08:42:33.642716 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a08f25-001e-4124-a726-792cca7ced2b" containerName="glance-httpd" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.642727 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a08f25-001e-4124-a726-792cca7ced2b" containerName="glance-httpd" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.642916 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8444058-6228-4dd3-8b5b-25682eae00db" containerName="glance-log" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.642933 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a08f25-001e-4124-a726-792cca7ced2b" containerName="glance-log" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.642951 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a08f25-001e-4124-a726-792cca7ced2b" containerName="glance-httpd" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.642962 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8444058-6228-4dd3-8b5b-25682eae00db" containerName="glance-httpd" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.643887 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.651428 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.651870 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.652156 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.652359 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5qcvv" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.664333 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.674653 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.677239 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.677294 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.677368 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdnp\" (UniqueName: \"kubernetes.io/projected/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-kube-api-access-7vdnp\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.677408 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.677460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.677516 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc 
kubenswrapper[4743]: I1122 08:42:33.677556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.677664 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.686421 4743 scope.go:117] "RemoveContainer" containerID="118ea83c0e3fa2140b6e47b0be03e5270e8d068d58a0ea6858aedf8bea38d363" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.689693 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.709947 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.711938 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.716759 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.719729 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.720325 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.769226 4743 scope.go:117] "RemoveContainer" containerID="7a6fcce36b02703651f0faf02cf2925a1f6fbac5a0367fdb6f520f490c9d7ac0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.781472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.781644 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.781962 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vdnp\" (UniqueName: \"kubernetes.io/projected/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-kube-api-access-7vdnp\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.782113 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.782408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.782497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.782538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.782711 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.783581 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.784413 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.793953 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.803000 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.805603 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdnp\" (UniqueName: \"kubernetes.io/projected/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-kube-api-access-7vdnp\") pod 
\"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.820154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.822969 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.827477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.884538 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sldp4\" (UniqueName: \"kubernetes.io/projected/f3494746-cd7f-4497-b123-6ca7196e6480-kube-api-access-sldp4\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.884679 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.884718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.884857 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.884895 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.884962 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-logs\") pod 
\"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.885004 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.885031 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.918296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.965706 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.986108 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.986154 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-config-data\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.986181 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-logs\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.986203 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.986221 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.986261 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sldp4\" 
(UniqueName: \"kubernetes.io/projected/f3494746-cd7f-4497-b123-6ca7196e6480-kube-api-access-sldp4\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.986296 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.986314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.987107 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.987108 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Nov 22 08:42:33 crc kubenswrapper[4743]: I1122 08:42:33.989177 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-logs\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.001525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.001534 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.001668 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-scripts\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.001748 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.009674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sldp4\" (UniqueName: \"kubernetes.io/projected/f3494746-cd7f-4497-b123-6ca7196e6480-kube-api-access-sldp4\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.015357 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " pod="openstack/glance-default-external-api-0" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.054038 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.523452 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.652901 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-n2d6p" event={"ID":"c5a99c13-319a-4df1-8061-8bb20463cd73","Type":"ContainerDied","Data":"50f89b58cd2780bf4ddf2de8c19039079de19896e31192e383acd335fa63949e"} Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.652978 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50f89b58cd2780bf4ddf2de8c19039079de19896e31192e383acd335fa63949e" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.678921 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sz5kf" event={"ID":"2e8c36ca-0f50-4702-ad97-b1956797b4ab","Type":"ContainerDied","Data":"229b214f5f014a6d560b0312193d45b385bae48daf36e3a168e7e1e0abba599c"} Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.678962 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="229b214f5f014a6d560b0312193d45b385bae48daf36e3a168e7e1e0abba599c" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.679079 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sz5kf" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.681923 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.707798 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-fernet-keys\") pod \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.707856 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-combined-ca-bundle\") pod \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.707928 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-scripts\") pod \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.707965 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-config-data\") pod \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.707979 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-credential-keys\") pod \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.708004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22xq6\" (UniqueName: \"kubernetes.io/projected/2e8c36ca-0f50-4702-ad97-b1956797b4ab-kube-api-access-22xq6\") pod \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\" (UID: \"2e8c36ca-0f50-4702-ad97-b1956797b4ab\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.729799 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2e8c36ca-0f50-4702-ad97-b1956797b4ab" (UID: "2e8c36ca-0f50-4702-ad97-b1956797b4ab"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.733649 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2e8c36ca-0f50-4702-ad97-b1956797b4ab" (UID: "2e8c36ca-0f50-4702-ad97-b1956797b4ab"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.743655 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8c36ca-0f50-4702-ad97-b1956797b4ab-kube-api-access-22xq6" (OuterVolumeSpecName: "kube-api-access-22xq6") pod "2e8c36ca-0f50-4702-ad97-b1956797b4ab" (UID: "2e8c36ca-0f50-4702-ad97-b1956797b4ab"). InnerVolumeSpecName "kube-api-access-22xq6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.748194 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-scripts" (OuterVolumeSpecName: "scripts") pod "2e8c36ca-0f50-4702-ad97-b1956797b4ab" (UID: "2e8c36ca-0f50-4702-ad97-b1956797b4ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.796417 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-config-data" (OuterVolumeSpecName: "config-data") pod "2e8c36ca-0f50-4702-ad97-b1956797b4ab" (UID: "2e8c36ca-0f50-4702-ad97-b1956797b4ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.808935 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-scripts\") pod \"c5a99c13-319a-4df1-8061-8bb20463cd73\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.811002 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbc4s\" (UniqueName: \"kubernetes.io/projected/c5a99c13-319a-4df1-8061-8bb20463cd73-kube-api-access-tbc4s\") pod \"c5a99c13-319a-4df1-8061-8bb20463cd73\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.811215 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5a99c13-319a-4df1-8061-8bb20463cd73-logs\") pod \"c5a99c13-319a-4df1-8061-8bb20463cd73\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.811324 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-config-data\") pod \"c5a99c13-319a-4df1-8061-8bb20463cd73\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.811621 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-combined-ca-bundle\") pod \"c5a99c13-319a-4df1-8061-8bb20463cd73\" (UID: \"c5a99c13-319a-4df1-8061-8bb20463cd73\") " Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.812054 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5a99c13-319a-4df1-8061-8bb20463cd73-logs" (OuterVolumeSpecName: "logs") pod "c5a99c13-319a-4df1-8061-8bb20463cd73" (UID: "c5a99c13-319a-4df1-8061-8bb20463cd73"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.817022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e8c36ca-0f50-4702-ad97-b1956797b4ab" (UID: "2e8c36ca-0f50-4702-ad97-b1956797b4ab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.818012 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.818317 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.818400 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.818470 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.818547 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2e8c36ca-0f50-4702-ad97-b1956797b4ab-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.818789 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5a99c13-319a-4df1-8061-8bb20463cd73-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.818881 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22xq6\" (UniqueName: \"kubernetes.io/projected/2e8c36ca-0f50-4702-ad97-b1956797b4ab-kube-api-access-22xq6\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.847866 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-scripts" (OuterVolumeSpecName: "scripts") pod "c5a99c13-319a-4df1-8061-8bb20463cd73" (UID: "c5a99c13-319a-4df1-8061-8bb20463cd73"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.848307 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-66c4f7f76d-b9q4p"] Nov 22 08:42:34 crc kubenswrapper[4743]: E1122 08:42:34.848652 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8c36ca-0f50-4702-ad97-b1956797b4ab" containerName="keystone-bootstrap" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.848665 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8c36ca-0f50-4702-ad97-b1956797b4ab" containerName="keystone-bootstrap" Nov 22 08:42:34 crc kubenswrapper[4743]: E1122 08:42:34.848682 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5a99c13-319a-4df1-8061-8bb20463cd73" containerName="placement-db-sync" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.848689 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5a99c13-319a-4df1-8061-8bb20463cd73" containerName="placement-db-sync" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.848870 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5a99c13-319a-4df1-8061-8bb20463cd73" containerName="placement-db-sync" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.848887 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8c36ca-0f50-4702-ad97-b1956797b4ab" containerName="keystone-bootstrap" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.849443 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.855796 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5a99c13-319a-4df1-8061-8bb20463cd73-kube-api-access-tbc4s" (OuterVolumeSpecName: "kube-api-access-tbc4s") pod "c5a99c13-319a-4df1-8061-8bb20463cd73" (UID: "c5a99c13-319a-4df1-8061-8bb20463cd73"). InnerVolumeSpecName "kube-api-access-tbc4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.855993 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.856504 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.882169 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-config-data" (OuterVolumeSpecName: "config-data") pod "c5a99c13-319a-4df1-8061-8bb20463cd73" (UID: "c5a99c13-319a-4df1-8061-8bb20463cd73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.883634 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-66c4f7f76d-b9q4p"] Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.890739 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5a99c13-319a-4df1-8061-8bb20463cd73" (UID: "c5a99c13-319a-4df1-8061-8bb20463cd73"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.920075 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-scripts\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.920420 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-credential-keys\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.920523 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6zt9\" (UniqueName: \"kubernetes.io/projected/f5b21104-eefe-4583-9af8-731d561b78c2-kube-api-access-l6zt9\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.920598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-fernet-keys\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.920686 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-internal-tls-certs\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.920828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-config-data\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.920916 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-combined-ca-bundle\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.920961 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-public-tls-certs\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.921011 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:34 crc 
kubenswrapper[4743]: I1122 08:42:34.921023 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.921032 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbc4s\" (UniqueName: \"kubernetes.io/projected/c5a99c13-319a-4df1-8061-8bb20463cd73-kube-api-access-tbc4s\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:34 crc kubenswrapper[4743]: I1122 08:42:34.921041 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5a99c13-319a-4df1-8061-8bb20463cd73-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.022797 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-combined-ca-bundle\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.023073 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-public-tls-certs\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.023109 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-scripts\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.023169 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-credential-keys\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.023189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6zt9\" (UniqueName: \"kubernetes.io/projected/f5b21104-eefe-4583-9af8-731d561b78c2-kube-api-access-l6zt9\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.023209 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-fernet-keys\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.023233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-internal-tls-certs\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.023257 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-config-data\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.031409 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-config-data\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.031959 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-fernet-keys\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.032041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-combined-ca-bundle\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.032047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-scripts\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.032942 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.034768 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-public-tls-certs\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.035154 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-credential-keys\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: W1122 08:42:35.037255 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3494746_cd7f_4497_b123_6ca7196e6480.slice/crio-c5c798f83a478c33ffbeeacfb2c41f29a486eee82fdde21e145333640afea8b9 WatchSource:0}: Error finding container c5c798f83a478c33ffbeeacfb2c41f29a486eee82fdde21e145333640afea8b9: Status 404 returned error can't find the container with id c5c798f83a478c33ffbeeacfb2c41f29a486eee82fdde21e145333640afea8b9 Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.037972 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-internal-tls-certs\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " 
pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.058173 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6zt9\" (UniqueName: \"kubernetes.io/projected/f5b21104-eefe-4583-9af8-731d561b78c2-kube-api-access-l6zt9\") pod \"keystone-66c4f7f76d-b9q4p\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.166180 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3a08f25-001e-4124-a726-792cca7ced2b" path="/var/lib/kubelet/pods/d3a08f25-001e-4124-a726-792cca7ced2b/volumes" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.167063 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8444058-6228-4dd3-8b5b-25682eae00db" path="/var/lib/kubelet/pods/e8444058-6228-4dd3-8b5b-25682eae00db/volumes" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.181731 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.190951 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.726658 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-66c4f7f76d-b9q4p"] Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.742603 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a2dcda9-85a3-4b08-a32b-14710e0a3b55","Type":"ContainerStarted","Data":"8f3c3d1c229c349a1948a71ce5fb20da18b7e832eaf187d6d0feaeebfd103bd3"} Nov 22 08:42:35 crc kubenswrapper[4743]: W1122 08:42:35.746281 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5b21104_eefe_4583_9af8_731d561b78c2.slice/crio-645a9beeeb4567422b02850f9ebfc65784da718ccac8780aca3db28db4c0fb2b WatchSource:0}: Error finding container 645a9beeeb4567422b02850f9ebfc65784da718ccac8780aca3db28db4c0fb2b: Status 404 returned error can't find the container with id 645a9beeeb4567422b02850f9ebfc65784da718ccac8780aca3db28db4c0fb2b Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.750864 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-n2d6p" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.750853 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3494746-cd7f-4497-b123-6ca7196e6480","Type":"ContainerStarted","Data":"c5c798f83a478c33ffbeeacfb2c41f29a486eee82fdde21e145333640afea8b9"} Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.906781 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-84df6c6d8d-v9vxr"] Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.930716 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.933582 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.937369 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-825fq" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.938418 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.938696 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.938850 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-84df6c6d8d-v9vxr"] Nov 22 08:42:35 crc kubenswrapper[4743]: I1122 08:42:35.943155 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.045544 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-internal-tls-certs\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.045928 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-combined-ca-bundle\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.045976 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42gjr\" (UniqueName: \"kubernetes.io/projected/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-kube-api-access-42gjr\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.046006 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-logs\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.046051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-scripts\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.046214 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-config-data\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.046337 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-public-tls-certs\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.153511 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-config-data\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.153755 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-public-tls-certs\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.154699 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-internal-tls-certs\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.154745 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-combined-ca-bundle\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.154800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42gjr\" (UniqueName: \"kubernetes.io/projected/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-kube-api-access-42gjr\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.155798 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-logs\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.155954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-scripts\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.156454 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-logs\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.159783 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-internal-tls-certs\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.160434 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-combined-ca-bundle\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.160980 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-scripts\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.169524 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-config-data\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.172102 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42gjr\" (UniqueName: \"kubernetes.io/projected/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-kube-api-access-42gjr\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.183378 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-public-tls-certs\") pod \"placement-84df6c6d8d-v9vxr\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.285080 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.786849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66c4f7f76d-b9q4p" event={"ID":"f5b21104-eefe-4583-9af8-731d561b78c2","Type":"ContainerStarted","Data":"8e1277095f530b9d213cf681f4500af6bf174fddfe554f6556001dae2a813e03"} Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.787224 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66c4f7f76d-b9q4p" event={"ID":"f5b21104-eefe-4583-9af8-731d561b78c2","Type":"ContainerStarted","Data":"645a9beeeb4567422b02850f9ebfc65784da718ccac8780aca3db28db4c0fb2b"} Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.788475 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.797400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3494746-cd7f-4497-b123-6ca7196e6480","Type":"ContainerStarted","Data":"6d178b1ea7a4e098244482ffdc9fe6d4b9d77eeab15b91ba026d2d050bfa9a72"} Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.799103 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a2dcda9-85a3-4b08-a32b-14710e0a3b55","Type":"ContainerStarted","Data":"c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7"} Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.827591 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-66c4f7f76d-b9q4p" podStartSLOduration=2.82755708 podStartE2EDuration="2.82755708s" podCreationTimestamp="2025-11-22 08:42:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:36.824289456 +0000 UTC m=+1230.530650508" watchObservedRunningTime="2025-11-22 08:42:36.82755708 +0000 UTC m=+1230.533918132" Nov 22 08:42:36 crc kubenswrapper[4743]: I1122 08:42:36.851725 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-84df6c6d8d-v9vxr"] Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.271541 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.348466 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-sx88h"] Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.349393 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" podUID="da5e223c-67c0-4f09-8f50-bc6be61305d1" containerName="dnsmasq-dns" containerID="cri-o://5215c7383d75012abf3d0d94618fad8a23559b994de0167f56986ac7a14b929d" gracePeriod=10 Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.831938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a2dcda9-85a3-4b08-a32b-14710e0a3b55","Type":"ContainerStarted","Data":"a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce"} Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.837140 4743 generic.go:334] "Generic (PLEG): container finished" podID="95f6e846-532f-419c-bd4a-7d2e7eb41a2c" containerID="d86df506e2e6495d0d0573fb95792122de57a18bc48b48390f2fddae46e7a46f" exitCode=0 Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 
08:42:37.837205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-22g48" event={"ID":"95f6e846-532f-419c-bd4a-7d2e7eb41a2c","Type":"ContainerDied","Data":"d86df506e2e6495d0d0573fb95792122de57a18bc48b48390f2fddae46e7a46f"} Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.839259 4743 generic.go:334] "Generic (PLEG): container finished" podID="da5e223c-67c0-4f09-8f50-bc6be61305d1" containerID="5215c7383d75012abf3d0d94618fad8a23559b994de0167f56986ac7a14b929d" exitCode=0 Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.839331 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" event={"ID":"da5e223c-67c0-4f09-8f50-bc6be61305d1","Type":"ContainerDied","Data":"5215c7383d75012abf3d0d94618fad8a23559b994de0167f56986ac7a14b929d"} Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.839362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" event={"ID":"da5e223c-67c0-4f09-8f50-bc6be61305d1","Type":"ContainerDied","Data":"df30e24e5c0c990b9f39654dadf06d8343ce82543b8641427a3b375da3739148"} Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.839374 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df30e24e5c0c990b9f39654dadf06d8343ce82543b8641427a3b375da3739148" Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.854165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84df6c6d8d-v9vxr" event={"ID":"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca","Type":"ContainerStarted","Data":"12838b3e542aa21904acab03d6b27d30ec54f1471909fca6df88ff3e1aee935d"} Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.854235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84df6c6d8d-v9vxr" event={"ID":"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca","Type":"ContainerStarted","Data":"7711ec056fa213f2eee796483c27379cd7a134fa6030ba1e23b52a3457a46cec"} Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.854245 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84df6c6d8d-v9vxr" event={"ID":"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca","Type":"ContainerStarted","Data":"308da01ed407044c48117ec420e360d5dda27692aad8d33916b3bb6c489ba6ad"} Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.855035 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.855056 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.879753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3494746-cd7f-4497-b123-6ca7196e6480","Type":"ContainerStarted","Data":"6de714267c676d7c85c60c654a52b66ec288ebaa622fecd0837430e2a8ee8f23"} Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.894558 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.894603 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.894561697 podStartE2EDuration="4.894561697s" podCreationTimestamp="2025-11-22 08:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:37.865012634 +0000 UTC m=+1231.571373686" watchObservedRunningTime="2025-11-22 08:42:37.894561697 +0000 UTC m=+1231.600922769" Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.895699 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-84df6c6d8d-v9vxr" podStartSLOduration=2.8956890189999998 podStartE2EDuration="2.895689019s" podCreationTimestamp="2025-11-22 08:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:37.887351349 +0000 UTC m=+1231.593712401" watchObservedRunningTime="2025-11-22 08:42:37.895689019 +0000 UTC m=+1231.602050071" Nov 22 08:42:37 crc kubenswrapper[4743]: I1122 08:42:37.943129 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.943109619 podStartE2EDuration="4.943109619s" podCreationTimestamp="2025-11-22 08:42:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:37.938941788 +0000 UTC m=+1231.645302840" watchObservedRunningTime="2025-11-22 08:42:37.943109619 +0000 UTC m=+1231.649470671" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.020105 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-config\") pod \"da5e223c-67c0-4f09-8f50-bc6be61305d1\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.020316 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-nb\") pod \"da5e223c-67c0-4f09-8f50-bc6be61305d1\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.020354 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cjh5\" (UniqueName: \"kubernetes.io/projected/da5e223c-67c0-4f09-8f50-bc6be61305d1-kube-api-access-8cjh5\") pod \"da5e223c-67c0-4f09-8f50-bc6be61305d1\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.020435 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-sb\") pod \"da5e223c-67c0-4f09-8f50-bc6be61305d1\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.020468 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-dns-svc\") pod \"da5e223c-67c0-4f09-8f50-bc6be61305d1\" (UID: \"da5e223c-67c0-4f09-8f50-bc6be61305d1\") " Nov 22 
08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.026743 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da5e223c-67c0-4f09-8f50-bc6be61305d1-kube-api-access-8cjh5" (OuterVolumeSpecName: "kube-api-access-8cjh5") pod "da5e223c-67c0-4f09-8f50-bc6be61305d1" (UID: "da5e223c-67c0-4f09-8f50-bc6be61305d1"). InnerVolumeSpecName "kube-api-access-8cjh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.076038 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da5e223c-67c0-4f09-8f50-bc6be61305d1" (UID: "da5e223c-67c0-4f09-8f50-bc6be61305d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.077101 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da5e223c-67c0-4f09-8f50-bc6be61305d1" (UID: "da5e223c-67c0-4f09-8f50-bc6be61305d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.084127 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da5e223c-67c0-4f09-8f50-bc6be61305d1" (UID: "da5e223c-67c0-4f09-8f50-bc6be61305d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.089531 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-config" (OuterVolumeSpecName: "config") pod "da5e223c-67c0-4f09-8f50-bc6be61305d1" (UID: "da5e223c-67c0-4f09-8f50-bc6be61305d1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.124638 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.124693 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.124703 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.124712 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da5e223c-67c0-4f09-8f50-bc6be61305d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.124723 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cjh5\" (UniqueName: \"kubernetes.io/projected/da5e223c-67c0-4f09-8f50-bc6be61305d1-kube-api-access-8cjh5\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.886351 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-sx88h" Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.964708 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-sx88h"] Nov 22 08:42:38 crc kubenswrapper[4743]: I1122 08:42:38.971312 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-sx88h"] Nov 22 08:42:39 crc kubenswrapper[4743]: I1122 08:42:39.167682 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da5e223c-67c0-4f09-8f50-bc6be61305d1" path="/var/lib/kubelet/pods/da5e223c-67c0-4f09-8f50-bc6be61305d1/volumes" Nov 22 08:42:43 crc kubenswrapper[4743]: I1122 08:42:43.966663 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:43 crc kubenswrapper[4743]: I1122 08:42:43.967219 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:43 crc kubenswrapper[4743]: I1122 08:42:43.994954 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:44 crc kubenswrapper[4743]: I1122 08:42:44.008721 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:44 crc kubenswrapper[4743]: I1122 08:42:44.054817 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 08:42:44 crc kubenswrapper[4743]: I1122 08:42:44.054872 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 08:42:44 crc kubenswrapper[4743]: I1122 08:42:44.085119 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 08:42:44 crc kubenswrapper[4743]: I1122 08:42:44.095441 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/glance-default-external-api-0" Nov 22 08:42:44 crc kubenswrapper[4743]: I1122 08:42:44.940604 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 08:42:44 crc kubenswrapper[4743]: I1122 08:42:44.940905 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:44 crc kubenswrapper[4743]: I1122 08:42:44.940918 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:44 crc kubenswrapper[4743]: I1122 08:42:44.940929 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 08:42:46 crc kubenswrapper[4743]: I1122 08:42:46.834419 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:46 crc kubenswrapper[4743]: I1122 08:42:46.977379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-22g48" event={"ID":"95f6e846-532f-419c-bd4a-7d2e7eb41a2c","Type":"ContainerDied","Data":"b49c5a68e2654efe006e6c7f2fc4e4b07ce75a46d076a1b8de9c355f13091294"} Nov 22 08:42:46 crc kubenswrapper[4743]: I1122 08:42:46.977419 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b49c5a68e2654efe006e6c7f2fc4e4b07ce75a46d076a1b8de9c355f13091294" Nov 22 08:42:46 crc kubenswrapper[4743]: I1122 08:42:46.977450 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-22g48" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.008151 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-combined-ca-bundle\") pod \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.008423 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-config\") pod \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.008477 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sllsw\" (UniqueName: \"kubernetes.io/projected/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-kube-api-access-sllsw\") pod \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\" (UID: \"95f6e846-532f-419c-bd4a-7d2e7eb41a2c\") " Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.023836 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-kube-api-access-sllsw" (OuterVolumeSpecName: "kube-api-access-sllsw") pod "95f6e846-532f-419c-bd4a-7d2e7eb41a2c" (UID: "95f6e846-532f-419c-bd4a-7d2e7eb41a2c"). InnerVolumeSpecName "kube-api-access-sllsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.044140 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-config" (OuterVolumeSpecName: "config") pod "95f6e846-532f-419c-bd4a-7d2e7eb41a2c" (UID: "95f6e846-532f-419c-bd4a-7d2e7eb41a2c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.050801 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95f6e846-532f-419c-bd4a-7d2e7eb41a2c" (UID: "95f6e846-532f-419c-bd4a-7d2e7eb41a2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.058347 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.058452 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.110735 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.111077 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sllsw\" (UniqueName: \"kubernetes.io/projected/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-kube-api-access-sllsw\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.111092 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f6e846-532f-419c-bd4a-7d2e7eb41a2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.164250 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.164290 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.167285 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.187654 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 08:42:47 crc kubenswrapper[4743]: E1122 08:42:47.971399 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="661914bd-2b43-425b-837a-8c4104173ef4" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.991122 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661914bd-2b43-425b-837a-8c4104173ef4","Type":"ContainerStarted","Data":"d5abc080430047fe8249a09ffc99b5e38fecf3ed98339ace1e5bbb8120a1400c"} Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.991282 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="ceilometer-notification-agent" containerID="cri-o://7d9d7148adb064d5322e5ea819f5b41acd751275e9cf1ce1f5f112ec31fb9dcd" gracePeriod=30 Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.991362 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 
08:42:47.991993 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="proxy-httpd" containerID="cri-o://d5abc080430047fe8249a09ffc99b5e38fecf3ed98339ace1e5bbb8120a1400c" gracePeriod=30 Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.992047 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="sg-core" containerID="cri-o://ca77000a763c249391614d5f690fb6b5a8606358f4a791b769a787874274d6e9" gracePeriod=30 Nov 22 08:42:47 crc kubenswrapper[4743]: I1122 08:42:47.996367 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m9jrr" event={"ID":"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8","Type":"ContainerStarted","Data":"c4858ef9316fe50db2369ef4f71a3b1345ef98bcbaf371bd82d3cde2ff7b09b8"} Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.046965 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-m9jrr" podStartSLOduration=2.103302715 podStartE2EDuration="44.046951606s" podCreationTimestamp="2025-11-22 08:42:04 +0000 UTC" firstStartedPulling="2025-11-22 08:42:05.713982574 +0000 UTC m=+1199.420343626" lastFinishedPulling="2025-11-22 08:42:47.657631465 +0000 UTC m=+1241.363992517" observedRunningTime="2025-11-22 08:42:48.044290019 +0000 UTC m=+1241.750651071" watchObservedRunningTime="2025-11-22 08:42:48.046951606 +0000 UTC m=+1241.753312658" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.139369 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-m2jhn"] Nov 22 08:42:48 crc kubenswrapper[4743]: E1122 08:42:48.139739 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f6e846-532f-419c-bd4a-7d2e7eb41a2c" containerName="neutron-db-sync" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.139751 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f6e846-532f-419c-bd4a-7d2e7eb41a2c" containerName="neutron-db-sync" Nov 22 08:42:48 crc kubenswrapper[4743]: E1122 08:42:48.139779 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5e223c-67c0-4f09-8f50-bc6be61305d1" containerName="dnsmasq-dns" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.139786 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5e223c-67c0-4f09-8f50-bc6be61305d1" containerName="dnsmasq-dns" Nov 22 08:42:48 crc kubenswrapper[4743]: E1122 08:42:48.139804 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5e223c-67c0-4f09-8f50-bc6be61305d1" containerName="init" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.139810 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5e223c-67c0-4f09-8f50-bc6be61305d1" containerName="init" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.140507 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5e223c-67c0-4f09-8f50-bc6be61305d1" containerName="dnsmasq-dns" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.140528 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f6e846-532f-419c-bd4a-7d2e7eb41a2c" containerName="neutron-db-sync" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.141424 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.153878 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-m2jhn"] Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.238865 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njlpz\" (UniqueName: \"kubernetes.io/projected/6b1be60a-df67-4846-90c8-a53fb6acd7f8-kube-api-access-njlpz\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.238921 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-svc\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.238972 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-config\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.238989 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.239014 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.239131 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.253063 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c66bff4c4-wzr6r"] Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.265876 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c66bff4c4-wzr6r"] Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.266007 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.273677 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mgkwl" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.273902 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.274717 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.274873 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345067 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njlpz\" (UniqueName: \"kubernetes.io/projected/6b1be60a-df67-4846-90c8-a53fb6acd7f8-kube-api-access-njlpz\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-svc\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345201 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-httpd-config\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-combined-ca-bundle\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345257 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-config\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345276 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345307 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqvpt\" (UniqueName: \"kubernetes.io/projected/d58da1fb-cb7a-4b26-9753-317919c3d2c9-kube-api-access-vqvpt\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345329 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345411 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-config\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.345472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-ovndb-tls-certs\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.347432 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-svc\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.348143 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-config\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.348833 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.349420 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.349552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.367777 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njlpz\" 
(UniqueName: \"kubernetes.io/projected/6b1be60a-df67-4846-90c8-a53fb6acd7f8-kube-api-access-njlpz\") pod \"dnsmasq-dns-6b7b667979-m2jhn\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.447094 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-ovndb-tls-certs\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.447186 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-combined-ca-bundle\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.447204 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-httpd-config\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.447238 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqvpt\" (UniqueName: \"kubernetes.io/projected/d58da1fb-cb7a-4b26-9753-317919c3d2c9-kube-api-access-vqvpt\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.447291 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-config\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.460134 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-ovndb-tls-certs\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.462372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-combined-ca-bundle\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.463469 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-config\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.466255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-httpd-config\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: 
\"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.477941 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.479155 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqvpt\" (UniqueName: \"kubernetes.io/projected/d58da1fb-cb7a-4b26-9753-317919c3d2c9-kube-api-access-vqvpt\") pod \"neutron-7c66bff4c4-wzr6r\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.596739 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:48 crc kubenswrapper[4743]: I1122 08:42:48.895564 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-m2jhn"] Nov 22 08:42:49 crc kubenswrapper[4743]: I1122 08:42:49.039333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sj8hg" event={"ID":"a87658ca-ad68-4136-82dd-14201100b4ea","Type":"ContainerStarted","Data":"98e216804cd00368cfd172ea133b9f2e2806dd9d0733496c7ada731c9979c7c3"} Nov 22 08:42:49 crc kubenswrapper[4743]: I1122 08:42:49.049815 4743 generic.go:334] "Generic (PLEG): container finished" podID="661914bd-2b43-425b-837a-8c4104173ef4" containerID="ca77000a763c249391614d5f690fb6b5a8606358f4a791b769a787874274d6e9" exitCode=2 Nov 22 08:42:49 crc kubenswrapper[4743]: I1122 08:42:49.049873 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661914bd-2b43-425b-837a-8c4104173ef4","Type":"ContainerDied","Data":"ca77000a763c249391614d5f690fb6b5a8606358f4a791b769a787874274d6e9"} Nov 22 08:42:49 crc kubenswrapper[4743]: I1122 08:42:49.050655 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" event={"ID":"6b1be60a-df67-4846-90c8-a53fb6acd7f8","Type":"ContainerStarted","Data":"88aa6f85dd67b7a7ccfa6dffe0a363b3cad5117f11ac28bd52b466c61434b5b4"} Nov 22 08:42:49 crc kubenswrapper[4743]: I1122 08:42:49.329156 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-sj8hg" podStartSLOduration=3.296714875 podStartE2EDuration="45.329133887s" podCreationTimestamp="2025-11-22 08:42:04 +0000 UTC" firstStartedPulling="2025-11-22 08:42:05.627449727 +0000 UTC m=+1199.333810799" lastFinishedPulling="2025-11-22 08:42:47.659868759 +0000 UTC m=+1241.366229811" observedRunningTime="2025-11-22 08:42:49.069929294 +0000 UTC m=+1242.776290346" watchObservedRunningTime="2025-11-22 08:42:49.329133887 +0000 UTC m=+1243.035494939" Nov 22 08:42:49 crc kubenswrapper[4743]: I1122 08:42:49.333759 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c66bff4c4-wzr6r"] Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.058917 4743 generic.go:334] "Generic (PLEG): container finished" podID="6b1be60a-df67-4846-90c8-a53fb6acd7f8" containerID="676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4" exitCode=0 Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.059018 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" event={"ID":"6b1be60a-df67-4846-90c8-a53fb6acd7f8","Type":"ContainerDied","Data":"676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4"} Nov 22 08:42:50 crc 
kubenswrapper[4743]: I1122 08:42:50.066287 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c66bff4c4-wzr6r" event={"ID":"d58da1fb-cb7a-4b26-9753-317919c3d2c9","Type":"ContainerStarted","Data":"8b5900c88e83e85276245596e3f8b4ba507a1f40a7ab7fd151132a27262e2cfe"} Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.066344 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c66bff4c4-wzr6r" event={"ID":"d58da1fb-cb7a-4b26-9753-317919c3d2c9","Type":"ContainerStarted","Data":"036c461cf62561016b88bb0ead6384ec8c57f22a831737233da0605b7e3d7436"} Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.066364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c66bff4c4-wzr6r" event={"ID":"d58da1fb-cb7a-4b26-9753-317919c3d2c9","Type":"ContainerStarted","Data":"d74ad8deaa4b62fa1f081d44b768d03143a504b27925e995f41b53015508d1eb"} Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.067305 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.120052 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c66bff4c4-wzr6r" podStartSLOduration=2.120029334 podStartE2EDuration="2.120029334s" podCreationTimestamp="2025-11-22 08:42:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:50.112997711 +0000 UTC m=+1243.819358773" watchObservedRunningTime="2025-11-22 08:42:50.120029334 +0000 UTC m=+1243.826390406" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.676222 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5568cf9dfc-ghfzl"] Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.678145 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.680340 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.680429 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.689730 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5568cf9dfc-ghfzl"] Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.794929 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hrhs\" (UniqueName: \"kubernetes.io/projected/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-kube-api-access-7hrhs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.795065 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-httpd-config\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.795155 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-public-tls-certs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.795178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-ovndb-tls-certs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.795205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-combined-ca-bundle\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.795226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-config\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.795246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-internal-tls-certs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.897137 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-httpd-config\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.897216 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-public-tls-certs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.897239 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-ovndb-tls-certs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.897259 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-combined-ca-bundle\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.897276 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-config\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.897291 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-internal-tls-certs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.897344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hrhs\" (UniqueName: \"kubernetes.io/projected/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-kube-api-access-7hrhs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.902745 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-combined-ca-bundle\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.902815 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-internal-tls-certs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.903302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-httpd-config\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: 
\"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.904003 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-config\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.904248 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-ovndb-tls-certs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.908113 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-public-tls-certs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.927614 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hrhs\" (UniqueName: \"kubernetes.io/projected/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-kube-api-access-7hrhs\") pod \"neutron-5568cf9dfc-ghfzl\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:50 crc kubenswrapper[4743]: I1122 08:42:50.995931 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:51 crc kubenswrapper[4743]: I1122 08:42:51.103407 4743 generic.go:334] "Generic (PLEG): container finished" podID="661914bd-2b43-425b-837a-8c4104173ef4" containerID="7d9d7148adb064d5322e5ea819f5b41acd751275e9cf1ce1f5f112ec31fb9dcd" exitCode=0 Nov 22 08:42:51 crc kubenswrapper[4743]: I1122 08:42:51.103743 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661914bd-2b43-425b-837a-8c4104173ef4","Type":"ContainerDied","Data":"7d9d7148adb064d5322e5ea819f5b41acd751275e9cf1ce1f5f112ec31fb9dcd"} Nov 22 08:42:51 crc kubenswrapper[4743]: I1122 08:42:51.111615 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" event={"ID":"6b1be60a-df67-4846-90c8-a53fb6acd7f8","Type":"ContainerStarted","Data":"d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d"} Nov 22 08:42:51 crc kubenswrapper[4743]: I1122 08:42:51.112816 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:51 crc kubenswrapper[4743]: I1122 08:42:51.118147 4743 generic.go:334] "Generic (PLEG): container finished" podID="6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8" containerID="c4858ef9316fe50db2369ef4f71a3b1345ef98bcbaf371bd82d3cde2ff7b09b8" exitCode=0 Nov 22 08:42:51 crc kubenswrapper[4743]: I1122 08:42:51.118262 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m9jrr" event={"ID":"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8","Type":"ContainerDied","Data":"c4858ef9316fe50db2369ef4f71a3b1345ef98bcbaf371bd82d3cde2ff7b09b8"} Nov 22 08:42:51 crc kubenswrapper[4743]: I1122 08:42:51.140109 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" 
podStartSLOduration=3.140062756 podStartE2EDuration="3.140062756s" podCreationTimestamp="2025-11-22 08:42:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:51.133805586 +0000 UTC m=+1244.840166638" watchObservedRunningTime="2025-11-22 08:42:51.140062756 +0000 UTC m=+1244.846423808" Nov 22 08:42:51 crc kubenswrapper[4743]: I1122 08:42:51.527680 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5568cf9dfc-ghfzl"] Nov 22 08:42:51 crc kubenswrapper[4743]: W1122 08:42:51.532109 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcd06b6d_e1b1_44dd_b2a6_8d2a8cca4d48.slice/crio-8faad5719e36efa859c72dde84b12b5cae9cc0bcb0b55d021f8b97425c658e9d WatchSource:0}: Error finding container 8faad5719e36efa859c72dde84b12b5cae9cc0bcb0b55d021f8b97425c658e9d: Status 404 returned error can't find the container with id 8faad5719e36efa859c72dde84b12b5cae9cc0bcb0b55d021f8b97425c658e9d Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.128271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568cf9dfc-ghfzl" event={"ID":"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48","Type":"ContainerStarted","Data":"59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3"} Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.129054 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.129087 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568cf9dfc-ghfzl" event={"ID":"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48","Type":"ContainerStarted","Data":"f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424"} Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.129106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568cf9dfc-ghfzl" event={"ID":"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48","Type":"ContainerStarted","Data":"8faad5719e36efa859c72dde84b12b5cae9cc0bcb0b55d021f8b97425c658e9d"} Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.161455 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5568cf9dfc-ghfzl" podStartSLOduration=2.159562104 podStartE2EDuration="2.159562104s" podCreationTimestamp="2025-11-22 08:42:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:52.151240213 +0000 UTC m=+1245.857601265" watchObservedRunningTime="2025-11-22 08:42:52.159562104 +0000 UTC m=+1245.865923156" Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.484494 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.523727 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-combined-ca-bundle\") pod \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.524357 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86k8q\" (UniqueName: \"kubernetes.io/projected/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-kube-api-access-86k8q\") pod \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.524426 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-db-sync-config-data\") pod \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\" (UID: \"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8\") " Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.528681 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8" (UID: "6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.529867 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-kube-api-access-86k8q" (OuterVolumeSpecName: "kube-api-access-86k8q") pod "6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8" (UID: "6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8"). InnerVolumeSpecName "kube-api-access-86k8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.549771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8" (UID: "6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.626082 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86k8q\" (UniqueName: \"kubernetes.io/projected/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-kube-api-access-86k8q\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.626119 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:52 crc kubenswrapper[4743]: I1122 08:42:52.626128 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.142947 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-m9jrr" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.150799 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-m9jrr" event={"ID":"6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8","Type":"ContainerDied","Data":"20c95cacde332b673bf3e87e0639c008dbb8696fac411165237e187da0a0f8b7"} Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.150854 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20c95cacde332b673bf3e87e0639c008dbb8696fac411165237e187da0a0f8b7" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.448567 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-84bbbc9bdb-72lc6"] Nov 22 08:42:53 crc kubenswrapper[4743]: E1122 08:42:53.449445 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8" containerName="barbican-db-sync" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.449463 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8" containerName="barbican-db-sync" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.449710 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8" containerName="barbican-db-sync" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.450910 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.454433 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8gdtg" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.454792 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.454968 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.477132 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7cd8fdf575-7kd5c"] Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.479707 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.482209 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.498557 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-84bbbc9bdb-72lc6"] Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.509647 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cd8fdf575-7kd5c"] Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.543856 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-combined-ca-bundle\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.543906 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.543924 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data-custom\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.543977 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-logs\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.544020 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fe5d70f-5277-4803-ae45-de61d0eefe27-logs\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.544055 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.544073 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-combined-ca-bundle\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc 
kubenswrapper[4743]: I1122 08:42:53.544103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2nfs\" (UniqueName: \"kubernetes.io/projected/8fe5d70f-5277-4803-ae45-de61d0eefe27-kube-api-access-p2nfs\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.544124 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data-custom\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.544139 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bflkx\" (UniqueName: \"kubernetes.io/projected/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-kube-api-access-bflkx\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.551869 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-m2jhn"] Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.573921 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jkt28"] Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.575333 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.608517 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jkt28"] Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.645951 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646012 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646044 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-config\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646068 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-combined-ca-bundle\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " 
pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646121 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2nfs\" (UniqueName: \"kubernetes.io/projected/8fe5d70f-5277-4803-ae45-de61d0eefe27-kube-api-access-p2nfs\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646157 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data-custom\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bflkx\" (UniqueName: \"kubernetes.io/projected/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-kube-api-access-bflkx\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-combined-ca-bundle\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646243 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646266 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data-custom\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646345 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-logs\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646373 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646429 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fe5d70f-5277-4803-ae45-de61d0eefe27-logs\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.646665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq8fn\" (UniqueName: \"kubernetes.io/projected/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-kube-api-access-jq8fn\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.650203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fe5d70f-5277-4803-ae45-de61d0eefe27-logs\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.651074 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data-custom\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.651861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.652164 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-logs\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.655100 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data-custom\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.660240 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.666427 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-combined-ca-bundle\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.666975 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bflkx\" (UniqueName: \"kubernetes.io/projected/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-kube-api-access-bflkx\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.668204 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-combined-ca-bundle\") pod \"barbican-worker-7cd8fdf575-7kd5c\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.683801 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2nfs\" (UniqueName: \"kubernetes.io/projected/8fe5d70f-5277-4803-ae45-de61d0eefe27-kube-api-access-p2nfs\") pod \"barbican-keystone-listener-84bbbc9bdb-72lc6\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.685285 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-58c8b4cc8-lgv67"] Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.687068 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.695007 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.703027 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58c8b4cc8-lgv67"] Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.747840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-logs\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.748211 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7wn\" (UniqueName: \"kubernetes.io/projected/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-kube-api-access-bd7wn\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.748320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.748407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.748484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.748558 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-combined-ca-bundle\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.748756 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq8fn\" (UniqueName: \"kubernetes.io/projected/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-kube-api-access-jq8fn\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.748872 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.748957 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-config\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.749072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data-custom\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.749239 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.750190 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.750926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.751611 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.752518 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.755028 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-config\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.773095 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq8fn\" (UniqueName: \"kubernetes.io/projected/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-kube-api-access-jq8fn\") pod \"dnsmasq-dns-848cf88cfc-jkt28\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:53 crc kubenswrapper[4743]: 
I1122 08:42:53.787156 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.810867 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.851137 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data-custom\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.852792 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.852876 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-logs\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.852935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7wn\" (UniqueName: \"kubernetes.io/projected/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-kube-api-access-bd7wn\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.853000 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-combined-ca-bundle\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.853747 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-logs\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.860045 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data-custom\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.860682 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-combined-ca-bundle\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.862025 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.874157 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd7wn\" (UniqueName: \"kubernetes.io/projected/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-kube-api-access-bd7wn\") pod \"barbican-api-58c8b4cc8-lgv67\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:53 crc kubenswrapper[4743]: I1122 08:42:53.905927 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.076939 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.154789 4743 generic.go:334] "Generic (PLEG): container finished" podID="a87658ca-ad68-4136-82dd-14201100b4ea" containerID="98e216804cd00368cfd172ea133b9f2e2806dd9d0733496c7ada731c9979c7c3" exitCode=0 Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.156209 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" podUID="6b1be60a-df67-4846-90c8-a53fb6acd7f8" containerName="dnsmasq-dns" containerID="cri-o://d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d" gracePeriod=10 Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.156714 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sj8hg" event={"ID":"a87658ca-ad68-4136-82dd-14201100b4ea","Type":"ContainerDied","Data":"98e216804cd00368cfd172ea133b9f2e2806dd9d0733496c7ada731c9979c7c3"} Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.303642 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-84bbbc9bdb-72lc6"] Nov 22 08:42:54 crc kubenswrapper[4743]: W1122 08:42:54.311418 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fe5d70f_5277_4803_ae45_de61d0eefe27.slice/crio-0ba3551be192c9ca67dafe3b3334189b068599cd139082c54c7923a0610758ca WatchSource:0}: Error finding container 0ba3551be192c9ca67dafe3b3334189b068599cd139082c54c7923a0610758ca: Status 404 returned error can't find the container with id 0ba3551be192c9ca67dafe3b3334189b068599cd139082c54c7923a0610758ca Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.401770 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7cd8fdf575-7kd5c"] Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.544361 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jkt28"] Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.653477 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-58c8b4cc8-lgv67"] Nov 22 08:42:54 crc kubenswrapper[4743]: W1122 08:42:54.656671 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3ff215f_5cdd_4fa3_8112_82c4cd14ac82.slice/crio-82537d7de7d143f19733cd925382b1dbb1fbe83a8ce5ee7f916f7b1b057b6d05 WatchSource:0}: Error finding container 82537d7de7d143f19733cd925382b1dbb1fbe83a8ce5ee7f916f7b1b057b6d05: 
Status 404 returned error can't find the container with id 82537d7de7d143f19733cd925382b1dbb1fbe83a8ce5ee7f916f7b1b057b6d05 Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.662267 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.680027 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-swift-storage-0\") pod \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.680121 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-sb\") pod \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.680163 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-svc\") pod \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.680220 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-nb\") pod \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.680341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njlpz\" (UniqueName: \"kubernetes.io/projected/6b1be60a-df67-4846-90c8-a53fb6acd7f8-kube-api-access-njlpz\") pod \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.680386 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-config\") pod \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\" (UID: \"6b1be60a-df67-4846-90c8-a53fb6acd7f8\") " Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.703111 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1be60a-df67-4846-90c8-a53fb6acd7f8-kube-api-access-njlpz" (OuterVolumeSpecName: "kube-api-access-njlpz") pod "6b1be60a-df67-4846-90c8-a53fb6acd7f8" (UID: "6b1be60a-df67-4846-90c8-a53fb6acd7f8"). InnerVolumeSpecName "kube-api-access-njlpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.760875 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6b1be60a-df67-4846-90c8-a53fb6acd7f8" (UID: "6b1be60a-df67-4846-90c8-a53fb6acd7f8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.774085 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6b1be60a-df67-4846-90c8-a53fb6acd7f8" (UID: "6b1be60a-df67-4846-90c8-a53fb6acd7f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.781802 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6b1be60a-df67-4846-90c8-a53fb6acd7f8" (UID: "6b1be60a-df67-4846-90c8-a53fb6acd7f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.782623 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.782644 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.782653 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njlpz\" (UniqueName: \"kubernetes.io/projected/6b1be60a-df67-4846-90c8-a53fb6acd7f8-kube-api-access-njlpz\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.782663 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.809047 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6b1be60a-df67-4846-90c8-a53fb6acd7f8" (UID: "6b1be60a-df67-4846-90c8-a53fb6acd7f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.817185 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-config" (OuterVolumeSpecName: "config") pod "6b1be60a-df67-4846-90c8-a53fb6acd7f8" (UID: "6b1be60a-df67-4846-90c8-a53fb6acd7f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.884890 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:54 crc kubenswrapper[4743]: I1122 08:42:54.884921 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b1be60a-df67-4846-90c8-a53fb6acd7f8-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.168846 4743 generic.go:334] "Generic (PLEG): container finished" podID="d2a8e670-15d3-4d05-b0c9-386ec9befc9a" containerID="ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73" exitCode=0 Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.168930 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" event={"ID":"d2a8e670-15d3-4d05-b0c9-386ec9befc9a","Type":"ContainerDied","Data":"ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73"} Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.169204 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" event={"ID":"d2a8e670-15d3-4d05-b0c9-386ec9befc9a","Type":"ContainerStarted","Data":"75d1b150c5e5d1171c534d475a426f13fd421952b42cf2e9d47e30e7356baaba"} Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.178192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" event={"ID":"89d8e638-b97a-4273-9391-5e0c7dd1bfb1","Type":"ContainerStarted","Data":"fde53c906c1f4f9a19f61b1d47e8097f66d4506cfd818fa986ab4390d6a26513"} Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.194509 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" event={"ID":"8fe5d70f-5277-4803-ae45-de61d0eefe27","Type":"ContainerStarted","Data":"0ba3551be192c9ca67dafe3b3334189b068599cd139082c54c7923a0610758ca"} Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.198979 4743 generic.go:334] "Generic (PLEG): container finished" podID="6b1be60a-df67-4846-90c8-a53fb6acd7f8" containerID="d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d" exitCode=0 Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.199018 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.199051 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" event={"ID":"6b1be60a-df67-4846-90c8-a53fb6acd7f8","Type":"ContainerDied","Data":"d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d"} Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.199081 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-m2jhn" event={"ID":"6b1be60a-df67-4846-90c8-a53fb6acd7f8","Type":"ContainerDied","Data":"88aa6f85dd67b7a7ccfa6dffe0a363b3cad5117f11ac28bd52b466c61434b5b4"} Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.199108 4743 scope.go:117] "RemoveContainer" containerID="d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d" Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.206942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58c8b4cc8-lgv67" event={"ID":"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82","Type":"ContainerStarted","Data":"789ae711beac7ce8ccb05e661fd1b5b173a919c21b3d48b0c864746ee72cd73f"} Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.206989 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58c8b4cc8-lgv67" event={"ID":"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82","Type":"ContainerStarted","Data":"82537d7de7d143f19733cd925382b1dbb1fbe83a8ce5ee7f916f7b1b057b6d05"} Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.255443 4743 scope.go:117] "RemoveContainer" containerID="676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4" Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.273108 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-m2jhn"] Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.293368 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-m2jhn"] Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.344539 4743 scope.go:117] "RemoveContainer" containerID="d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d" Nov 22 08:42:55 crc kubenswrapper[4743]: E1122 08:42:55.345316 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d\": container with ID starting with d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d not found: ID does not exist" containerID="d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d" Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.345394 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d"} err="failed to get container status \"d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d\": rpc error: code = NotFound desc = could not find container \"d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d\": container with ID starting with d33f19e62055f2e6e98addff23e14994650a20dcc05b66965d5dc6f3be23a95d not found: ID does not exist" Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.345429 4743 scope.go:117] "RemoveContainer" containerID="676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4" Nov 22 08:42:55 crc kubenswrapper[4743]: E1122 08:42:55.345814 4743 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4\": container with ID starting with 676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4 not found: ID does not exist" containerID="676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4" Nov 22 08:42:55 crc kubenswrapper[4743]: I1122 08:42:55.345854 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4"} err="failed to get container status \"676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4\": rpc error: code = NotFound desc = could not find container \"676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4\": container with ID starting with 676f8393e500dfb1ff0506bfb3ba3ea8dcfead6ac6603e054377b5b131b62ac4 not found: ID does not exist" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.222334 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58c8b4cc8-lgv67" event={"ID":"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82","Type":"ContainerStarted","Data":"c53c143e140439571a5c21151cef803359eb1910e81614489e491c618d3928bd"} Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.232644 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" event={"ID":"d2a8e670-15d3-4d05-b0c9-386ec9befc9a","Type":"ContainerStarted","Data":"7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10"} Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.233905 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.241764 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-58c8b4cc8-lgv67" podStartSLOduration=3.241744752 podStartE2EDuration="3.241744752s" podCreationTimestamp="2025-11-22 08:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:56.239888589 +0000 UTC m=+1249.946249651" watchObservedRunningTime="2025-11-22 08:42:56.241744752 +0000 UTC m=+1249.948105804" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.269321 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" podStartSLOduration=3.269267427 podStartE2EDuration="3.269267427s" podCreationTimestamp="2025-11-22 08:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:56.267490156 +0000 UTC m=+1249.973851218" watchObservedRunningTime="2025-11-22 08:42:56.269267427 +0000 UTC m=+1249.975628479" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.399756 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.430345 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-db-sync-config-data\") pod \"a87658ca-ad68-4136-82dd-14201100b4ea\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.430423 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-combined-ca-bundle\") pod \"a87658ca-ad68-4136-82dd-14201100b4ea\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.430520 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98hsz\" (UniqueName: \"kubernetes.io/projected/a87658ca-ad68-4136-82dd-14201100b4ea-kube-api-access-98hsz\") pod \"a87658ca-ad68-4136-82dd-14201100b4ea\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.430640 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-config-data\") pod \"a87658ca-ad68-4136-82dd-14201100b4ea\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.430674 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-scripts\") pod \"a87658ca-ad68-4136-82dd-14201100b4ea\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.430719 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a87658ca-ad68-4136-82dd-14201100b4ea-etc-machine-id\") pod \"a87658ca-ad68-4136-82dd-14201100b4ea\" (UID: \"a87658ca-ad68-4136-82dd-14201100b4ea\") " Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.431111 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a87658ca-ad68-4136-82dd-14201100b4ea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a87658ca-ad68-4136-82dd-14201100b4ea" (UID: "a87658ca-ad68-4136-82dd-14201100b4ea"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.437249 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87658ca-ad68-4136-82dd-14201100b4ea-kube-api-access-98hsz" (OuterVolumeSpecName: "kube-api-access-98hsz") pod "a87658ca-ad68-4136-82dd-14201100b4ea" (UID: "a87658ca-ad68-4136-82dd-14201100b4ea"). InnerVolumeSpecName "kube-api-access-98hsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.437818 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-scripts" (OuterVolumeSpecName: "scripts") pod "a87658ca-ad68-4136-82dd-14201100b4ea" (UID: "a87658ca-ad68-4136-82dd-14201100b4ea"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.437951 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a87658ca-ad68-4136-82dd-14201100b4ea" (UID: "a87658ca-ad68-4136-82dd-14201100b4ea"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.479201 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a87658ca-ad68-4136-82dd-14201100b4ea" (UID: "a87658ca-ad68-4136-82dd-14201100b4ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.501017 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-config-data" (OuterVolumeSpecName: "config-data") pod "a87658ca-ad68-4136-82dd-14201100b4ea" (UID: "a87658ca-ad68-4136-82dd-14201100b4ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.533878 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a87658ca-ad68-4136-82dd-14201100b4ea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.533918 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.533940 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.533953 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98hsz\" (UniqueName: \"kubernetes.io/projected/a87658ca-ad68-4136-82dd-14201100b4ea-kube-api-access-98hsz\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.533968 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.533980 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a87658ca-ad68-4136-82dd-14201100b4ea-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.635992 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6dcbbd6f66-kjrm8"] Nov 22 08:42:56 crc kubenswrapper[4743]: E1122 08:42:56.636415 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1be60a-df67-4846-90c8-a53fb6acd7f8" containerName="init" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.636435 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1be60a-df67-4846-90c8-a53fb6acd7f8" containerName="init" Nov 22 08:42:56 crc 
kubenswrapper[4743]: E1122 08:42:56.636445 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87658ca-ad68-4136-82dd-14201100b4ea" containerName="cinder-db-sync" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.636452 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87658ca-ad68-4136-82dd-14201100b4ea" containerName="cinder-db-sync" Nov 22 08:42:56 crc kubenswrapper[4743]: E1122 08:42:56.636466 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1be60a-df67-4846-90c8-a53fb6acd7f8" containerName="dnsmasq-dns" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.636473 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1be60a-df67-4846-90c8-a53fb6acd7f8" containerName="dnsmasq-dns" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.636746 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1be60a-df67-4846-90c8-a53fb6acd7f8" containerName="dnsmasq-dns" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.636768 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87658ca-ad68-4136-82dd-14201100b4ea" containerName="cinder-db-sync" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.638248 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.643072 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.643175 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.667407 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dcbbd6f66-kjrm8"] Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.737434 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tv7j\" (UniqueName: \"kubernetes.io/projected/dc034ce8-656e-4c88-92f1-18f384ae1a18-kube-api-access-5tv7j\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.737514 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc034ce8-656e-4c88-92f1-18f384ae1a18-logs\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.737552 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-combined-ca-bundle\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.737618 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.737644 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-public-tls-certs\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.737734 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-internal-tls-certs\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.737773 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data-custom\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.839641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-internal-tls-certs\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.839691 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data-custom\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.839749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tv7j\" (UniqueName: \"kubernetes.io/projected/dc034ce8-656e-4c88-92f1-18f384ae1a18-kube-api-access-5tv7j\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.839781 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc034ce8-656e-4c88-92f1-18f384ae1a18-logs\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.839802 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-combined-ca-bundle\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.839829 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc 
kubenswrapper[4743]: I1122 08:42:56.839847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-public-tls-certs\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.840456 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc034ce8-656e-4c88-92f1-18f384ae1a18-logs\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.844943 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-internal-tls-certs\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.845521 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-combined-ca-bundle\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.846821 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-public-tls-certs\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.847005 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.854152 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data-custom\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.861274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tv7j\" (UniqueName: \"kubernetes.io/projected/dc034ce8-656e-4c88-92f1-18f384ae1a18-kube-api-access-5tv7j\") pod \"barbican-api-6dcbbd6f66-kjrm8\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:56 crc kubenswrapper[4743]: I1122 08:42:56.954936 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.167458 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b1be60a-df67-4846-90c8-a53fb6acd7f8" path="/var/lib/kubelet/pods/6b1be60a-df67-4846-90c8-a53fb6acd7f8/volumes" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.244205 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" event={"ID":"89d8e638-b97a-4273-9391-5e0c7dd1bfb1","Type":"ContainerStarted","Data":"b2dbbf998042cd2c9fe978946c70615b254e4f75ec35ea7cfe22feace7d787f6"} Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.248071 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" event={"ID":"8fe5d70f-5277-4803-ae45-de61d0eefe27","Type":"ContainerStarted","Data":"b87c8deb0c6f1f3c1134e38ff7289f1edfe2b60e90ae6b47a46057bcb212868c"} Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.250444 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-sj8hg" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.250915 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sj8hg" event={"ID":"a87658ca-ad68-4136-82dd-14201100b4ea","Type":"ContainerDied","Data":"ceac13374c9a82fa11f67e9670abf103a4bd8acb26242db71aad86b0b0b24720"} Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.250934 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceac13374c9a82fa11f67e9670abf103a4bd8acb26242db71aad86b0b0b24720" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.250950 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.251401 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.489660 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dcbbd6f66-kjrm8"] Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.642836 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.645056 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.648960 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.649245 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.649460 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-2bw4c" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.649689 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.652976 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.747556 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jkt28"] Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.756627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.756699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qltmq\" (UniqueName: \"kubernetes.io/projected/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-kube-api-access-qltmq\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.756730 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.756781 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.756857 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.756922 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-scripts\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.771925 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w9dgz"] Nov 22 08:42:57 crc 
kubenswrapper[4743]: I1122 08:42:57.773872 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.798852 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w9dgz"] Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.858699 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-scripts\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.858787 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.858834 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qltmq\" (UniqueName: \"kubernetes.io/projected/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-kube-api-access-qltmq\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.858861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.858909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.858989 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.866240 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.866332 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.867983 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.875255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.875907 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.897288 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qltmq\" (UniqueName: \"kubernetes.io/projected/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-kube-api-access-qltmq\") pod \"cinder-scheduler-0\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " pod="openstack/cinder-scheduler-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.960764 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-svc\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.960896 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tdfd\" (UniqueName: \"kubernetes.io/projected/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-kube-api-access-5tdfd\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.960918 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-config\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.960957 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.960984 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.961004 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: 
\"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.979714 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.981516 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.984948 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 08:42:57 crc kubenswrapper[4743]: I1122 08:42:57.989944 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.011378 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.070269 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tdfd\" (UniqueName: \"kubernetes.io/projected/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-kube-api-access-5tdfd\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.070314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-config\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.070392 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.070441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.070462 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.070538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-svc\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.071750 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-svc\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " 
pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.071855 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.072701 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.073314 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.074141 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-config\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.096844 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tdfd\" (UniqueName: \"kubernetes.io/projected/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-kube-api-access-5tdfd\") pod \"dnsmasq-dns-6578955fd5-w9dgz\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.128413 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.173642 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-scripts\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.174092 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gkbc\" (UniqueName: \"kubernetes.io/projected/e87c4744-cc03-4cbf-929a-45c912d64c0c-kube-api-access-7gkbc\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.174116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data-custom\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.174193 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e87c4744-cc03-4cbf-929a-45c912d64c0c-logs\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.174252 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.174318 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.174417 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e87c4744-cc03-4cbf-929a-45c912d64c0c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.278168 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-scripts\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.278236 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gkbc\" (UniqueName: \"kubernetes.io/projected/e87c4744-cc03-4cbf-929a-45c912d64c0c-kube-api-access-7gkbc\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.278259 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data-custom\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.278280 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e87c4744-cc03-4cbf-929a-45c912d64c0c-logs\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.280960 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e87c4744-cc03-4cbf-929a-45c912d64c0c-logs\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.281848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.281940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.282162 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e87c4744-cc03-4cbf-929a-45c912d64c0c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.284014 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e87c4744-cc03-4cbf-929a-45c912d64c0c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.296855 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data-custom\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.297313 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-scripts\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.298254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" event={"ID":"89d8e638-b97a-4273-9391-5e0c7dd1bfb1","Type":"ContainerStarted","Data":"c4a492c46b22ecd2cb2ce30f4d5cbacdf5b41359fdcc9e9ef3d84f92e3284551"} Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.300839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.300863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gkbc\" (UniqueName: \"kubernetes.io/projected/e87c4744-cc03-4cbf-929a-45c912d64c0c-kube-api-access-7gkbc\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.301697 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data\") pod \"cinder-api-0\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.307521 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" event={"ID":"8fe5d70f-5277-4803-ae45-de61d0eefe27","Type":"ContainerStarted","Data":"596f95b1d0cc9abb230b4c2a4a8c4b0c1af12cc6eed82f9960b7ca6e13289379"} Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.321875 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" podStartSLOduration=2.912786284 podStartE2EDuration="5.321858123s" podCreationTimestamp="2025-11-22 08:42:53 +0000 UTC" firstStartedPulling="2025-11-22 08:42:54.409731424 +0000 UTC m=+1248.116092476" lastFinishedPulling="2025-11-22 08:42:56.818803263 +0000 UTC m=+1250.525164315" observedRunningTime="2025-11-22 08:42:58.317563939 +0000 UTC m=+1252.023924991" watchObservedRunningTime="2025-11-22 08:42:58.321858123 +0000 UTC m=+1252.028219175" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.327665 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" event={"ID":"dc034ce8-656e-4c88-92f1-18f384ae1a18","Type":"ContainerStarted","Data":"0febb6e2d7ff4813fd6df7b99de1a803ade35dd751b487778b4585a6c0ce4d64"} Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.327718 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" event={"ID":"dc034ce8-656e-4c88-92f1-18f384ae1a18","Type":"ContainerStarted","Data":"5c99c956fa088361d1835204709a0f50ffb52b9cc07253eced2b8fa90aa30577"} Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.354817 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" podStartSLOduration=2.861205665 podStartE2EDuration="5.354793544s" podCreationTimestamp="2025-11-22 08:42:53 +0000 UTC" firstStartedPulling="2025-11-22 08:42:54.315760691 +0000 UTC m=+1248.022121743" lastFinishedPulling="2025-11-22 08:42:56.80934856 +0000 UTC m=+1250.515709622" observedRunningTime="2025-11-22 08:42:58.344780615 +0000 UTC m=+1252.051141667" watchObservedRunningTime="2025-11-22 08:42:58.354793544 +0000 UTC m=+1252.061154596" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.436365 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.547394 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:42:58 crc kubenswrapper[4743]: W1122 08:42:58.625913 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78b5ac9d_e133_4b32_9b44_fc7b9ff3f71f.slice/crio-38e512ce4f1a232415b02044237d1d6b7f269311c082095169fa45a05136c8bd WatchSource:0}: Error finding container 38e512ce4f1a232415b02044237d1d6b7f269311c082095169fa45a05136c8bd: Status 404 returned error can't find the container with id 38e512ce4f1a232415b02044237d1d6b7f269311c082095169fa45a05136c8bd Nov 22 08:42:58 crc kubenswrapper[4743]: I1122 08:42:58.729730 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w9dgz"] Nov 22 08:42:59 crc kubenswrapper[4743]: I1122 08:42:59.117951 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 08:42:59 crc kubenswrapper[4743]: I1122 08:42:59.340416 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e87c4744-cc03-4cbf-929a-45c912d64c0c","Type":"ContainerStarted","Data":"c24a6bb0427bf976ea1a858a32570c3fb95a6b05ecb2f46e613202c382c34884"} Nov 22 08:42:59 crc kubenswrapper[4743]: I1122 08:42:59.342783 4743 generic.go:334] "Generic (PLEG): container finished" podID="8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" containerID="a20fdbb0f842e51ec330438640c16b55c9539f6c5fb956a8d31a2287c2f59d62" exitCode=0 Nov 22 08:42:59 crc kubenswrapper[4743]: I1122 08:42:59.342840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" event={"ID":"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe","Type":"ContainerDied","Data":"a20fdbb0f842e51ec330438640c16b55c9539f6c5fb956a8d31a2287c2f59d62"} Nov 22 08:42:59 crc kubenswrapper[4743]: I1122 08:42:59.342876 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" event={"ID":"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe","Type":"ContainerStarted","Data":"31c6b04926ef4af3fc66bf22ce89d3b4aac4b2a07bc7856bc783fa19ba616a81"} Nov 22 08:42:59 crc kubenswrapper[4743]: I1122 08:42:59.344603 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f","Type":"ContainerStarted","Data":"38e512ce4f1a232415b02044237d1d6b7f269311c082095169fa45a05136c8bd"} Nov 22 08:42:59 crc kubenswrapper[4743]: I1122 08:42:59.355507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" event={"ID":"dc034ce8-656e-4c88-92f1-18f384ae1a18","Type":"ContainerStarted","Data":"455de173684d6834930eabe9a480ac739569a3760776782c2c81d8591d036411"} Nov 22 08:42:59 crc kubenswrapper[4743]: I1122 08:42:59.355853 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" podUID="d2a8e670-15d3-4d05-b0c9-386ec9befc9a" containerName="dnsmasq-dns" containerID="cri-o://7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10" gracePeriod=10 Nov 22 08:42:59 crc kubenswrapper[4743]: I1122 08:42:59.394075 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" podStartSLOduration=3.394032141 podStartE2EDuration="3.394032141s" podCreationTimestamp="2025-11-22 08:42:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:42:59.389950103 +0000 UTC m=+1253.096311155" watchObservedRunningTime="2025-11-22 08:42:59.394032141 +0000 UTC m=+1253.100393203" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.141154 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.324497 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-sb\") pod \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.324672 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-nb\") pod \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.324729 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-swift-storage-0\") pod \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.324756 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq8fn\" (UniqueName: \"kubernetes.io/projected/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-kube-api-access-jq8fn\") pod \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.324803 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-svc\") pod \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.324833 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-config\") pod \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\" (UID: \"d2a8e670-15d3-4d05-b0c9-386ec9befc9a\") " Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.331190 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-kube-api-access-jq8fn" (OuterVolumeSpecName: "kube-api-access-jq8fn") pod "d2a8e670-15d3-4d05-b0c9-386ec9befc9a" (UID: "d2a8e670-15d3-4d05-b0c9-386ec9befc9a"). InnerVolumeSpecName "kube-api-access-jq8fn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.390514 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e87c4744-cc03-4cbf-929a-45c912d64c0c","Type":"ContainerStarted","Data":"4d28bbe29fc448e35abc64e8a734311fe6a648e6bd6b421e1355130eba81ef7e"} Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.391093 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-config" (OuterVolumeSpecName: "config") pod "d2a8e670-15d3-4d05-b0c9-386ec9befc9a" (UID: "d2a8e670-15d3-4d05-b0c9-386ec9befc9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.393054 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d2a8e670-15d3-4d05-b0c9-386ec9befc9a" (UID: "d2a8e670-15d3-4d05-b0c9-386ec9befc9a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.405962 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" event={"ID":"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe","Type":"ContainerStarted","Data":"e9e305800baf94abf462f598104fa32c5ed7dcf8670598fe185ffc0786bbcc6a"} Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.406041 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.418319 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f","Type":"ContainerStarted","Data":"fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d"} Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.424531 4743 generic.go:334] "Generic (PLEG): container finished" podID="d2a8e670-15d3-4d05-b0c9-386ec9befc9a" containerID="7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10" exitCode=0 Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.424769 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" event={"ID":"d2a8e670-15d3-4d05-b0c9-386ec9befc9a","Type":"ContainerDied","Data":"7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10"} Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.424854 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" event={"ID":"d2a8e670-15d3-4d05-b0c9-386ec9befc9a","Type":"ContainerDied","Data":"75d1b150c5e5d1171c534d475a426f13fd421952b42cf2e9d47e30e7356baaba"} Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.424881 4743 scope.go:117] "RemoveContainer" containerID="7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.425133 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jkt28" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.426032 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.426121 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.426684 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" podStartSLOduration=3.426673786 podStartE2EDuration="3.426673786s" podCreationTimestamp="2025-11-22 08:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:43:00.424450612 +0000 UTC m=+1254.130811664" watchObservedRunningTime="2025-11-22 08:43:00.426673786 +0000 UTC m=+1254.133034838" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.429304 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.429330 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq8fn\" (UniqueName: \"kubernetes.io/projected/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-kube-api-access-jq8fn\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.429339 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.444026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d2a8e670-15d3-4d05-b0c9-386ec9befc9a" (UID: "d2a8e670-15d3-4d05-b0c9-386ec9befc9a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.444271 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d2a8e670-15d3-4d05-b0c9-386ec9befc9a" (UID: "d2a8e670-15d3-4d05-b0c9-386ec9befc9a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.445336 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d2a8e670-15d3-4d05-b0c9-386ec9befc9a" (UID: "d2a8e670-15d3-4d05-b0c9-386ec9befc9a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.518557 4743 scope.go:117] "RemoveContainer" containerID="ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.533341 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.533378 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.533391 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d2a8e670-15d3-4d05-b0c9-386ec9befc9a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.561647 4743 scope.go:117] "RemoveContainer" containerID="7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10" Nov 22 08:43:00 crc kubenswrapper[4743]: E1122 08:43:00.562462 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10\": container with ID starting with 7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10 not found: ID does not exist" containerID="7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.562532 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10"} err="failed to get container status \"7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10\": rpc error: code = NotFound desc = could not find container \"7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10\": container with ID starting with 7bfeca78c33b7a526ed6b6d9739fc9cbea10e0cde56226e95a262d802f29dc10 not found: ID does not exist" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.562597 4743 scope.go:117] "RemoveContainer" containerID="ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73" Nov 22 08:43:00 crc kubenswrapper[4743]: E1122 08:43:00.563008 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73\": container with ID starting with ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73 not found: ID does not exist" containerID="ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73" Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.563082 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73"} err="failed to get container status \"ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73\": rpc error: code = NotFound desc = could not find container \"ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73\": container with ID starting with ec7f86e85324dfa871f3385b267ae5ba3eb8418125b9b1c3019341d965463a73 not found: ID does not exist" Nov 22 08:43:00 crc kubenswrapper[4743]: 
I1122 08:43:00.593540 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.776675 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jkt28"] Nov 22 08:43:00 crc kubenswrapper[4743]: I1122 08:43:00.792613 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jkt28"] Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.164791 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a8e670-15d3-4d05-b0c9-386ec9befc9a" path="/var/lib/kubelet/pods/d2a8e670-15d3-4d05-b0c9-386ec9befc9a/volumes" Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.241765 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.241831 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.241885 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.242680 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5b78c811f3b4d026db1f9c1117668378bf529a268e8a2883a781fcb01b039225"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.242747 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://5b78c811f3b4d026db1f9c1117668378bf529a268e8a2883a781fcb01b039225" gracePeriod=600 Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.445997 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e87c4744-cc03-4cbf-929a-45c912d64c0c","Type":"ContainerStarted","Data":"1353a3a23996d88dd42b52f243bc37775202d4377fb8072678af08d9a69e8b34"} Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.446173 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e87c4744-cc03-4cbf-929a-45c912d64c0c" containerName="cinder-api-log" containerID="cri-o://4d28bbe29fc448e35abc64e8a734311fe6a648e6bd6b421e1355130eba81ef7e" gracePeriod=30 Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.446241 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.446595 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e87c4744-cc03-4cbf-929a-45c912d64c0c" containerName="cinder-api" 
containerID="cri-o://1353a3a23996d88dd42b52f243bc37775202d4377fb8072678af08d9a69e8b34" gracePeriod=30 Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.453057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f","Type":"ContainerStarted","Data":"e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4"} Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.474808 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="5b78c811f3b4d026db1f9c1117668378bf529a268e8a2883a781fcb01b039225" exitCode=0 Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.475768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"5b78c811f3b4d026db1f9c1117668378bf529a268e8a2883a781fcb01b039225"} Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.475804 4743 scope.go:117] "RemoveContainer" containerID="86f8f3accbf0662fa413321f99b5f1afa28e63a1e90c6983235baff64b7561bc" Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.513438 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.513417105 podStartE2EDuration="4.513417105s" podCreationTimestamp="2025-11-22 08:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:43:01.484231443 +0000 UTC m=+1255.190592495" watchObservedRunningTime="2025-11-22 08:43:01.513417105 +0000 UTC m=+1255.219778157" Nov 22 08:43:01 crc kubenswrapper[4743]: I1122 08:43:01.524351 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.866030512 podStartE2EDuration="4.52432924s" podCreationTimestamp="2025-11-22 08:42:57 +0000 UTC" firstStartedPulling="2025-11-22 08:42:58.628709603 +0000 UTC m=+1252.335070665" lastFinishedPulling="2025-11-22 08:42:59.287008341 +0000 UTC m=+1252.993369393" observedRunningTime="2025-11-22 08:43:01.510318076 +0000 UTC m=+1255.216679148" watchObservedRunningTime="2025-11-22 08:43:01.52432924 +0000 UTC m=+1255.230690302" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.486631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"986e2145b00a5a447ecb09e84f860b781baabf3cc2562b60d26d99a571cd2cc8"} Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.488610 4743 generic.go:334] "Generic (PLEG): container finished" podID="e87c4744-cc03-4cbf-929a-45c912d64c0c" containerID="1353a3a23996d88dd42b52f243bc37775202d4377fb8072678af08d9a69e8b34" exitCode=0 Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.488640 4743 generic.go:334] "Generic (PLEG): container finished" podID="e87c4744-cc03-4cbf-929a-45c912d64c0c" containerID="4d28bbe29fc448e35abc64e8a734311fe6a648e6bd6b421e1355130eba81ef7e" exitCode=143 Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.488680 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e87c4744-cc03-4cbf-929a-45c912d64c0c","Type":"ContainerDied","Data":"1353a3a23996d88dd42b52f243bc37775202d4377fb8072678af08d9a69e8b34"} Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 
08:43:02.488728 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e87c4744-cc03-4cbf-929a-45c912d64c0c","Type":"ContainerDied","Data":"4d28bbe29fc448e35abc64e8a734311fe6a648e6bd6b421e1355130eba81ef7e"} Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.488740 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e87c4744-cc03-4cbf-929a-45c912d64c0c","Type":"ContainerDied","Data":"c24a6bb0427bf976ea1a858a32570c3fb95a6b05ecb2f46e613202c382c34884"} Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.488752 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c24a6bb0427bf976ea1a858a32570c3fb95a6b05ecb2f46e613202c382c34884" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.495907 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.616921 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data-custom\") pod \"e87c4744-cc03-4cbf-929a-45c912d64c0c\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.617374 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e87c4744-cc03-4cbf-929a-45c912d64c0c-etc-machine-id\") pod \"e87c4744-cc03-4cbf-929a-45c912d64c0c\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.617423 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e87c4744-cc03-4cbf-929a-45c912d64c0c-logs\") pod \"e87c4744-cc03-4cbf-929a-45c912d64c0c\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.617503 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-scripts\") pod \"e87c4744-cc03-4cbf-929a-45c912d64c0c\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.617504 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e87c4744-cc03-4cbf-929a-45c912d64c0c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e87c4744-cc03-4cbf-929a-45c912d64c0c" (UID: "e87c4744-cc03-4cbf-929a-45c912d64c0c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.617637 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gkbc\" (UniqueName: \"kubernetes.io/projected/e87c4744-cc03-4cbf-929a-45c912d64c0c-kube-api-access-7gkbc\") pod \"e87c4744-cc03-4cbf-929a-45c912d64c0c\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.617684 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data\") pod \"e87c4744-cc03-4cbf-929a-45c912d64c0c\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.617716 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-combined-ca-bundle\") pod \"e87c4744-cc03-4cbf-929a-45c912d64c0c\" (UID: \"e87c4744-cc03-4cbf-929a-45c912d64c0c\") " Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.617928 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e87c4744-cc03-4cbf-929a-45c912d64c0c-logs" (OuterVolumeSpecName: "logs") pod "e87c4744-cc03-4cbf-929a-45c912d64c0c" (UID: "e87c4744-cc03-4cbf-929a-45c912d64c0c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.618308 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e87c4744-cc03-4cbf-929a-45c912d64c0c-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.618334 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e87c4744-cc03-4cbf-929a-45c912d64c0c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.624039 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e87c4744-cc03-4cbf-929a-45c912d64c0c" (UID: "e87c4744-cc03-4cbf-929a-45c912d64c0c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.638611 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e87c4744-cc03-4cbf-929a-45c912d64c0c-kube-api-access-7gkbc" (OuterVolumeSpecName: "kube-api-access-7gkbc") pod "e87c4744-cc03-4cbf-929a-45c912d64c0c" (UID: "e87c4744-cc03-4cbf-929a-45c912d64c0c"). InnerVolumeSpecName "kube-api-access-7gkbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.640436 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-scripts" (OuterVolumeSpecName: "scripts") pod "e87c4744-cc03-4cbf-929a-45c912d64c0c" (UID: "e87c4744-cc03-4cbf-929a-45c912d64c0c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.649379 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e87c4744-cc03-4cbf-929a-45c912d64c0c" (UID: "e87c4744-cc03-4cbf-929a-45c912d64c0c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.679956 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data" (OuterVolumeSpecName: "config-data") pod "e87c4744-cc03-4cbf-929a-45c912d64c0c" (UID: "e87c4744-cc03-4cbf-929a-45c912d64c0c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.719935 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.719980 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gkbc\" (UniqueName: \"kubernetes.io/projected/e87c4744-cc03-4cbf-929a-45c912d64c0c-kube-api-access-7gkbc\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.719990 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.720000 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:02 crc kubenswrapper[4743]: I1122 08:43:02.720009 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e87c4744-cc03-4cbf-929a-45c912d64c0c-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.011987 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.497076 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.526339 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.534008 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.556815 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 08:43:03 crc kubenswrapper[4743]: E1122 08:43:03.557466 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87c4744-cc03-4cbf-929a-45c912d64c0c" containerName="cinder-api" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.557490 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87c4744-cc03-4cbf-929a-45c912d64c0c" containerName="cinder-api" Nov 22 08:43:03 crc kubenswrapper[4743]: E1122 08:43:03.558669 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e87c4744-cc03-4cbf-929a-45c912d64c0c" containerName="cinder-api-log" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.558679 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e87c4744-cc03-4cbf-929a-45c912d64c0c" containerName="cinder-api-log" Nov 22 08:43:03 crc kubenswrapper[4743]: E1122 08:43:03.558708 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a8e670-15d3-4d05-b0c9-386ec9befc9a" containerName="dnsmasq-dns" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.558715 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a8e670-15d3-4d05-b0c9-386ec9befc9a" containerName="dnsmasq-dns" Nov 22 08:43:03 crc kubenswrapper[4743]: E1122 08:43:03.558728 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a8e670-15d3-4d05-b0c9-386ec9befc9a" containerName="init" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.558734 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a8e670-15d3-4d05-b0c9-386ec9befc9a" containerName="init" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.558936 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87c4744-cc03-4cbf-929a-45c912d64c0c" containerName="cinder-api-log" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.558952 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e87c4744-cc03-4cbf-929a-45c912d64c0c" containerName="cinder-api" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.558962 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a8e670-15d3-4d05-b0c9-386ec9befc9a" containerName="dnsmasq-dns" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.561153 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.567395 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.567492 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.567394 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.570264 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.737096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.737160 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.737189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.737319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.737404 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29bf9036-d8fc-43f7-9153-f133a723c6df-etc-machine-id\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.737447 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.737476 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29bf9036-d8fc-43f7-9153-f133a723c6df-logs\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.737559 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgcsc\" (UniqueName: 
\"kubernetes.io/projected/29bf9036-d8fc-43f7-9153-f133a723c6df-kube-api-access-fgcsc\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.737660 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-public-tls-certs\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.839973 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgcsc\" (UniqueName: \"kubernetes.io/projected/29bf9036-d8fc-43f7-9153-f133a723c6df-kube-api-access-fgcsc\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.840341 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-public-tls-certs\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.840481 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.840519 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.840558 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.840607 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.840650 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29bf9036-d8fc-43f7-9153-f133a723c6df-etc-machine-id\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.840675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.840701 4743 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29bf9036-d8fc-43f7-9153-f133a723c6df-logs\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.841331 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29bf9036-d8fc-43f7-9153-f133a723c6df-logs\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.843647 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29bf9036-d8fc-43f7-9153-f133a723c6df-etc-machine-id\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.847084 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.851870 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.852368 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.853069 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-public-tls-certs\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.854147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.859384 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgcsc\" (UniqueName: \"kubernetes.io/projected/29bf9036-d8fc-43f7-9153-f133a723c6df-kube-api-access-fgcsc\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.868044 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom\") pod \"cinder-api-0\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") " pod="openstack/cinder-api-0" Nov 22 08:43:03 crc kubenswrapper[4743]: I1122 08:43:03.896223 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 08:43:04 crc kubenswrapper[4743]: I1122 08:43:04.394059 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 08:43:04 crc kubenswrapper[4743]: I1122 08:43:04.507178 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29bf9036-d8fc-43f7-9153-f133a723c6df","Type":"ContainerStarted","Data":"fb1a2f68abea61d3fee74635e859d7ba84e53bbf1e5a5f005f35b7ba574c63c1"} Nov 22 08:43:05 crc kubenswrapper[4743]: I1122 08:43:05.163267 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e87c4744-cc03-4cbf-929a-45c912d64c0c" path="/var/lib/kubelet/pods/e87c4744-cc03-4cbf-929a-45c912d64c0c/volumes" Nov 22 08:43:05 crc kubenswrapper[4743]: I1122 08:43:05.168976 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Nov 22 08:43:05 crc kubenswrapper[4743]: I1122 08:43:05.523128 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29bf9036-d8fc-43f7-9153-f133a723c6df","Type":"ContainerStarted","Data":"bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7"} Nov 22 08:43:05 crc kubenswrapper[4743]: I1122 08:43:05.741225 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:43:05 crc kubenswrapper[4743]: I1122 08:43:05.753550 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:43:06 crc kubenswrapper[4743]: I1122 08:43:06.537077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29bf9036-d8fc-43f7-9153-f133a723c6df","Type":"ContainerStarted","Data":"85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895"} Nov 22 08:43:06 crc kubenswrapper[4743]: I1122 08:43:06.537147 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 22 08:43:06 crc kubenswrapper[4743]: I1122 08:43:06.567050 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.5670311630000002 podStartE2EDuration="3.567031163s" podCreationTimestamp="2025-11-22 08:43:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:43:06.562342528 +0000 UTC m=+1260.268703580" watchObservedRunningTime="2025-11-22 08:43:06.567031163 +0000 UTC m=+1260.273392215" Nov 22 08:43:06 crc kubenswrapper[4743]: I1122 08:43:06.965833 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:43:07 crc kubenswrapper[4743]: I1122 08:43:07.523435 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:43:07 crc kubenswrapper[4743]: I1122 08:43:07.571092 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.132787 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.205405 4743 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l6b4p"] Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.205665 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" podUID="e7744a72-96d3-43bf-89f3-c56ae2a47cdf" containerName="dnsmasq-dns" containerID="cri-o://ed1bd93819af5eff5fb7ea42bdeb4f410e196149a63cb232863696e0b6a769c3" gracePeriod=10 Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.394718 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.459847 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.570886 4743 generic.go:334] "Generic (PLEG): container finished" podID="e7744a72-96d3-43bf-89f3-c56ae2a47cdf" containerID="ed1bd93819af5eff5fb7ea42bdeb4f410e196149a63cb232863696e0b6a769c3" exitCode=0 Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.571119 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" containerName="cinder-scheduler" containerID="cri-o://fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d" gracePeriod=30 Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.571201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" event={"ID":"e7744a72-96d3-43bf-89f3-c56ae2a47cdf","Type":"ContainerDied","Data":"ed1bd93819af5eff5fb7ea42bdeb4f410e196149a63cb232863696e0b6a769c3"} Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.571252 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" containerName="probe" containerID="cri-o://e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4" gracePeriod=30 Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.832418 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.955662 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-svc\") pod \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.955743 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-sb\") pod \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.955814 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-swift-storage-0\") pod \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.955985 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-config\") pod \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.956005 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-nb\") pod \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.956032 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5nf\" (UniqueName: \"kubernetes.io/projected/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-kube-api-access-qg5nf\") pod \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\" (UID: \"e7744a72-96d3-43bf-89f3-c56ae2a47cdf\") " Nov 22 08:43:08 crc kubenswrapper[4743]: I1122 08:43:08.962943 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-kube-api-access-qg5nf" (OuterVolumeSpecName: "kube-api-access-qg5nf") pod "e7744a72-96d3-43bf-89f3-c56ae2a47cdf" (UID: "e7744a72-96d3-43bf-89f3-c56ae2a47cdf"). InnerVolumeSpecName "kube-api-access-qg5nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.013950 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.045459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e7744a72-96d3-43bf-89f3-c56ae2a47cdf" (UID: "e7744a72-96d3-43bf-89f3-c56ae2a47cdf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.058911 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5nf\" (UniqueName: \"kubernetes.io/projected/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-kube-api-access-qg5nf\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.058950 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.068478 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-config" (OuterVolumeSpecName: "config") pod "e7744a72-96d3-43bf-89f3-c56ae2a47cdf" (UID: "e7744a72-96d3-43bf-89f3-c56ae2a47cdf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.076030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e7744a72-96d3-43bf-89f3-c56ae2a47cdf" (UID: "e7744a72-96d3-43bf-89f3-c56ae2a47cdf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.078850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e7744a72-96d3-43bf-89f3-c56ae2a47cdf" (UID: "e7744a72-96d3-43bf-89f3-c56ae2a47cdf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.108704 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e7744a72-96d3-43bf-89f3-c56ae2a47cdf" (UID: "e7744a72-96d3-43bf-89f3-c56ae2a47cdf"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.160559 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.160607 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.160617 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.160625 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e7744a72-96d3-43bf-89f3-c56ae2a47cdf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.229616 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.303734 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58c8b4cc8-lgv67"] Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.304245 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58c8b4cc8-lgv67" podUID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" containerName="barbican-api" containerID="cri-o://c53c143e140439571a5c21151cef803359eb1910e81614489e491c618d3928bd" gracePeriod=30 Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.304176 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-58c8b4cc8-lgv67" podUID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" containerName="barbican-api-log" containerID="cri-o://789ae711beac7ce8ccb05e661fd1b5b173a919c21b3d48b0c864746ee72cd73f" gracePeriod=30 Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.595193 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 22 08:43:09 crc kubenswrapper[4743]: E1122 08:43:09.595914 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7744a72-96d3-43bf-89f3-c56ae2a47cdf" containerName="dnsmasq-dns" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.595927 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7744a72-96d3-43bf-89f3-c56ae2a47cdf" containerName="dnsmasq-dns" Nov 22 08:43:09 crc kubenswrapper[4743]: E1122 08:43:09.595946 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7744a72-96d3-43bf-89f3-c56ae2a47cdf" containerName="init" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.595964 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7744a72-96d3-43bf-89f3-c56ae2a47cdf" containerName="init" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.596220 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7744a72-96d3-43bf-89f3-c56ae2a47cdf" containerName="dnsmasq-dns" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.596837 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.601772 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.601916 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-h7ltk" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.602859 4743 generic.go:334] "Generic (PLEG): container finished" podID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" containerID="789ae711beac7ce8ccb05e661fd1b5b173a919c21b3d48b0c864746ee72cd73f" exitCode=143 Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.603004 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58c8b4cc8-lgv67" event={"ID":"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82","Type":"ContainerDied","Data":"789ae711beac7ce8ccb05e661fd1b5b173a919c21b3d48b0c864746ee72cd73f"} Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.603733 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.615499 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.618917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" event={"ID":"e7744a72-96d3-43bf-89f3-c56ae2a47cdf","Type":"ContainerDied","Data":"3c486e2934e50b55097a78c1d13a3a8890a5652e0f8cb4233ea0d3f038a7f888"} Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.618979 4743 scope.go:117] "RemoveContainer" containerID="ed1bd93819af5eff5fb7ea42bdeb4f410e196149a63cb232863696e0b6a769c3" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.619165 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-l6b4p" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.639503 4743 generic.go:334] "Generic (PLEG): container finished" podID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" containerID="e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4" exitCode=0 Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.639905 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f","Type":"ContainerDied","Data":"e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4"} Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.661638 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l6b4p"] Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.665374 4743 scope.go:117] "RemoveContainer" containerID="61ff329d47b92a7e6d486e0d734bc9efe3bd93906e103e58e12a058cb93825e8" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.667603 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.667727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.667787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff594\" (UniqueName: \"kubernetes.io/projected/5987ad61-2878-4efc-98ca-ea29b123f26e-kube-api-access-ff594\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.667857 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config-secret\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.669040 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-l6b4p"] Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.769437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config-secret\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.771706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.773530 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.773618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff594\" (UniqueName: \"kubernetes.io/projected/5987ad61-2878-4efc-98ca-ea29b123f26e-kube-api-access-ff594\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.775468 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.781251 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config-secret\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.788324 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.815233 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff594\" (UniqueName: \"kubernetes.io/projected/5987ad61-2878-4efc-98ca-ea29b123f26e-kube-api-access-ff594\") pod \"openstackclient\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " pod="openstack/openstackclient" Nov 22 08:43:09 crc kubenswrapper[4743]: I1122 08:43:09.934690 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 08:43:10 crc kubenswrapper[4743]: I1122 08:43:10.435211 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 08:43:10 crc kubenswrapper[4743]: W1122 08:43:10.437434 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5987ad61_2878_4efc_98ca_ea29b123f26e.slice/crio-99dd846f2bd260ec1d96421239aa5d7f6d473f8d72dfc7715e12e47b4055489a WatchSource:0}: Error finding container 99dd846f2bd260ec1d96421239aa5d7f6d473f8d72dfc7715e12e47b4055489a: Status 404 returned error can't find the container with id 99dd846f2bd260ec1d96421239aa5d7f6d473f8d72dfc7715e12e47b4055489a Nov 22 08:43:10 crc kubenswrapper[4743]: I1122 08:43:10.650918 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5987ad61-2878-4efc-98ca-ea29b123f26e","Type":"ContainerStarted","Data":"99dd846f2bd260ec1d96421239aa5d7f6d473f8d72dfc7715e12e47b4055489a"} Nov 22 08:43:11 crc kubenswrapper[4743]: I1122 08:43:11.170402 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7744a72-96d3-43bf-89f3-c56ae2a47cdf" path="/var/lib/kubelet/pods/e7744a72-96d3-43bf-89f3-c56ae2a47cdf/volumes" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.584091 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.693367 4743 generic.go:334] "Generic (PLEG): container finished" podID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" containerID="fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d" exitCode=0 Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.693432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f","Type":"ContainerDied","Data":"fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d"} Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.693463 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f","Type":"ContainerDied","Data":"38e512ce4f1a232415b02044237d1d6b7f269311c082095169fa45a05136c8bd"} Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.693484 4743 scope.go:117] "RemoveContainer" containerID="e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.693589 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.698985 4743 generic.go:334] "Generic (PLEG): container finished" podID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" containerID="c53c143e140439571a5c21151cef803359eb1910e81614489e491c618d3928bd" exitCode=0 Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.699063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58c8b4cc8-lgv67" event={"ID":"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82","Type":"ContainerDied","Data":"c53c143e140439571a5c21151cef803359eb1910e81614489e491c618d3928bd"} Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.726354 4743 scope.go:117] "RemoveContainer" containerID="fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.735326 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data\") pod \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.735396 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-etc-machine-id\") pod \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.735427 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-scripts\") pod \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.735528 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-combined-ca-bundle\") pod \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.735687 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qltmq\" (UniqueName: \"kubernetes.io/projected/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-kube-api-access-qltmq\") pod \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.735718 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data-custom\") pod \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\" (UID: \"78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f\") " Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.737092 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" (UID: "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.747411 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" (UID: "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.754422 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-scripts" (OuterVolumeSpecName: "scripts") pod "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" (UID: "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.762792 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-kube-api-access-qltmq" (OuterVolumeSpecName: "kube-api-access-qltmq") pod "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" (UID: "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f"). InnerVolumeSpecName "kube-api-access-qltmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.784924 4743 scope.go:117] "RemoveContainer" containerID="e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4" Nov 22 08:43:12 crc kubenswrapper[4743]: E1122 08:43:12.785378 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4\": container with ID starting with e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4 not found: ID does not exist" containerID="e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.785414 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4"} err="failed to get container status \"e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4\": rpc error: code = NotFound desc = could not find container \"e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4\": container with ID starting with e68c207b4fa306e289421797665f0ee1bcc7bde62918a5d46dfc3980ee088cc4 not found: ID does not exist" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.785438 4743 scope.go:117] "RemoveContainer" containerID="fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d" Nov 22 08:43:12 crc kubenswrapper[4743]: E1122 08:43:12.785767 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d\": container with ID starting with fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d not found: ID does not exist" containerID="fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.785792 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d"} err="failed to get container status 
\"fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d\": rpc error: code = NotFound desc = could not find container \"fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d\": container with ID starting with fac4ffe8c87c5cd5cb700655b2f655c9999d6dce1e1985a610187688db06383d not found: ID does not exist" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.840270 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.840301 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.840313 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qltmq\" (UniqueName: \"kubernetes.io/projected/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-kube-api-access-qltmq\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.840324 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.845998 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" (UID: "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.933423 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data" (OuterVolumeSpecName: "config-data") pod "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" (UID: "78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.942393 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.942423 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:12 crc kubenswrapper[4743]: I1122 08:43:12.979744 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.038876 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.056400 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.068737 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:43:13 crc kubenswrapper[4743]: E1122 08:43:13.069116 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" containerName="cinder-scheduler" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.069133 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" containerName="cinder-scheduler" Nov 22 08:43:13 crc kubenswrapper[4743]: E1122 08:43:13.069151 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" containerName="probe" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.069159 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" containerName="probe" Nov 22 08:43:13 crc kubenswrapper[4743]: E1122 08:43:13.069186 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" containerName="barbican-api-log" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.069193 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" containerName="barbican-api-log" Nov 22 08:43:13 crc kubenswrapper[4743]: E1122 08:43:13.069209 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" containerName="barbican-api" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.069215 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" containerName="barbican-api" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.069384 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" containerName="cinder-scheduler" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.069398 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" containerName="barbican-api-log" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.069418 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" containerName="barbican-api" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.069426 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" containerName="probe" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.070439 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.073826 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.074372 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.146063 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data\") pod \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.146172 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-combined-ca-bundle\") pod \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.146221 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd7wn\" (UniqueName: \"kubernetes.io/projected/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-kube-api-access-bd7wn\") pod \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.146301 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-logs\") pod \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.146332 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data-custom\") pod \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\" (UID: \"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82\") " Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.148901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-logs" (OuterVolumeSpecName: "logs") pod "b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" (UID: "b3ff215f-5cdd-4fa3-8112-82c4cd14ac82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.152037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" (UID: "b3ff215f-5cdd-4fa3-8112-82c4cd14ac82"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.162180 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-kube-api-access-bd7wn" (OuterVolumeSpecName: "kube-api-access-bd7wn") pod "b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" (UID: "b3ff215f-5cdd-4fa3-8112-82c4cd14ac82"). InnerVolumeSpecName "kube-api-access-bd7wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.177064 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" (UID: "b3ff215f-5cdd-4fa3-8112-82c4cd14ac82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.179901 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f" path="/var/lib/kubelet/pods/78b5ac9d-e133-4b32-9b44-fc7b9ff3f71f/volumes" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.199209 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data" (OuterVolumeSpecName: "config-data") pod "b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" (UID: "b3ff215f-5cdd-4fa3-8112-82c4cd14ac82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.248860 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/145d3340-8ded-4082-b9c8-7b1a21390097-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.248919 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.249035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-scripts\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.249055 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.249089 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndv6n\" (UniqueName: \"kubernetes.io/projected/145d3340-8ded-4082-b9c8-7b1a21390097-kube-api-access-ndv6n\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.249106 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 
08:43:13.249260 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.249293 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.249310 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bd7wn\" (UniqueName: \"kubernetes.io/projected/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-kube-api-access-bd7wn\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.249324 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.249337 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.351084 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.351241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-scripts\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.351267 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.351304 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndv6n\" (UniqueName: \"kubernetes.io/projected/145d3340-8ded-4082-b9c8-7b1a21390097-kube-api-access-ndv6n\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.351322 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.351843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/145d3340-8ded-4082-b9c8-7b1a21390097-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 
08:43:13.351923 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/145d3340-8ded-4082-b9c8-7b1a21390097-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.355847 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.360341 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-scripts\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.362274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.362426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.371020 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndv6n\" (UniqueName: \"kubernetes.io/projected/145d3340-8ded-4082-b9c8-7b1a21390097-kube-api-access-ndv6n\") pod \"cinder-scheduler-0\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.393010 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.711035 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-58c8b4cc8-lgv67" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.711038 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-58c8b4cc8-lgv67" event={"ID":"b3ff215f-5cdd-4fa3-8112-82c4cd14ac82","Type":"ContainerDied","Data":"82537d7de7d143f19733cd925382b1dbb1fbe83a8ce5ee7f916f7b1b057b6d05"} Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.711512 4743 scope.go:117] "RemoveContainer" containerID="c53c143e140439571a5c21151cef803359eb1910e81614489e491c618d3928bd" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.747522 4743 scope.go:117] "RemoveContainer" containerID="789ae711beac7ce8ccb05e661fd1b5b173a919c21b3d48b0c864746ee72cd73f" Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.750426 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-58c8b4cc8-lgv67"] Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.758994 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-58c8b4cc8-lgv67"] Nov 22 08:43:13 crc kubenswrapper[4743]: I1122 08:43:13.862745 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:43:13 crc kubenswrapper[4743]: W1122 08:43:13.881154 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod145d3340_8ded_4082_b9c8_7b1a21390097.slice/crio-1962cb16879bf483bc8e74cf26e36f96f2fed16beaf2b0e74f674eb51c6b683e WatchSource:0}: Error finding container 1962cb16879bf483bc8e74cf26e36f96f2fed16beaf2b0e74f674eb51c6b683e: Status 404 returned error can't find the container with id 1962cb16879bf483bc8e74cf26e36f96f2fed16beaf2b0e74f674eb51c6b683e Nov 22 08:43:14 crc kubenswrapper[4743]: I1122 08:43:14.799073 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"145d3340-8ded-4082-b9c8-7b1a21390097","Type":"ContainerStarted","Data":"253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826"} Nov 22 08:43:14 crc kubenswrapper[4743]: I1122 08:43:14.799613 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"145d3340-8ded-4082-b9c8-7b1a21390097","Type":"ContainerStarted","Data":"1962cb16879bf483bc8e74cf26e36f96f2fed16beaf2b0e74f674eb51c6b683e"} Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.167861 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ff215f-5cdd-4fa3-8112-82c4cd14ac82" path="/var/lib/kubelet/pods/b3ff215f-5cdd-4fa3-8112-82c4cd14ac82/volumes" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.202514 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5ff985d64c-mnpj5"] Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.204568 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.209684 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.210031 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.210329 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.215739 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5ff985d64c-mnpj5"] Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.293350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-log-httpd\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.293414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-etc-swift\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.293522 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-config-data\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.293548 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-combined-ca-bundle\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.293623 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-run-httpd\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.293661 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-public-tls-certs\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.293687 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n52xh\" (UniqueName: \"kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-kube-api-access-n52xh\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " 
pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.293763 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-internal-tls-certs\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.395175 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-config-data\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.395526 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-combined-ca-bundle\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.395629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-run-httpd\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.395658 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-public-tls-certs\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.395682 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n52xh\" (UniqueName: \"kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-kube-api-access-n52xh\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.395741 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-internal-tls-certs\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.395798 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-log-httpd\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.395828 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-etc-swift\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " 
pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.397391 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-run-httpd\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.397713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-log-httpd\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.404610 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-public-tls-certs\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.406225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-etc-swift\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.406531 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-combined-ca-bundle\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.406856 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-config-data\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.412878 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-internal-tls-certs\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.415433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n52xh\" (UniqueName: \"kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-kube-api-access-n52xh\") pod \"swift-proxy-5ff985d64c-mnpj5\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") " pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.536380 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.814110 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"145d3340-8ded-4082-b9c8-7b1a21390097","Type":"ContainerStarted","Data":"fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af"} Nov 22 08:43:15 crc kubenswrapper[4743]: I1122 08:43:15.835692 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.835673834 podStartE2EDuration="2.835673834s" podCreationTimestamp="2025-11-22 08:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:43:15.833909493 +0000 UTC m=+1269.540270545" watchObservedRunningTime="2025-11-22 08:43:15.835673834 +0000 UTC m=+1269.542034886" Nov 22 08:43:16 crc kubenswrapper[4743]: I1122 08:43:16.161148 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 22 08:43:18 crc kubenswrapper[4743]: I1122 08:43:18.393780 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 08:43:18 crc kubenswrapper[4743]: I1122 08:43:18.604973 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:43:18 crc kubenswrapper[4743]: I1122 08:43:18.849009 4743 generic.go:334] "Generic (PLEG): container finished" podID="661914bd-2b43-425b-837a-8c4104173ef4" containerID="d5abc080430047fe8249a09ffc99b5e38fecf3ed98339ace1e5bbb8120a1400c" exitCode=137 Nov 22 08:43:18 crc kubenswrapper[4743]: I1122 08:43:18.849050 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661914bd-2b43-425b-837a-8c4104173ef4","Type":"ContainerDied","Data":"d5abc080430047fe8249a09ffc99b5e38fecf3ed98339ace1e5bbb8120a1400c"} Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.016298 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.089266 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c66bff4c4-wzr6r"] Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.089515 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c66bff4c4-wzr6r" podUID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" containerName="neutron-api" containerID="cri-o://036c461cf62561016b88bb0ead6384ec8c57f22a831737233da0605b7e3d7436" gracePeriod=30 Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.089710 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7c66bff4c4-wzr6r" podUID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" containerName="neutron-httpd" containerID="cri-o://8b5900c88e83e85276245596e3f8b4ba507a1f40a7ab7fd151132a27262e2cfe" gracePeriod=30 Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.250925 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.310224 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-combined-ca-bundle\") pod \"661914bd-2b43-425b-837a-8c4104173ef4\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.310894 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-run-httpd\") pod \"661914bd-2b43-425b-837a-8c4104173ef4\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.310994 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-scripts\") pod \"661914bd-2b43-425b-837a-8c4104173ef4\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.311125 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndxx9\" (UniqueName: \"kubernetes.io/projected/661914bd-2b43-425b-837a-8c4104173ef4-kube-api-access-ndxx9\") pod \"661914bd-2b43-425b-837a-8c4104173ef4\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.311206 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-config-data\") pod \"661914bd-2b43-425b-837a-8c4104173ef4\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.311293 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-sg-core-conf-yaml\") pod \"661914bd-2b43-425b-837a-8c4104173ef4\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.311476 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-log-httpd\") pod \"661914bd-2b43-425b-837a-8c4104173ef4\" (UID: \"661914bd-2b43-425b-837a-8c4104173ef4\") " Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.312117 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "661914bd-2b43-425b-837a-8c4104173ef4" (UID: "661914bd-2b43-425b-837a-8c4104173ef4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.312667 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "661914bd-2b43-425b-837a-8c4104173ef4" (UID: "661914bd-2b43-425b-837a-8c4104173ef4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.317859 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/661914bd-2b43-425b-837a-8c4104173ef4-kube-api-access-ndxx9" (OuterVolumeSpecName: "kube-api-access-ndxx9") pod "661914bd-2b43-425b-837a-8c4104173ef4" (UID: "661914bd-2b43-425b-837a-8c4104173ef4"). InnerVolumeSpecName "kube-api-access-ndxx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.318658 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-scripts" (OuterVolumeSpecName: "scripts") pod "661914bd-2b43-425b-837a-8c4104173ef4" (UID: "661914bd-2b43-425b-837a-8c4104173ef4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.362418 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "661914bd-2b43-425b-837a-8c4104173ef4" (UID: "661914bd-2b43-425b-837a-8c4104173ef4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.394703 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "661914bd-2b43-425b-837a-8c4104173ef4" (UID: "661914bd-2b43-425b-837a-8c4104173ef4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.417164 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.417203 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.417260 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndxx9\" (UniqueName: \"kubernetes.io/projected/661914bd-2b43-425b-837a-8c4104173ef4-kube-api-access-ndxx9\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.417276 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.417289 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661914bd-2b43-425b-837a-8c4104173ef4-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.417299 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.441425 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-config-data" (OuterVolumeSpecName: "config-data") pod "661914bd-2b43-425b-837a-8c4104173ef4" (UID: "661914bd-2b43-425b-837a-8c4104173ef4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.486046 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5ff985d64c-mnpj5"] Nov 22 08:43:21 crc kubenswrapper[4743]: W1122 08:43:21.491144 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod178ccbe4_360f_4a0d_b97c_edf5b8b8dcba.slice/crio-61d37ec764385fefd39781c27a55f0988a7bc61d3a51286576ac5cb2f4e35267 WatchSource:0}: Error finding container 61d37ec764385fefd39781c27a55f0988a7bc61d3a51286576ac5cb2f4e35267: Status 404 returned error can't find the container with id 61d37ec764385fefd39781c27a55f0988a7bc61d3a51286576ac5cb2f4e35267 Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.519103 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661914bd-2b43-425b-837a-8c4104173ef4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.894462 4743 generic.go:334] "Generic (PLEG): container finished" podID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" containerID="8b5900c88e83e85276245596e3f8b4ba507a1f40a7ab7fd151132a27262e2cfe" exitCode=0 Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.894569 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c66bff4c4-wzr6r" event={"ID":"d58da1fb-cb7a-4b26-9753-317919c3d2c9","Type":"ContainerDied","Data":"8b5900c88e83e85276245596e3f8b4ba507a1f40a7ab7fd151132a27262e2cfe"} Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.896559 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5987ad61-2878-4efc-98ca-ea29b123f26e","Type":"ContainerStarted","Data":"46baaf42142233869f49a3ed3725aeb263cb2291e27ce1211801aed7212ad955"} Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.901924 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661914bd-2b43-425b-837a-8c4104173ef4","Type":"ContainerDied","Data":"13981e383c530b2c42163461a9cfbeb9cda55f404c0daf4b27ab57e9d61a0e88"} Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.901989 4743 scope.go:117] "RemoveContainer" containerID="d5abc080430047fe8249a09ffc99b5e38fecf3ed98339ace1e5bbb8120a1400c" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.902187 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.909522 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ff985d64c-mnpj5" event={"ID":"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba","Type":"ContainerStarted","Data":"69a331217c6e9870990cf0477268ec07b586afa72d1bd546c97e364e672bdc27"} Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.909604 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ff985d64c-mnpj5" event={"ID":"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba","Type":"ContainerStarted","Data":"ff8003a3594d25ec03aad9438f8a8b6e3c4495c012f444863e724569495817e4"} Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.909625 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ff985d64c-mnpj5" event={"ID":"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba","Type":"ContainerStarted","Data":"61d37ec764385fefd39781c27a55f0988a7bc61d3a51286576ac5cb2f4e35267"} Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.909919 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.934505 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.4184084710000002 podStartE2EDuration="12.93448225s" podCreationTimestamp="2025-11-22 08:43:09 +0000 UTC" firstStartedPulling="2025-11-22 08:43:10.442355709 +0000 UTC m=+1264.148716761" lastFinishedPulling="2025-11-22 08:43:20.958429488 +0000 UTC m=+1274.664790540" observedRunningTime="2025-11-22 08:43:21.915283205 +0000 UTC m=+1275.621644277" watchObservedRunningTime="2025-11-22 08:43:21.93448225 +0000 UTC m=+1275.640843302" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.938585 4743 scope.go:117] "RemoveContainer" containerID="ca77000a763c249391614d5f690fb6b5a8606358f4a791b769a787874274d6e9" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.949950 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5ff985d64c-mnpj5" podStartSLOduration=6.949930796 podStartE2EDuration="6.949930796s" podCreationTimestamp="2025-11-22 08:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:43:21.939934597 +0000 UTC m=+1275.646295639" watchObservedRunningTime="2025-11-22 08:43:21.949930796 +0000 UTC m=+1275.656291848" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.967285 4743 scope.go:117] "RemoveContainer" containerID="7d9d7148adb064d5322e5ea819f5b41acd751275e9cf1ce1f5f112ec31fb9dcd" Nov 22 08:43:21 crc kubenswrapper[4743]: I1122 08:43:21.995863 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.023494 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.040641 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:22 crc kubenswrapper[4743]: E1122 08:43:22.041332 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="sg-core" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.041412 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="sg-core" Nov 22 
08:43:22 crc kubenswrapper[4743]: E1122 08:43:22.041471 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="ceilometer-notification-agent" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.041522 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="ceilometer-notification-agent" Nov 22 08:43:22 crc kubenswrapper[4743]: E1122 08:43:22.041644 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="proxy-httpd" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.041698 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="proxy-httpd" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.041909 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="ceilometer-notification-agent" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.041982 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="sg-core" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.042055 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="661914bd-2b43-425b-837a-8c4104173ef4" containerName="proxy-httpd" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.043814 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.050473 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.053593 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.055497 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.131645 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-scripts\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.131725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.131846 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-config-data\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.131949 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-log-httpd\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc 
kubenswrapper[4743]: I1122 08:43:22.132058 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-run-httpd\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.132117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.132144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwh2k\" (UniqueName: \"kubernetes.io/projected/63d49750-ff45-4b82-a623-3141ba782527-kube-api-access-fwh2k\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.233985 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-scripts\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.234341 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.234363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-config-data\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.234395 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-log-httpd\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.234431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-run-httpd\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.234464 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.234484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwh2k\" (UniqueName: \"kubernetes.io/projected/63d49750-ff45-4b82-a623-3141ba782527-kube-api-access-fwh2k\") pod \"ceilometer-0\" (UID: 
\"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.235363 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-run-httpd\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.238175 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-log-httpd\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.239235 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-scripts\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.242372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.243173 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.260335 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-config-data\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.269275 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwh2k\" (UniqueName: \"kubernetes.io/projected/63d49750-ff45-4b82-a623-3141ba782527-kube-api-access-fwh2k\") pod \"ceilometer-0\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.367134 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.849705 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.921395 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d49750-ff45-4b82-a623-3141ba782527","Type":"ContainerStarted","Data":"38d208d8fcc1ee694b290fcfeda40bef40cbc7b669eb30d4926e138a73e2867b"} Nov 22 08:43:22 crc kubenswrapper[4743]: I1122 08:43:22.921957 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:23 crc kubenswrapper[4743]: I1122 08:43:23.179388 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="661914bd-2b43-425b-837a-8c4104173ef4" path="/var/lib/kubelet/pods/661914bd-2b43-425b-837a-8c4104173ef4/volumes" Nov 22 08:43:23 crc kubenswrapper[4743]: I1122 08:43:23.719463 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 08:43:23 crc kubenswrapper[4743]: I1122 08:43:23.931850 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d49750-ff45-4b82-a623-3141ba782527","Type":"ContainerStarted","Data":"f44445ddf83e1d514c434eaab5bfaedd8290638a60c11187f376f64995033089"} Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.104461 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.379173 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-449vs"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.380231 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-449vs" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.393009 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-449vs"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.482694 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f4c19ea-fd20-422d-a4c0-efce91c256fc-operator-scripts\") pod \"nova-api-db-create-449vs\" (UID: \"5f4c19ea-fd20-422d-a4c0-efce91c256fc\") " pod="openstack/nova-api-db-create-449vs" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.483106 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94p8j\" (UniqueName: \"kubernetes.io/projected/5f4c19ea-fd20-422d-a4c0-efce91c256fc-kube-api-access-94p8j\") pod \"nova-api-db-create-449vs\" (UID: \"5f4c19ea-fd20-422d-a4c0-efce91c256fc\") " pod="openstack/nova-api-db-create-449vs" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.538809 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-mwrrj"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.540323 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mwrrj" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.563691 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1d8b-account-create-x94zr"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.569064 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1d8b-account-create-x94zr" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.572875 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.591629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94p8j\" (UniqueName: \"kubernetes.io/projected/5f4c19ea-fd20-422d-a4c0-efce91c256fc-kube-api-access-94p8j\") pod \"nova-api-db-create-449vs\" (UID: \"5f4c19ea-fd20-422d-a4c0-efce91c256fc\") " pod="openstack/nova-api-db-create-449vs" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.591833 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f4c19ea-fd20-422d-a4c0-efce91c256fc-operator-scripts\") pod \"nova-api-db-create-449vs\" (UID: \"5f4c19ea-fd20-422d-a4c0-efce91c256fc\") " pod="openstack/nova-api-db-create-449vs" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.593026 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f4c19ea-fd20-422d-a4c0-efce91c256fc-operator-scripts\") pod \"nova-api-db-create-449vs\" (UID: \"5f4c19ea-fd20-422d-a4c0-efce91c256fc\") " pod="openstack/nova-api-db-create-449vs" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.593075 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mwrrj"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.623438 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94p8j\" (UniqueName: \"kubernetes.io/projected/5f4c19ea-fd20-422d-a4c0-efce91c256fc-kube-api-access-94p8j\") pod \"nova-api-db-create-449vs\" (UID: \"5f4c19ea-fd20-422d-a4c0-efce91c256fc\") " pod="openstack/nova-api-db-create-449vs" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.625950 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1d8b-account-create-x94zr"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.646880 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-vzs2f"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.648397 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-vzs2f" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.655668 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vzs2f"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.694614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mswcd\" (UniqueName: \"kubernetes.io/projected/e5bb8cbe-6922-4961-9327-f3711da41234-kube-api-access-mswcd\") pod \"nova-api-1d8b-account-create-x94zr\" (UID: \"e5bb8cbe-6922-4961-9327-f3711da41234\") " pod="openstack/nova-api-1d8b-account-create-x94zr" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.695681 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5bb8cbe-6922-4961-9327-f3711da41234-operator-scripts\") pod \"nova-api-1d8b-account-create-x94zr\" (UID: \"e5bb8cbe-6922-4961-9327-f3711da41234\") " pod="openstack/nova-api-1d8b-account-create-x94zr" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.695778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-operator-scripts\") pod \"nova-cell0-db-create-mwrrj\" (UID: \"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b\") " pod="openstack/nova-cell0-db-create-mwrrj" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.695822 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz5dd\" (UniqueName: \"kubernetes.io/projected/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-kube-api-access-xz5dd\") pod \"nova-cell0-db-create-mwrrj\" (UID: \"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b\") " pod="openstack/nova-cell0-db-create-mwrrj" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.700971 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-449vs" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.751825 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ad52-account-create-gpn4k"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.764799 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ad52-account-create-gpn4k" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.770526 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.771755 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ad52-account-create-gpn4k"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.799744 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-operator-scripts\") pod \"nova-cell0-db-create-mwrrj\" (UID: \"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b\") " pod="openstack/nova-cell0-db-create-mwrrj" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.799804 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz5dd\" (UniqueName: \"kubernetes.io/projected/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-kube-api-access-xz5dd\") pod \"nova-cell0-db-create-mwrrj\" (UID: \"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b\") " pod="openstack/nova-cell0-db-create-mwrrj" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.799930 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-operator-scripts\") pod \"nova-cell1-db-create-vzs2f\" (UID: \"e5f93e69-5601-45dc-a1f5-0e086d3dce5d\") " pod="openstack/nova-cell1-db-create-vzs2f" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.800045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mswcd\" (UniqueName: \"kubernetes.io/projected/e5bb8cbe-6922-4961-9327-f3711da41234-kube-api-access-mswcd\") pod \"nova-api-1d8b-account-create-x94zr\" (UID: \"e5bb8cbe-6922-4961-9327-f3711da41234\") " pod="openstack/nova-api-1d8b-account-create-x94zr" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.800078 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frwtb\" (UniqueName: \"kubernetes.io/projected/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-kube-api-access-frwtb\") pod \"nova-cell1-db-create-vzs2f\" (UID: \"e5f93e69-5601-45dc-a1f5-0e086d3dce5d\") " pod="openstack/nova-cell1-db-create-vzs2f" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.803297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5bb8cbe-6922-4961-9327-f3711da41234-operator-scripts\") pod \"nova-api-1d8b-account-create-x94zr\" (UID: \"e5bb8cbe-6922-4961-9327-f3711da41234\") " pod="openstack/nova-api-1d8b-account-create-x94zr" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.803976 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-operator-scripts\") pod \"nova-cell0-db-create-mwrrj\" (UID: \"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b\") " pod="openstack/nova-cell0-db-create-mwrrj" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.804360 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5bb8cbe-6922-4961-9327-f3711da41234-operator-scripts\") pod \"nova-api-1d8b-account-create-x94zr\" (UID: 
\"e5bb8cbe-6922-4961-9327-f3711da41234\") " pod="openstack/nova-api-1d8b-account-create-x94zr" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.827409 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mswcd\" (UniqueName: \"kubernetes.io/projected/e5bb8cbe-6922-4961-9327-f3711da41234-kube-api-access-mswcd\") pod \"nova-api-1d8b-account-create-x94zr\" (UID: \"e5bb8cbe-6922-4961-9327-f3711da41234\") " pod="openstack/nova-api-1d8b-account-create-x94zr" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.832386 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz5dd\" (UniqueName: \"kubernetes.io/projected/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-kube-api-access-xz5dd\") pod \"nova-cell0-db-create-mwrrj\" (UID: \"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b\") " pod="openstack/nova-cell0-db-create-mwrrj" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.839105 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-c099-account-create-r2vpp"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.840491 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c099-account-create-r2vpp" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.843804 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.860737 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c099-account-create-r2vpp"] Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.877433 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mwrrj" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.907374 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1d8b-account-create-x94zr" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.911232 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkp6v\" (UniqueName: \"kubernetes.io/projected/1b40c85c-3938-405e-902c-c4ea5a19fe20-kube-api-access-pkp6v\") pod \"nova-cell0-ad52-account-create-gpn4k\" (UID: \"1b40c85c-3938-405e-902c-c4ea5a19fe20\") " pod="openstack/nova-cell0-ad52-account-create-gpn4k" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.911318 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-operator-scripts\") pod \"nova-cell1-db-create-vzs2f\" (UID: \"e5f93e69-5601-45dc-a1f5-0e086d3dce5d\") " pod="openstack/nova-cell1-db-create-vzs2f" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.911415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frwtb\" (UniqueName: \"kubernetes.io/projected/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-kube-api-access-frwtb\") pod \"nova-cell1-db-create-vzs2f\" (UID: \"e5f93e69-5601-45dc-a1f5-0e086d3dce5d\") " pod="openstack/nova-cell1-db-create-vzs2f" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.912468 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-operator-scripts\") pod \"nova-cell1-db-create-vzs2f\" (UID: \"e5f93e69-5601-45dc-a1f5-0e086d3dce5d\") " pod="openstack/nova-cell1-db-create-vzs2f" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.911445 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b40c85c-3938-405e-902c-c4ea5a19fe20-operator-scripts\") pod \"nova-cell0-ad52-account-create-gpn4k\" (UID: \"1b40c85c-3938-405e-902c-c4ea5a19fe20\") " pod="openstack/nova-cell0-ad52-account-create-gpn4k" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.936750 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frwtb\" (UniqueName: \"kubernetes.io/projected/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-kube-api-access-frwtb\") pod \"nova-cell1-db-create-vzs2f\" (UID: \"e5f93e69-5601-45dc-a1f5-0e086d3dce5d\") " pod="openstack/nova-cell1-db-create-vzs2f" Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.966492 4743 generic.go:334] "Generic (PLEG): container finished" podID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" containerID="036c461cf62561016b88bb0ead6384ec8c57f22a831737233da0605b7e3d7436" exitCode=0 Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.966805 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c66bff4c4-wzr6r" event={"ID":"d58da1fb-cb7a-4b26-9753-317919c3d2c9","Type":"ContainerDied","Data":"036c461cf62561016b88bb0ead6384ec8c57f22a831737233da0605b7e3d7436"} Nov 22 08:43:24 crc kubenswrapper[4743]: I1122 08:43:24.985988 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-vzs2f" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.014919 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b40c85c-3938-405e-902c-c4ea5a19fe20-operator-scripts\") pod \"nova-cell0-ad52-account-create-gpn4k\" (UID: \"1b40c85c-3938-405e-902c-c4ea5a19fe20\") " pod="openstack/nova-cell0-ad52-account-create-gpn4k" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.015060 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh8qv\" (UniqueName: \"kubernetes.io/projected/009dc869-9ae6-40f0-a055-1303494f16f1-kube-api-access-zh8qv\") pod \"nova-cell1-c099-account-create-r2vpp\" (UID: \"009dc869-9ae6-40f0-a055-1303494f16f1\") " pod="openstack/nova-cell1-c099-account-create-r2vpp" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.015090 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/009dc869-9ae6-40f0-a055-1303494f16f1-operator-scripts\") pod \"nova-cell1-c099-account-create-r2vpp\" (UID: \"009dc869-9ae6-40f0-a055-1303494f16f1\") " pod="openstack/nova-cell1-c099-account-create-r2vpp" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.015162 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkp6v\" (UniqueName: \"kubernetes.io/projected/1b40c85c-3938-405e-902c-c4ea5a19fe20-kube-api-access-pkp6v\") pod \"nova-cell0-ad52-account-create-gpn4k\" (UID: \"1b40c85c-3938-405e-902c-c4ea5a19fe20\") " pod="openstack/nova-cell0-ad52-account-create-gpn4k" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.015875 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b40c85c-3938-405e-902c-c4ea5a19fe20-operator-scripts\") pod \"nova-cell0-ad52-account-create-gpn4k\" (UID: \"1b40c85c-3938-405e-902c-c4ea5a19fe20\") " pod="openstack/nova-cell0-ad52-account-create-gpn4k" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.037731 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkp6v\" (UniqueName: \"kubernetes.io/projected/1b40c85c-3938-405e-902c-c4ea5a19fe20-kube-api-access-pkp6v\") pod \"nova-cell0-ad52-account-create-gpn4k\" (UID: \"1b40c85c-3938-405e-902c-c4ea5a19fe20\") " pod="openstack/nova-cell0-ad52-account-create-gpn4k" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.117400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh8qv\" (UniqueName: \"kubernetes.io/projected/009dc869-9ae6-40f0-a055-1303494f16f1-kube-api-access-zh8qv\") pod \"nova-cell1-c099-account-create-r2vpp\" (UID: \"009dc869-9ae6-40f0-a055-1303494f16f1\") " pod="openstack/nova-cell1-c099-account-create-r2vpp" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.117447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/009dc869-9ae6-40f0-a055-1303494f16f1-operator-scripts\") pod \"nova-cell1-c099-account-create-r2vpp\" (UID: \"009dc869-9ae6-40f0-a055-1303494f16f1\") " pod="openstack/nova-cell1-c099-account-create-r2vpp" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.118192 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/009dc869-9ae6-40f0-a055-1303494f16f1-operator-scripts\") pod \"nova-cell1-c099-account-create-r2vpp\" (UID: \"009dc869-9ae6-40f0-a055-1303494f16f1\") " pod="openstack/nova-cell1-c099-account-create-r2vpp" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.146151 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh8qv\" (UniqueName: \"kubernetes.io/projected/009dc869-9ae6-40f0-a055-1303494f16f1-kube-api-access-zh8qv\") pod \"nova-cell1-c099-account-create-r2vpp\" (UID: \"009dc869-9ae6-40f0-a055-1303494f16f1\") " pod="openstack/nova-cell1-c099-account-create-r2vpp" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.221346 4743 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6b1be60a-df67-4846-90c8-a53fb6acd7f8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6b1be60a-df67-4846-90c8-a53fb6acd7f8] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6b1be60a_df67_4846_90c8_a53fb6acd7f8.slice" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.241834 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-449vs"] Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.242228 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ad52-account-create-gpn4k" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.264009 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c099-account-create-r2vpp" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.376246 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-mwrrj"] Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.488214 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1d8b-account-create-x94zr"] Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.647283 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vzs2f"] Nov 22 08:43:25 crc kubenswrapper[4743]: W1122 08:43:25.657769 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5f93e69_5601_45dc_a1f5_0e086d3dce5d.slice/crio-ce29cfd39fd962d58a9d1e9628c5ec8696fb5b2158d7449d9e3903d74c5b8dc5 WatchSource:0}: Error finding container ce29cfd39fd962d58a9d1e9628c5ec8696fb5b2158d7449d9e3903d74c5b8dc5: Status 404 returned error can't find the container with id ce29cfd39fd962d58a9d1e9628c5ec8696fb5b2158d7449d9e3903d74c5b8dc5 Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.806101 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.939439 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-config\") pod \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.939495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-combined-ca-bundle\") pod \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.939656 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqvpt\" (UniqueName: \"kubernetes.io/projected/d58da1fb-cb7a-4b26-9753-317919c3d2c9-kube-api-access-vqvpt\") pod \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.939689 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-ovndb-tls-certs\") pod \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.939816 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-httpd-config\") pod \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\" (UID: \"d58da1fb-cb7a-4b26-9753-317919c3d2c9\") " Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.951841 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d58da1fb-cb7a-4b26-9753-317919c3d2c9" (UID: "d58da1fb-cb7a-4b26-9753-317919c3d2c9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.954683 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d58da1fb-cb7a-4b26-9753-317919c3d2c9-kube-api-access-vqvpt" (OuterVolumeSpecName: "kube-api-access-vqvpt") pod "d58da1fb-cb7a-4b26-9753-317919c3d2c9" (UID: "d58da1fb-cb7a-4b26-9753-317919c3d2c9"). InnerVolumeSpecName "kube-api-access-vqvpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.989105 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1d8b-account-create-x94zr" event={"ID":"e5bb8cbe-6922-4961-9327-f3711da41234","Type":"ContainerStarted","Data":"b628f200e7a2bedf0d8e8e4b5953a55be8ad9ac041c328380e5595675b2b42f4"} Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.989153 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1d8b-account-create-x94zr" event={"ID":"e5bb8cbe-6922-4961-9327-f3711da41234","Type":"ContainerStarted","Data":"a017e01b220cad515fb3856943bfae484babe91c382eaac018f721e1d0307f1b"} Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.990776 4743 generic.go:334] "Generic (PLEG): container finished" podID="5f4c19ea-fd20-422d-a4c0-efce91c256fc" containerID="4c285ff99f16c40f8e9fb6688027dc480aaea87e547c44567abd12812584f0a8" exitCode=0 Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.990858 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-449vs" event={"ID":"5f4c19ea-fd20-422d-a4c0-efce91c256fc","Type":"ContainerDied","Data":"4c285ff99f16c40f8e9fb6688027dc480aaea87e547c44567abd12812584f0a8"} Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.990990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-449vs" event={"ID":"5f4c19ea-fd20-422d-a4c0-efce91c256fc","Type":"ContainerStarted","Data":"34989faab92d8d8398516807958dc568df7fecf652f3ff64d732bce1449791a0"} Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.992282 4743 generic.go:334] "Generic (PLEG): container finished" podID="b429a37f-69bf-4a7d-93c4-a3fa043b5f9b" containerID="8cd97805271c2689ffeedd2861c6862a07880da94fdba3980f8515ddcf5802bb" exitCode=0 Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.992325 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mwrrj" event={"ID":"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b","Type":"ContainerDied","Data":"8cd97805271c2689ffeedd2861c6862a07880da94fdba3980f8515ddcf5802bb"} Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.992536 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mwrrj" event={"ID":"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b","Type":"ContainerStarted","Data":"afa643a114d5554dbe5a430dbc8030f10f1b5e2aa26fdd66d1c237b654f2407c"} Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.996239 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vzs2f" event={"ID":"e5f93e69-5601-45dc-a1f5-0e086d3dce5d","Type":"ContainerStarted","Data":"ce29cfd39fd962d58a9d1e9628c5ec8696fb5b2158d7449d9e3903d74c5b8dc5"} Nov 22 08:43:25 crc kubenswrapper[4743]: I1122 08:43:25.998451 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d49750-ff45-4b82-a623-3141ba782527","Type":"ContainerStarted","Data":"14b4171a89f85e941f20bb4503eb57f8651913716868b7f1d2b75320cdb4186b"} Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.011131 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c66bff4c4-wzr6r" event={"ID":"d58da1fb-cb7a-4b26-9753-317919c3d2c9","Type":"ContainerDied","Data":"d74ad8deaa4b62fa1f081d44b768d03143a504b27925e995f41b53015508d1eb"} Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.011375 4743 scope.go:117] "RemoveContainer" containerID="8b5900c88e83e85276245596e3f8b4ba507a1f40a7ab7fd151132a27262e2cfe" Nov 
22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.011603 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c66bff4c4-wzr6r" Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.015183 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-config" (OuterVolumeSpecName: "config") pod "d58da1fb-cb7a-4b26-9753-317919c3d2c9" (UID: "d58da1fb-cb7a-4b26-9753-317919c3d2c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.017551 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d58da1fb-cb7a-4b26-9753-317919c3d2c9" (UID: "d58da1fb-cb7a-4b26-9753-317919c3d2c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.031911 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ad52-account-create-gpn4k"] Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.042100 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqvpt\" (UniqueName: \"kubernetes.io/projected/d58da1fb-cb7a-4b26-9753-317919c3d2c9-kube-api-access-vqvpt\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.042148 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.042165 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.042178 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.085881 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-c099-account-create-r2vpp"] Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.096640 4743 scope.go:117] "RemoveContainer" containerID="036c461cf62561016b88bb0ead6384ec8c57f22a831737233da0605b7e3d7436" Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.134720 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d58da1fb-cb7a-4b26-9753-317919c3d2c9" (UID: "d58da1fb-cb7a-4b26-9753-317919c3d2c9"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.144704 4743 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58da1fb-cb7a-4b26-9753-317919c3d2c9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.372972 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7c66bff4c4-wzr6r"] Nov 22 08:43:26 crc kubenswrapper[4743]: I1122 08:43:26.384328 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7c66bff4c4-wzr6r"] Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.026730 4743 generic.go:334] "Generic (PLEG): container finished" podID="1b40c85c-3938-405e-902c-c4ea5a19fe20" containerID="cbc06588fe9760520abd9ff10252f044f013bc9dd64e11f663750ea38a18547b" exitCode=0 Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.026889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ad52-account-create-gpn4k" event={"ID":"1b40c85c-3938-405e-902c-c4ea5a19fe20","Type":"ContainerDied","Data":"cbc06588fe9760520abd9ff10252f044f013bc9dd64e11f663750ea38a18547b"} Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.027079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ad52-account-create-gpn4k" event={"ID":"1b40c85c-3938-405e-902c-c4ea5a19fe20","Type":"ContainerStarted","Data":"3b44c24b4116c73834a261579aaf89fab37d0ba79b525828769f73ce704e7610"} Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.029347 4743 generic.go:334] "Generic (PLEG): container finished" podID="e5f93e69-5601-45dc-a1f5-0e086d3dce5d" containerID="ed4a51a510556630bce53eb084d8e72e4a5c86ebf6ea25c0ba40d17086b1eead" exitCode=0 Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.029437 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vzs2f" event={"ID":"e5f93e69-5601-45dc-a1f5-0e086d3dce5d","Type":"ContainerDied","Data":"ed4a51a510556630bce53eb084d8e72e4a5c86ebf6ea25c0ba40d17086b1eead"} Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.033339 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d49750-ff45-4b82-a623-3141ba782527","Type":"ContainerStarted","Data":"340c52ef76ee165648f284d7add1bddb1ee48ee0ab8cd09485340771b0178b45"} Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.035248 4743 generic.go:334] "Generic (PLEG): container finished" podID="009dc869-9ae6-40f0-a055-1303494f16f1" containerID="c35c08056d876c19f4f4c85186c089afae3fdfa5b6c85d74b609121293b9cccf" exitCode=0 Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.035329 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c099-account-create-r2vpp" event={"ID":"009dc869-9ae6-40f0-a055-1303494f16f1","Type":"ContainerDied","Data":"c35c08056d876c19f4f4c85186c089afae3fdfa5b6c85d74b609121293b9cccf"} Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.035364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c099-account-create-r2vpp" event={"ID":"009dc869-9ae6-40f0-a055-1303494f16f1","Type":"ContainerStarted","Data":"22f8b26eec395755b0494ccf4105fc0087bac1f78faf80005eb4c2c1f6c54efe"} Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.038621 4743 generic.go:334] "Generic (PLEG): container finished" podID="e5bb8cbe-6922-4961-9327-f3711da41234" containerID="b628f200e7a2bedf0d8e8e4b5953a55be8ad9ac041c328380e5595675b2b42f4" 
exitCode=0 Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.038716 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1d8b-account-create-x94zr" event={"ID":"e5bb8cbe-6922-4961-9327-f3711da41234","Type":"ContainerDied","Data":"b628f200e7a2bedf0d8e8e4b5953a55be8ad9ac041c328380e5595675b2b42f4"} Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.197836 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" path="/var/lib/kubelet/pods/d58da1fb-cb7a-4b26-9753-317919c3d2c9/volumes" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.622071 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mwrrj" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.671223 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-449vs" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.706771 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1d8b-account-create-x94zr" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.778292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz5dd\" (UniqueName: \"kubernetes.io/projected/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-kube-api-access-xz5dd\") pod \"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b\" (UID: \"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b\") " Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.778409 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94p8j\" (UniqueName: \"kubernetes.io/projected/5f4c19ea-fd20-422d-a4c0-efce91c256fc-kube-api-access-94p8j\") pod \"5f4c19ea-fd20-422d-a4c0-efce91c256fc\" (UID: \"5f4c19ea-fd20-422d-a4c0-efce91c256fc\") " Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.778512 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f4c19ea-fd20-422d-a4c0-efce91c256fc-operator-scripts\") pod \"5f4c19ea-fd20-422d-a4c0-efce91c256fc\" (UID: \"5f4c19ea-fd20-422d-a4c0-efce91c256fc\") " Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.778595 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-operator-scripts\") pod \"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b\" (UID: \"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b\") " Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.779088 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f4c19ea-fd20-422d-a4c0-efce91c256fc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f4c19ea-fd20-422d-a4c0-efce91c256fc" (UID: "5f4c19ea-fd20-422d-a4c0-efce91c256fc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.779329 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b429a37f-69bf-4a7d-93c4-a3fa043b5f9b" (UID: "b429a37f-69bf-4a7d-93c4-a3fa043b5f9b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.799059 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f4c19ea-fd20-422d-a4c0-efce91c256fc-kube-api-access-94p8j" (OuterVolumeSpecName: "kube-api-access-94p8j") pod "5f4c19ea-fd20-422d-a4c0-efce91c256fc" (UID: "5f4c19ea-fd20-422d-a4c0-efce91c256fc"). InnerVolumeSpecName "kube-api-access-94p8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.800780 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-kube-api-access-xz5dd" (OuterVolumeSpecName: "kube-api-access-xz5dd") pod "b429a37f-69bf-4a7d-93c4-a3fa043b5f9b" (UID: "b429a37f-69bf-4a7d-93c4-a3fa043b5f9b"). InnerVolumeSpecName "kube-api-access-xz5dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.883566 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mswcd\" (UniqueName: \"kubernetes.io/projected/e5bb8cbe-6922-4961-9327-f3711da41234-kube-api-access-mswcd\") pod \"e5bb8cbe-6922-4961-9327-f3711da41234\" (UID: \"e5bb8cbe-6922-4961-9327-f3711da41234\") " Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.883754 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5bb8cbe-6922-4961-9327-f3711da41234-operator-scripts\") pod \"e5bb8cbe-6922-4961-9327-f3711da41234\" (UID: \"e5bb8cbe-6922-4961-9327-f3711da41234\") " Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.884245 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94p8j\" (UniqueName: \"kubernetes.io/projected/5f4c19ea-fd20-422d-a4c0-efce91c256fc-kube-api-access-94p8j\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.884269 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f4c19ea-fd20-422d-a4c0-efce91c256fc-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.884282 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.884294 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz5dd\" (UniqueName: \"kubernetes.io/projected/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b-kube-api-access-xz5dd\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.884544 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5bb8cbe-6922-4961-9327-f3711da41234-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5bb8cbe-6922-4961-9327-f3711da41234" (UID: "e5bb8cbe-6922-4961-9327-f3711da41234"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.890004 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5bb8cbe-6922-4961-9327-f3711da41234-kube-api-access-mswcd" (OuterVolumeSpecName: "kube-api-access-mswcd") pod "e5bb8cbe-6922-4961-9327-f3711da41234" (UID: "e5bb8cbe-6922-4961-9327-f3711da41234"). InnerVolumeSpecName "kube-api-access-mswcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.986171 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mswcd\" (UniqueName: \"kubernetes.io/projected/e5bb8cbe-6922-4961-9327-f3711da41234-kube-api-access-mswcd\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:27 crc kubenswrapper[4743]: I1122 08:43:27.986223 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5bb8cbe-6922-4961-9327-f3711da41234-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.049483 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-449vs" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.049513 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-449vs" event={"ID":"5f4c19ea-fd20-422d-a4c0-efce91c256fc","Type":"ContainerDied","Data":"34989faab92d8d8398516807958dc568df7fecf652f3ff64d732bce1449791a0"} Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.049547 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34989faab92d8d8398516807958dc568df7fecf652f3ff64d732bce1449791a0" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.052256 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d49750-ff45-4b82-a623-3141ba782527","Type":"ContainerStarted","Data":"feae721194d83926d45a14081fb0d68c94aeadc33de7a54c1a03d0740f0eda06"} Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.052443 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="ceilometer-central-agent" containerID="cri-o://f44445ddf83e1d514c434eaab5bfaedd8290638a60c11187f376f64995033089" gracePeriod=30 Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.052728 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.053065 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="proxy-httpd" containerID="cri-o://feae721194d83926d45a14081fb0d68c94aeadc33de7a54c1a03d0740f0eda06" gracePeriod=30 Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.053137 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="sg-core" containerID="cri-o://340c52ef76ee165648f284d7add1bddb1ee48ee0ab8cd09485340771b0178b45" gracePeriod=30 Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.053196 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="ceilometer-notification-agent" 
containerID="cri-o://14b4171a89f85e941f20bb4503eb57f8651913716868b7f1d2b75320cdb4186b" gracePeriod=30 Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.065043 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1d8b-account-create-x94zr" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.065042 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1d8b-account-create-x94zr" event={"ID":"e5bb8cbe-6922-4961-9327-f3711da41234","Type":"ContainerDied","Data":"a017e01b220cad515fb3856943bfae484babe91c382eaac018f721e1d0307f1b"} Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.065179 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a017e01b220cad515fb3856943bfae484babe91c382eaac018f721e1d0307f1b" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.072957 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-mwrrj" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.073680 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-mwrrj" event={"ID":"b429a37f-69bf-4a7d-93c4-a3fa043b5f9b","Type":"ContainerDied","Data":"afa643a114d5554dbe5a430dbc8030f10f1b5e2aa26fdd66d1c237b654f2407c"} Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.073714 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afa643a114d5554dbe5a430dbc8030f10f1b5e2aa26fdd66d1c237b654f2407c" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.084225 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.491541929 podStartE2EDuration="7.084213227s" podCreationTimestamp="2025-11-22 08:43:21 +0000 UTC" firstStartedPulling="2025-11-22 08:43:22.863058982 +0000 UTC m=+1276.569420034" lastFinishedPulling="2025-11-22 08:43:27.45573028 +0000 UTC m=+1281.162091332" observedRunningTime="2025-11-22 08:43:28.082737604 +0000 UTC m=+1281.789098656" watchObservedRunningTime="2025-11-22 08:43:28.084213227 +0000 UTC m=+1281.790574279" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.453017 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ad52-account-create-gpn4k" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.539702 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-c099-account-create-r2vpp" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.542591 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-vzs2f" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.599955 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkp6v\" (UniqueName: \"kubernetes.io/projected/1b40c85c-3938-405e-902c-c4ea5a19fe20-kube-api-access-pkp6v\") pod \"1b40c85c-3938-405e-902c-c4ea5a19fe20\" (UID: \"1b40c85c-3938-405e-902c-c4ea5a19fe20\") " Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.602084 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b40c85c-3938-405e-902c-c4ea5a19fe20-operator-scripts\") pod \"1b40c85c-3938-405e-902c-c4ea5a19fe20\" (UID: \"1b40c85c-3938-405e-902c-c4ea5a19fe20\") " Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.603022 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b40c85c-3938-405e-902c-c4ea5a19fe20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b40c85c-3938-405e-902c-c4ea5a19fe20" (UID: "1b40c85c-3938-405e-902c-c4ea5a19fe20"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.605796 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b40c85c-3938-405e-902c-c4ea5a19fe20-kube-api-access-pkp6v" (OuterVolumeSpecName: "kube-api-access-pkp6v") pod "1b40c85c-3938-405e-902c-c4ea5a19fe20" (UID: "1b40c85c-3938-405e-902c-c4ea5a19fe20"). InnerVolumeSpecName "kube-api-access-pkp6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.704559 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frwtb\" (UniqueName: \"kubernetes.io/projected/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-kube-api-access-frwtb\") pod \"e5f93e69-5601-45dc-a1f5-0e086d3dce5d\" (UID: \"e5f93e69-5601-45dc-a1f5-0e086d3dce5d\") " Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.704946 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/009dc869-9ae6-40f0-a055-1303494f16f1-operator-scripts\") pod \"009dc869-9ae6-40f0-a055-1303494f16f1\" (UID: \"009dc869-9ae6-40f0-a055-1303494f16f1\") " Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.705073 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-operator-scripts\") pod \"e5f93e69-5601-45dc-a1f5-0e086d3dce5d\" (UID: \"e5f93e69-5601-45dc-a1f5-0e086d3dce5d\") " Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.705203 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh8qv\" (UniqueName: \"kubernetes.io/projected/009dc869-9ae6-40f0-a055-1303494f16f1-kube-api-access-zh8qv\") pod \"009dc869-9ae6-40f0-a055-1303494f16f1\" (UID: \"009dc869-9ae6-40f0-a055-1303494f16f1\") " Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.705500 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/009dc869-9ae6-40f0-a055-1303494f16f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "009dc869-9ae6-40f0-a055-1303494f16f1" (UID: "009dc869-9ae6-40f0-a055-1303494f16f1"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.705492 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5f93e69-5601-45dc-a1f5-0e086d3dce5d" (UID: "e5f93e69-5601-45dc-a1f5-0e086d3dce5d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.705990 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/009dc869-9ae6-40f0-a055-1303494f16f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.706090 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.706160 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkp6v\" (UniqueName: \"kubernetes.io/projected/1b40c85c-3938-405e-902c-c4ea5a19fe20-kube-api-access-pkp6v\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.706219 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b40c85c-3938-405e-902c-c4ea5a19fe20-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.712327 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/009dc869-9ae6-40f0-a055-1303494f16f1-kube-api-access-zh8qv" (OuterVolumeSpecName: "kube-api-access-zh8qv") pod "009dc869-9ae6-40f0-a055-1303494f16f1" (UID: "009dc869-9ae6-40f0-a055-1303494f16f1"). InnerVolumeSpecName "kube-api-access-zh8qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.713908 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-kube-api-access-frwtb" (OuterVolumeSpecName: "kube-api-access-frwtb") pod "e5f93e69-5601-45dc-a1f5-0e086d3dce5d" (UID: "e5f93e69-5601-45dc-a1f5-0e086d3dce5d"). InnerVolumeSpecName "kube-api-access-frwtb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.807899 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frwtb\" (UniqueName: \"kubernetes.io/projected/e5f93e69-5601-45dc-a1f5-0e086d3dce5d-kube-api-access-frwtb\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:28 crc kubenswrapper[4743]: I1122 08:43:28.808102 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh8qv\" (UniqueName: \"kubernetes.io/projected/009dc869-9ae6-40f0-a055-1303494f16f1-kube-api-access-zh8qv\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.082503 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ad52-account-create-gpn4k" event={"ID":"1b40c85c-3938-405e-902c-c4ea5a19fe20","Type":"ContainerDied","Data":"3b44c24b4116c73834a261579aaf89fab37d0ba79b525828769f73ce704e7610"} Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.082559 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b44c24b4116c73834a261579aaf89fab37d0ba79b525828769f73ce704e7610" Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.082524 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ad52-account-create-gpn4k" Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.084107 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vzs2f" event={"ID":"e5f93e69-5601-45dc-a1f5-0e086d3dce5d","Type":"ContainerDied","Data":"ce29cfd39fd962d58a9d1e9628c5ec8696fb5b2158d7449d9e3903d74c5b8dc5"} Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.084146 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce29cfd39fd962d58a9d1e9628c5ec8696fb5b2158d7449d9e3903d74c5b8dc5" Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.084229 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-vzs2f" Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.086437 4743 generic.go:334] "Generic (PLEG): container finished" podID="63d49750-ff45-4b82-a623-3141ba782527" containerID="feae721194d83926d45a14081fb0d68c94aeadc33de7a54c1a03d0740f0eda06" exitCode=0 Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.086464 4743 generic.go:334] "Generic (PLEG): container finished" podID="63d49750-ff45-4b82-a623-3141ba782527" containerID="340c52ef76ee165648f284d7add1bddb1ee48ee0ab8cd09485340771b0178b45" exitCode=2 Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.086476 4743 generic.go:334] "Generic (PLEG): container finished" podID="63d49750-ff45-4b82-a623-3141ba782527" containerID="14b4171a89f85e941f20bb4503eb57f8651913716868b7f1d2b75320cdb4186b" exitCode=0 Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.086514 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d49750-ff45-4b82-a623-3141ba782527","Type":"ContainerDied","Data":"feae721194d83926d45a14081fb0d68c94aeadc33de7a54c1a03d0740f0eda06"} Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.086532 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d49750-ff45-4b82-a623-3141ba782527","Type":"ContainerDied","Data":"340c52ef76ee165648f284d7add1bddb1ee48ee0ab8cd09485340771b0178b45"} Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.086546 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d49750-ff45-4b82-a623-3141ba782527","Type":"ContainerDied","Data":"14b4171a89f85e941f20bb4503eb57f8651913716868b7f1d2b75320cdb4186b"} Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.088753 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-c099-account-create-r2vpp" event={"ID":"009dc869-9ae6-40f0-a055-1303494f16f1","Type":"ContainerDied","Data":"22f8b26eec395755b0494ccf4105fc0087bac1f78faf80005eb4c2c1f6c54efe"} Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.088779 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22f8b26eec395755b0494ccf4105fc0087bac1f78faf80005eb4c2c1f6c54efe" Nov 22 08:43:29 crc kubenswrapper[4743]: I1122 08:43:29.088823 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-c099-account-create-r2vpp" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.415604 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z54x2"] Nov 22 08:43:30 crc kubenswrapper[4743]: E1122 08:43:30.416264 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" containerName="neutron-httpd" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416276 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" containerName="neutron-httpd" Nov 22 08:43:30 crc kubenswrapper[4743]: E1122 08:43:30.416284 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f4c19ea-fd20-422d-a4c0-efce91c256fc" containerName="mariadb-database-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416290 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f4c19ea-fd20-422d-a4c0-efce91c256fc" containerName="mariadb-database-create" Nov 22 08:43:30 crc kubenswrapper[4743]: E1122 08:43:30.416307 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" containerName="neutron-api" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416313 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" containerName="neutron-api" Nov 22 08:43:30 crc kubenswrapper[4743]: E1122 08:43:30.416322 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b429a37f-69bf-4a7d-93c4-a3fa043b5f9b" containerName="mariadb-database-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416328 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b429a37f-69bf-4a7d-93c4-a3fa043b5f9b" containerName="mariadb-database-create" Nov 22 08:43:30 crc kubenswrapper[4743]: E1122 08:43:30.416338 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f93e69-5601-45dc-a1f5-0e086d3dce5d" containerName="mariadb-database-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416343 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f93e69-5601-45dc-a1f5-0e086d3dce5d" containerName="mariadb-database-create" Nov 22 08:43:30 crc kubenswrapper[4743]: E1122 08:43:30.416354 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5bb8cbe-6922-4961-9327-f3711da41234" containerName="mariadb-account-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416359 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5bb8cbe-6922-4961-9327-f3711da41234" containerName="mariadb-account-create" Nov 22 08:43:30 crc kubenswrapper[4743]: E1122 08:43:30.416381 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b40c85c-3938-405e-902c-c4ea5a19fe20" containerName="mariadb-account-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416389 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b40c85c-3938-405e-902c-c4ea5a19fe20" containerName="mariadb-account-create" Nov 22 08:43:30 crc kubenswrapper[4743]: E1122 08:43:30.416405 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009dc869-9ae6-40f0-a055-1303494f16f1" containerName="mariadb-account-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416412 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="009dc869-9ae6-40f0-a055-1303494f16f1" containerName="mariadb-account-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 
08:43:30.416668 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="009dc869-9ae6-40f0-a055-1303494f16f1" containerName="mariadb-account-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416682 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b40c85c-3938-405e-902c-c4ea5a19fe20" containerName="mariadb-account-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416693 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5bb8cbe-6922-4961-9327-f3711da41234" containerName="mariadb-account-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416702 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" containerName="neutron-api" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416712 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d58da1fb-cb7a-4b26-9753-317919c3d2c9" containerName="neutron-httpd" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416720 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b429a37f-69bf-4a7d-93c4-a3fa043b5f9b" containerName="mariadb-database-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416730 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f93e69-5601-45dc-a1f5-0e086d3dce5d" containerName="mariadb-database-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.416743 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f4c19ea-fd20-422d-a4c0-efce91c256fc" containerName="mariadb-database-create" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.417322 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.419779 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.420016 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hx2pq" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.420183 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.434895 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z54x2"] Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.540311 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-scripts\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.540379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.540441 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-config-data\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.540515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7zpr\" (UniqueName: \"kubernetes.io/projected/041f321c-a19a-46ba-83e0-5934dd806565-kube-api-access-v7zpr\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.551656 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.559990 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.642044 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-scripts\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.642114 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.642186 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-config-data\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.642289 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7zpr\" (UniqueName: \"kubernetes.io/projected/041f321c-a19a-46ba-83e0-5934dd806565-kube-api-access-v7zpr\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.649336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.649640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-scripts\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.653267 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-config-data\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.665438 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7zpr\" (UniqueName: \"kubernetes.io/projected/041f321c-a19a-46ba-83e0-5934dd806565-kube-api-access-v7zpr\") pod \"nova-cell0-conductor-db-sync-z54x2\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:30 crc kubenswrapper[4743]: I1122 08:43:30.736417 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:31 crc kubenswrapper[4743]: I1122 08:43:31.216095 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z54x2"] Nov 22 08:43:31 crc kubenswrapper[4743]: W1122 08:43:31.218466 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod041f321c_a19a_46ba_83e0_5934dd806565.slice/crio-600b1cf30d654205deca311afcce5ee1406bb8b0718b2a43287436daaaaf92ed WatchSource:0}: Error finding container 600b1cf30d654205deca311afcce5ee1406bb8b0718b2a43287436daaaaf92ed: Status 404 returned error can't find the container with id 600b1cf30d654205deca311afcce5ee1406bb8b0718b2a43287436daaaaf92ed Nov 22 08:43:32 crc kubenswrapper[4743]: I1122 08:43:32.121179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z54x2" event={"ID":"041f321c-a19a-46ba-83e0-5934dd806565","Type":"ContainerStarted","Data":"600b1cf30d654205deca311afcce5ee1406bb8b0718b2a43287436daaaaf92ed"} Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.139168 4743 generic.go:334] "Generic (PLEG): container finished" podID="63d49750-ff45-4b82-a623-3141ba782527" containerID="f44445ddf83e1d514c434eaab5bfaedd8290638a60c11187f376f64995033089" exitCode=0 Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.139209 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d49750-ff45-4b82-a623-3141ba782527","Type":"ContainerDied","Data":"f44445ddf83e1d514c434eaab5bfaedd8290638a60c11187f376f64995033089"} Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.269410 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.392704 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-log-httpd\") pod \"63d49750-ff45-4b82-a623-3141ba782527\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.393063 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwh2k\" (UniqueName: \"kubernetes.io/projected/63d49750-ff45-4b82-a623-3141ba782527-kube-api-access-fwh2k\") pod \"63d49750-ff45-4b82-a623-3141ba782527\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.393088 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-config-data\") pod \"63d49750-ff45-4b82-a623-3141ba782527\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.393140 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-sg-core-conf-yaml\") pod \"63d49750-ff45-4b82-a623-3141ba782527\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.393180 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-scripts\") pod \"63d49750-ff45-4b82-a623-3141ba782527\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.393204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-combined-ca-bundle\") pod \"63d49750-ff45-4b82-a623-3141ba782527\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.393236 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-run-httpd\") pod \"63d49750-ff45-4b82-a623-3141ba782527\" (UID: \"63d49750-ff45-4b82-a623-3141ba782527\") " Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.393430 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "63d49750-ff45-4b82-a623-3141ba782527" (UID: "63d49750-ff45-4b82-a623-3141ba782527"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.393651 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "63d49750-ff45-4b82-a623-3141ba782527" (UID: "63d49750-ff45-4b82-a623-3141ba782527"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.393775 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.393791 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63d49750-ff45-4b82-a623-3141ba782527-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.398644 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-scripts" (OuterVolumeSpecName: "scripts") pod "63d49750-ff45-4b82-a623-3141ba782527" (UID: "63d49750-ff45-4b82-a623-3141ba782527"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.399165 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d49750-ff45-4b82-a623-3141ba782527-kube-api-access-fwh2k" (OuterVolumeSpecName: "kube-api-access-fwh2k") pod "63d49750-ff45-4b82-a623-3141ba782527" (UID: "63d49750-ff45-4b82-a623-3141ba782527"). InnerVolumeSpecName "kube-api-access-fwh2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.427259 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "63d49750-ff45-4b82-a623-3141ba782527" (UID: "63d49750-ff45-4b82-a623-3141ba782527"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.479623 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63d49750-ff45-4b82-a623-3141ba782527" (UID: "63d49750-ff45-4b82-a623-3141ba782527"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.495718 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwh2k\" (UniqueName: \"kubernetes.io/projected/63d49750-ff45-4b82-a623-3141ba782527-kube-api-access-fwh2k\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.495751 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.495760 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.495768 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.502267 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-config-data" (OuterVolumeSpecName: "config-data") pod "63d49750-ff45-4b82-a623-3141ba782527" (UID: "63d49750-ff45-4b82-a623-3141ba782527"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:33 crc kubenswrapper[4743]: I1122 08:43:33.597744 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d49750-ff45-4b82-a623-3141ba782527-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.152727 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63d49750-ff45-4b82-a623-3141ba782527","Type":"ContainerDied","Data":"38d208d8fcc1ee694b290fcfeda40bef40cbc7b669eb30d4926e138a73e2867b"} Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.152795 4743 scope.go:117] "RemoveContainer" containerID="feae721194d83926d45a14081fb0d68c94aeadc33de7a54c1a03d0740f0eda06" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.152809 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.191786 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.204502 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.213871 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:34 crc kubenswrapper[4743]: E1122 08:43:34.214342 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="proxy-httpd" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.214362 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="proxy-httpd" Nov 22 08:43:34 crc kubenswrapper[4743]: E1122 08:43:34.214375 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="ceilometer-central-agent" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.214381 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="ceilometer-central-agent" Nov 22 08:43:34 crc kubenswrapper[4743]: E1122 08:43:34.214392 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="sg-core" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.214398 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="sg-core" Nov 22 08:43:34 crc kubenswrapper[4743]: E1122 08:43:34.214417 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="ceilometer-notification-agent" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.214422 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="ceilometer-notification-agent" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.214638 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="ceilometer-notification-agent" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.214656 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="proxy-httpd" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.214665 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="sg-core" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.214676 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d49750-ff45-4b82-a623-3141ba782527" containerName="ceilometer-central-agent" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.216300 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.220084 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.220303 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.239470 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.310308 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-config-data\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.310463 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-run-httpd\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.310519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsbtf\" (UniqueName: \"kubernetes.io/projected/9e98e6ad-ce29-4f91-868c-975859995174-kube-api-access-tsbtf\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.310642 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.310697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.310731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-log-httpd\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.310769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-scripts\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.412482 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsbtf\" (UniqueName: \"kubernetes.io/projected/9e98e6ad-ce29-4f91-868c-975859995174-kube-api-access-tsbtf\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: 
I1122 08:43:34.412635 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.412708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.412758 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-log-httpd\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.412791 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-scripts\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.412832 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-config-data\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.412884 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-run-httpd\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.413593 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-log-httpd\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.413603 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-run-httpd\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.419827 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-config-data\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.420734 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-scripts\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.425277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.432215 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsbtf\" (UniqueName: \"kubernetes.io/projected/9e98e6ad-ce29-4f91-868c-975859995174-kube-api-access-tsbtf\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.435367 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " pod="openstack/ceilometer-0" Nov 22 08:43:34 crc kubenswrapper[4743]: I1122 08:43:34.543979 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:35 crc kubenswrapper[4743]: I1122 08:43:35.162957 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d49750-ff45-4b82-a623-3141ba782527" path="/var/lib/kubelet/pods/63d49750-ff45-4b82-a623-3141ba782527/volumes" Nov 22 08:43:36 crc kubenswrapper[4743]: I1122 08:43:36.195539 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:38 crc kubenswrapper[4743]: I1122 08:43:38.210569 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:43:38 crc kubenswrapper[4743]: I1122 08:43:38.211396 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" containerName="glance-log" containerID="cri-o://c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7" gracePeriod=30 Nov 22 08:43:38 crc kubenswrapper[4743]: I1122 08:43:38.211889 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" containerName="glance-httpd" containerID="cri-o://a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce" gracePeriod=30 Nov 22 08:43:38 crc kubenswrapper[4743]: I1122 08:43:38.897911 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:43:38 crc kubenswrapper[4743]: I1122 08:43:38.898595 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f3494746-cd7f-4497-b123-6ca7196e6480" containerName="glance-log" containerID="cri-o://6d178b1ea7a4e098244482ffdc9fe6d4b9d77eeab15b91ba026d2d050bfa9a72" gracePeriod=30 Nov 22 08:43:38 crc kubenswrapper[4743]: I1122 08:43:38.899082 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f3494746-cd7f-4497-b123-6ca7196e6480" containerName="glance-httpd" containerID="cri-o://6de714267c676d7c85c60c654a52b66ec288ebaa622fecd0837430e2a8ee8f23" gracePeriod=30 Nov 22 08:43:39 crc kubenswrapper[4743]: I1122 08:43:39.015383 4743 scope.go:117] "RemoveContainer" containerID="340c52ef76ee165648f284d7add1bddb1ee48ee0ab8cd09485340771b0178b45" Nov 22 08:43:39 crc kubenswrapper[4743]: I1122 08:43:39.075314 4743 scope.go:117] "RemoveContainer" 
containerID="14b4171a89f85e941f20bb4503eb57f8651913716868b7f1d2b75320cdb4186b" Nov 22 08:43:39 crc kubenswrapper[4743]: I1122 08:43:39.199834 4743 scope.go:117] "RemoveContainer" containerID="f44445ddf83e1d514c434eaab5bfaedd8290638a60c11187f376f64995033089" Nov 22 08:43:39 crc kubenswrapper[4743]: I1122 08:43:39.263822 4743 generic.go:334] "Generic (PLEG): container finished" podID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" containerID="c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7" exitCode=143 Nov 22 08:43:39 crc kubenswrapper[4743]: I1122 08:43:39.263916 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a2dcda9-85a3-4b08-a32b-14710e0a3b55","Type":"ContainerDied","Data":"c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7"} Nov 22 08:43:39 crc kubenswrapper[4743]: I1122 08:43:39.266257 4743 generic.go:334] "Generic (PLEG): container finished" podID="f3494746-cd7f-4497-b123-6ca7196e6480" containerID="6d178b1ea7a4e098244482ffdc9fe6d4b9d77eeab15b91ba026d2d050bfa9a72" exitCode=143 Nov 22 08:43:39 crc kubenswrapper[4743]: I1122 08:43:39.266304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3494746-cd7f-4497-b123-6ca7196e6480","Type":"ContainerDied","Data":"6d178b1ea7a4e098244482ffdc9fe6d4b9d77eeab15b91ba026d2d050bfa9a72"} Nov 22 08:43:39 crc kubenswrapper[4743]: I1122 08:43:39.545487 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:39 crc kubenswrapper[4743]: W1122 08:43:39.551815 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e98e6ad_ce29_4f91_868c_975859995174.slice/crio-15c9d8f35ec773a5772ed1a93dd502f17ed5c26bc0bc4a505eb8b003e283ef04 WatchSource:0}: Error finding container 15c9d8f35ec773a5772ed1a93dd502f17ed5c26bc0bc4a505eb8b003e283ef04: Status 404 returned error can't find the container with id 15c9d8f35ec773a5772ed1a93dd502f17ed5c26bc0bc4a505eb8b003e283ef04 Nov 22 08:43:40 crc kubenswrapper[4743]: I1122 08:43:40.275834 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z54x2" event={"ID":"041f321c-a19a-46ba-83e0-5934dd806565","Type":"ContainerStarted","Data":"83164b2e658bb9ac77208bcdab8d7ea5bcddd9ddb221ef2bb7c6d22ed509bf07"} Nov 22 08:43:40 crc kubenswrapper[4743]: I1122 08:43:40.277993 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e98e6ad-ce29-4f91-868c-975859995174","Type":"ContainerStarted","Data":"6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997"} Nov 22 08:43:40 crc kubenswrapper[4743]: I1122 08:43:40.278048 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e98e6ad-ce29-4f91-868c-975859995174","Type":"ContainerStarted","Data":"15c9d8f35ec773a5772ed1a93dd502f17ed5c26bc0bc4a505eb8b003e283ef04"} Nov 22 08:43:40 crc kubenswrapper[4743]: I1122 08:43:40.295653 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-z54x2" podStartSLOduration=2.419125972 podStartE2EDuration="10.295634047s" podCreationTimestamp="2025-11-22 08:43:30 +0000 UTC" firstStartedPulling="2025-11-22 08:43:31.226047424 +0000 UTC m=+1284.932408476" lastFinishedPulling="2025-11-22 08:43:39.102555499 +0000 UTC m=+1292.808916551" observedRunningTime="2025-11-22 08:43:40.289739377 +0000 UTC 
m=+1293.996100429" watchObservedRunningTime="2025-11-22 08:43:40.295634047 +0000 UTC m=+1294.001995099" Nov 22 08:43:41 crc kubenswrapper[4743]: I1122 08:43:41.294504 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e98e6ad-ce29-4f91-868c-975859995174","Type":"ContainerStarted","Data":"80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d"} Nov 22 08:43:41 crc kubenswrapper[4743]: I1122 08:43:41.920879 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.049764 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-combined-ca-bundle\") pod \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.049858 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-logs\") pod \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.049917 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vdnp\" (UniqueName: \"kubernetes.io/projected/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-kube-api-access-7vdnp\") pod \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.049997 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-internal-tls-certs\") pod \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.050038 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-scripts\") pod \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.050073 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-httpd-run\") pod \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.050093 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.050130 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-config-data\") pod \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\" (UID: \"8a2dcda9-85a3-4b08-a32b-14710e0a3b55\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.050625 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-logs" (OuterVolumeSpecName: "logs") pod "8a2dcda9-85a3-4b08-a32b-14710e0a3b55" (UID: "8a2dcda9-85a3-4b08-a32b-14710e0a3b55"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.051288 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a2dcda9-85a3-4b08-a32b-14710e0a3b55" (UID: "8a2dcda9-85a3-4b08-a32b-14710e0a3b55"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.057759 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-kube-api-access-7vdnp" (OuterVolumeSpecName: "kube-api-access-7vdnp") pod "8a2dcda9-85a3-4b08-a32b-14710e0a3b55" (UID: "8a2dcda9-85a3-4b08-a32b-14710e0a3b55"). InnerVolumeSpecName "kube-api-access-7vdnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.057998 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-scripts" (OuterVolumeSpecName: "scripts") pod "8a2dcda9-85a3-4b08-a32b-14710e0a3b55" (UID: "8a2dcda9-85a3-4b08-a32b-14710e0a3b55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.077536 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "8a2dcda9-85a3-4b08-a32b-14710e0a3b55" (UID: "8a2dcda9-85a3-4b08-a32b-14710e0a3b55"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.097721 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a2dcda9-85a3-4b08-a32b-14710e0a3b55" (UID: "8a2dcda9-85a3-4b08-a32b-14710e0a3b55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.129051 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8a2dcda9-85a3-4b08-a32b-14710e0a3b55" (UID: "8a2dcda9-85a3-4b08-a32b-14710e0a3b55"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.143191 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-config-data" (OuterVolumeSpecName: "config-data") pod "8a2dcda9-85a3-4b08-a32b-14710e0a3b55" (UID: "8a2dcda9-85a3-4b08-a32b-14710e0a3b55"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.152508 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vdnp\" (UniqueName: \"kubernetes.io/projected/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-kube-api-access-7vdnp\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.152538 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.152549 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.152558 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.152602 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.152612 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.152621 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.152629 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a2dcda9-85a3-4b08-a32b-14710e0a3b55-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.175930 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.254843 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.308081 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e98e6ad-ce29-4f91-868c-975859995174","Type":"ContainerStarted","Data":"3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04"} Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.311312 4743 generic.go:334] "Generic (PLEG): container finished" podID="f3494746-cd7f-4497-b123-6ca7196e6480" containerID="6de714267c676d7c85c60c654a52b66ec288ebaa622fecd0837430e2a8ee8f23" exitCode=0 Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.311368 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3494746-cd7f-4497-b123-6ca7196e6480","Type":"ContainerDied","Data":"6de714267c676d7c85c60c654a52b66ec288ebaa622fecd0837430e2a8ee8f23"} Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.314318 
4743 generic.go:334] "Generic (PLEG): container finished" podID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" containerID="a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce" exitCode=0 Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.314365 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a2dcda9-85a3-4b08-a32b-14710e0a3b55","Type":"ContainerDied","Data":"a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce"} Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.314387 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.314391 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a2dcda9-85a3-4b08-a32b-14710e0a3b55","Type":"ContainerDied","Data":"8f3c3d1c229c349a1948a71ce5fb20da18b7e832eaf187d6d0feaeebfd103bd3"} Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.314420 4743 scope.go:117] "RemoveContainer" containerID="a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.358506 4743 scope.go:117] "RemoveContainer" containerID="c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.359299 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.398636 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.417542 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:43:42 crc kubenswrapper[4743]: E1122 08:43:42.418110 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" containerName="glance-httpd" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.418132 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" containerName="glance-httpd" Nov 22 08:43:42 crc kubenswrapper[4743]: E1122 08:43:42.418169 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" containerName="glance-log" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.418175 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" containerName="glance-log" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.418462 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" containerName="glance-httpd" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.418482 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" containerName="glance-log" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.420595 4743 scope.go:117] "RemoveContainer" containerID="a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.421081 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.421203 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: E1122 08:43:42.421600 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce\": container with ID starting with a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce not found: ID does not exist" containerID="a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.421641 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce"} err="failed to get container status \"a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce\": rpc error: code = NotFound desc = could not find container \"a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce\": container with ID starting with a4f1871e55f09c6056d3de79addbc253f052e8e93388a3a949cca5c62565b5ce not found: ID does not exist" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.421670 4743 scope.go:117] "RemoveContainer" containerID="c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7" Nov 22 08:43:42 crc kubenswrapper[4743]: E1122 08:43:42.424387 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7\": container with ID starting with c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7 not found: ID does not exist" containerID="c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.424431 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7"} err="failed to get container status \"c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7\": rpc error: code = NotFound desc = could not find container \"c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7\": container with ID starting with c8bae9d769b40695d7c7e7dad04adaecc471cea3389423cf6bb1b37c11265ce7 not found: ID does not exist" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.424471 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.424538 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.458288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.458334 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-logs\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc 
kubenswrapper[4743]: I1122 08:43:42.458392 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.458482 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-954qj\" (UniqueName: \"kubernetes.io/projected/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-kube-api-access-954qj\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.458504 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.458563 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.458598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.458712 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.560397 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.560450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.560473 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 
crc kubenswrapper[4743]: I1122 08:43:42.560922 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.560940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-logs\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.560986 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.561081 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-954qj\" (UniqueName: \"kubernetes.io/projected/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-kube-api-access-954qj\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.561104 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.563181 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.563246 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-logs\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.563404 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.566480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.566491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.571254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.573557 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.581642 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-954qj\" (UniqueName: \"kubernetes.io/projected/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-kube-api-access-954qj\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.626158 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.705903 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.761494 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.764551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f3494746-cd7f-4497-b123-6ca7196e6480\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.764618 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sldp4\" (UniqueName: \"kubernetes.io/projected/f3494746-cd7f-4497-b123-6ca7196e6480-kube-api-access-sldp4\") pod \"f3494746-cd7f-4497-b123-6ca7196e6480\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.764667 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-logs\") pod \"f3494746-cd7f-4497-b123-6ca7196e6480\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.764808 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-httpd-run\") pod \"f3494746-cd7f-4497-b123-6ca7196e6480\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.764836 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-config-data\") pod \"f3494746-cd7f-4497-b123-6ca7196e6480\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.764879 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-combined-ca-bundle\") pod \"f3494746-cd7f-4497-b123-6ca7196e6480\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.764902 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-scripts\") pod \"f3494746-cd7f-4497-b123-6ca7196e6480\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.764970 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-public-tls-certs\") pod \"f3494746-cd7f-4497-b123-6ca7196e6480\" (UID: \"f3494746-cd7f-4497-b123-6ca7196e6480\") " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.766432 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-logs" (OuterVolumeSpecName: "logs") pod "f3494746-cd7f-4497-b123-6ca7196e6480" (UID: "f3494746-cd7f-4497-b123-6ca7196e6480"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.767039 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f3494746-cd7f-4497-b123-6ca7196e6480" (UID: "f3494746-cd7f-4497-b123-6ca7196e6480"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.772020 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3494746-cd7f-4497-b123-6ca7196e6480-kube-api-access-sldp4" (OuterVolumeSpecName: "kube-api-access-sldp4") pod "f3494746-cd7f-4497-b123-6ca7196e6480" (UID: "f3494746-cd7f-4497-b123-6ca7196e6480"). InnerVolumeSpecName "kube-api-access-sldp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.772037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "f3494746-cd7f-4497-b123-6ca7196e6480" (UID: "f3494746-cd7f-4497-b123-6ca7196e6480"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.775955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-scripts" (OuterVolumeSpecName: "scripts") pod "f3494746-cd7f-4497-b123-6ca7196e6480" (UID: "f3494746-cd7f-4497-b123-6ca7196e6480"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.826776 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3494746-cd7f-4497-b123-6ca7196e6480" (UID: "f3494746-cd7f-4497-b123-6ca7196e6480"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.855528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-config-data" (OuterVolumeSpecName: "config-data") pod "f3494746-cd7f-4497-b123-6ca7196e6480" (UID: "f3494746-cd7f-4497-b123-6ca7196e6480"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.870688 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.870714 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sldp4\" (UniqueName: \"kubernetes.io/projected/f3494746-cd7f-4497-b123-6ca7196e6480-kube-api-access-sldp4\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.870726 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.870734 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f3494746-cd7f-4497-b123-6ca7196e6480-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.870744 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.870753 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.870761 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.887080 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f3494746-cd7f-4497-b123-6ca7196e6480" (UID: "f3494746-cd7f-4497-b123-6ca7196e6480"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.903679 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.972274 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3494746-cd7f-4497-b123-6ca7196e6480-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:42 crc kubenswrapper[4743]: I1122 08:43:42.972311 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.166079 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a2dcda9-85a3-4b08-a32b-14710e0a3b55" path="/var/lib/kubelet/pods/8a2dcda9-85a3-4b08-a32b-14710e0a3b55/volumes" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.329971 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.338033 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f3494746-cd7f-4497-b123-6ca7196e6480","Type":"ContainerDied","Data":"c5c798f83a478c33ffbeeacfb2c41f29a486eee82fdde21e145333640afea8b9"} Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.338084 4743 scope.go:117] "RemoveContainer" containerID="6de714267c676d7c85c60c654a52b66ec288ebaa622fecd0837430e2a8ee8f23" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.338206 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.393082 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.398715 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.413688 4743 scope.go:117] "RemoveContainer" containerID="6d178b1ea7a4e098244482ffdc9fe6d4b9d77eeab15b91ba026d2d050bfa9a72" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.421763 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:43:43 crc kubenswrapper[4743]: E1122 08:43:43.422503 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3494746-cd7f-4497-b123-6ca7196e6480" containerName="glance-log" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.422520 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3494746-cd7f-4497-b123-6ca7196e6480" containerName="glance-log" Nov 22 08:43:43 crc kubenswrapper[4743]: E1122 08:43:43.422538 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3494746-cd7f-4497-b123-6ca7196e6480" containerName="glance-httpd" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.422544 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3494746-cd7f-4497-b123-6ca7196e6480" containerName="glance-httpd" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.422753 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3494746-cd7f-4497-b123-6ca7196e6480" containerName="glance-httpd" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.422775 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3494746-cd7f-4497-b123-6ca7196e6480" containerName="glance-log" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.424286 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.430466 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.430616 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.434057 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.583449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.583662 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gm9q\" (UniqueName: \"kubernetes.io/projected/c61760fb-827b-4199-bfdb-52698c7b4824-kube-api-access-5gm9q\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.583740 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.583800 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-logs\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.583888 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.584069 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-scripts\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.584113 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.584161 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-config-data\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.685764 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.685840 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gm9q\" (UniqueName: \"kubernetes.io/projected/c61760fb-827b-4199-bfdb-52698c7b4824-kube-api-access-5gm9q\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.685865 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.685910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-logs\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.686666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.686705 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-scripts\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.686726 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.686725 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-logs\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.686742 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-config-data\") 
pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.687008 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.688896 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.698402 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.698685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.699028 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-config-data\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.705777 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-scripts\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.718963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gm9q\" (UniqueName: \"kubernetes.io/projected/c61760fb-827b-4199-bfdb-52698c7b4824-kube-api-access-5gm9q\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.728035 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " pod="openstack/glance-default-external-api-0" Nov 22 08:43:43 crc kubenswrapper[4743]: I1122 08:43:43.758123 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:43:44 crc kubenswrapper[4743]: I1122 08:43:44.315532 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 08:43:44 crc kubenswrapper[4743]: I1122 08:43:44.377768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dca6d95c-89d6-4b49-bf28-2606b9b5c05e","Type":"ContainerStarted","Data":"fe08eff973531e6f3659274cc446f032181fff809dc4a40cc87a8af36f126183"} Nov 22 08:43:44 crc kubenswrapper[4743]: I1122 08:43:44.379965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c61760fb-827b-4199-bfdb-52698c7b4824","Type":"ContainerStarted","Data":"46b71e1b48f9c7665baab713cd7dc6fa5b9dc43ef77f56a297a7f8b0a42d5cf6"} Nov 22 08:43:45 crc kubenswrapper[4743]: I1122 08:43:45.188248 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3494746-cd7f-4497-b123-6ca7196e6480" path="/var/lib/kubelet/pods/f3494746-cd7f-4497-b123-6ca7196e6480/volumes" Nov 22 08:43:45 crc kubenswrapper[4743]: I1122 08:43:45.391078 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dca6d95c-89d6-4b49-bf28-2606b9b5c05e","Type":"ContainerStarted","Data":"d713e66a35891819a155186b552565a296254b8c93475b9aa0a54b55dd7cbd38"} Nov 22 08:43:46 crc kubenswrapper[4743]: I1122 08:43:46.404814 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c61760fb-827b-4199-bfdb-52698c7b4824","Type":"ContainerStarted","Data":"d85aa17d800ad1cbc94e2aaf79a94f094b2da7ff02061d9a5cc19841c8f58bb3"} Nov 22 08:43:46 crc kubenswrapper[4743]: I1122 08:43:46.407808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dca6d95c-89d6-4b49-bf28-2606b9b5c05e","Type":"ContainerStarted","Data":"6671bb99b39fc16a0f6c253ac0e494e49254030b8ca083c6f60cb786f074a063"} Nov 22 08:43:46 crc kubenswrapper[4743]: I1122 08:43:46.432935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e98e6ad-ce29-4f91-868c-975859995174","Type":"ContainerStarted","Data":"3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889"} Nov 22 08:43:46 crc kubenswrapper[4743]: I1122 08:43:46.433160 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="ceilometer-central-agent" containerID="cri-o://6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997" gracePeriod=30 Nov 22 08:43:46 crc kubenswrapper[4743]: I1122 08:43:46.433263 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 08:43:46 crc kubenswrapper[4743]: I1122 08:43:46.433306 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="proxy-httpd" containerID="cri-o://3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889" gracePeriod=30 Nov 22 08:43:46 crc kubenswrapper[4743]: I1122 08:43:46.433397 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="sg-core" containerID="cri-o://3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04" 
gracePeriod=30 Nov 22 08:43:46 crc kubenswrapper[4743]: I1122 08:43:46.433810 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="ceilometer-notification-agent" containerID="cri-o://80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d" gracePeriod=30 Nov 22 08:43:46 crc kubenswrapper[4743]: I1122 08:43:46.435124 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.435103268 podStartE2EDuration="4.435103268s" podCreationTimestamp="2025-11-22 08:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:43:46.427818748 +0000 UTC m=+1300.134179810" watchObservedRunningTime="2025-11-22 08:43:46.435103268 +0000 UTC m=+1300.141464320" Nov 22 08:43:46 crc kubenswrapper[4743]: I1122 08:43:46.463841 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.207368722 podStartE2EDuration="12.463820847s" podCreationTimestamp="2025-11-22 08:43:34 +0000 UTC" firstStartedPulling="2025-11-22 08:43:39.55425502 +0000 UTC m=+1293.260616072" lastFinishedPulling="2025-11-22 08:43:44.810707145 +0000 UTC m=+1298.517068197" observedRunningTime="2025-11-22 08:43:46.45769575 +0000 UTC m=+1300.164056802" watchObservedRunningTime="2025-11-22 08:43:46.463820847 +0000 UTC m=+1300.170181899" Nov 22 08:43:47 crc kubenswrapper[4743]: I1122 08:43:47.443736 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c61760fb-827b-4199-bfdb-52698c7b4824","Type":"ContainerStarted","Data":"f148d19bec9da2034a614aa3685da5500ee102ac2162e404c9df6a8dd6001346"} Nov 22 08:43:47 crc kubenswrapper[4743]: I1122 08:43:47.446715 4743 generic.go:334] "Generic (PLEG): container finished" podID="9e98e6ad-ce29-4f91-868c-975859995174" containerID="3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889" exitCode=0 Nov 22 08:43:47 crc kubenswrapper[4743]: I1122 08:43:47.446743 4743 generic.go:334] "Generic (PLEG): container finished" podID="9e98e6ad-ce29-4f91-868c-975859995174" containerID="3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04" exitCode=2 Nov 22 08:43:47 crc kubenswrapper[4743]: I1122 08:43:47.446753 4743 generic.go:334] "Generic (PLEG): container finished" podID="9e98e6ad-ce29-4f91-868c-975859995174" containerID="80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d" exitCode=0 Nov 22 08:43:47 crc kubenswrapper[4743]: I1122 08:43:47.446741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e98e6ad-ce29-4f91-868c-975859995174","Type":"ContainerDied","Data":"3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889"} Nov 22 08:43:47 crc kubenswrapper[4743]: I1122 08:43:47.446779 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e98e6ad-ce29-4f91-868c-975859995174","Type":"ContainerDied","Data":"3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04"} Nov 22 08:43:47 crc kubenswrapper[4743]: I1122 08:43:47.446792 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e98e6ad-ce29-4f91-868c-975859995174","Type":"ContainerDied","Data":"80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d"} Nov 22 08:43:48 crc 
kubenswrapper[4743]: I1122 08:43:48.480458 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.480437144 podStartE2EDuration="5.480437144s" podCreationTimestamp="2025-11-22 08:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:43:48.474672858 +0000 UTC m=+1302.181033910" watchObservedRunningTime="2025-11-22 08:43:48.480437144 +0000 UTC m=+1302.186798196" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.466955 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.521697 4743 generic.go:334] "Generic (PLEG): container finished" podID="9e98e6ad-ce29-4f91-868c-975859995174" containerID="6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997" exitCode=0 Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.521786 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.521847 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e98e6ad-ce29-4f91-868c-975859995174","Type":"ContainerDied","Data":"6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997"} Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.522193 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9e98e6ad-ce29-4f91-868c-975859995174","Type":"ContainerDied","Data":"15c9d8f35ec773a5772ed1a93dd502f17ed5c26bc0bc4a505eb8b003e283ef04"} Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.522257 4743 scope.go:117] "RemoveContainer" containerID="3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.549755 4743 scope.go:117] "RemoveContainer" containerID="3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.571182 4743 scope.go:117] "RemoveContainer" containerID="80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.590022 4743 scope.go:117] "RemoveContainer" containerID="6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.609288 4743 scope.go:117] "RemoveContainer" containerID="3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889" Nov 22 08:43:52 crc kubenswrapper[4743]: E1122 08:43:52.609922 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889\": container with ID starting with 3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889 not found: ID does not exist" containerID="3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.609966 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889"} err="failed to get container status \"3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889\": rpc error: code = NotFound desc = could not find container 
\"3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889\": container with ID starting with 3dbf8426be51d6ff33652bb968a84eef6c5aa2dc7c1c1a9911eb1e18ac7a3889 not found: ID does not exist" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.609995 4743 scope.go:117] "RemoveContainer" containerID="3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04" Nov 22 08:43:52 crc kubenswrapper[4743]: E1122 08:43:52.610249 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04\": container with ID starting with 3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04 not found: ID does not exist" containerID="3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.610285 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04"} err="failed to get container status \"3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04\": rpc error: code = NotFound desc = could not find container \"3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04\": container with ID starting with 3aa20263fce5b0cf1f1fa38a041046051252ecb6a862adba4f7c5e83744d5d04 not found: ID does not exist" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.610306 4743 scope.go:117] "RemoveContainer" containerID="80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d" Nov 22 08:43:52 crc kubenswrapper[4743]: E1122 08:43:52.610623 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d\": container with ID starting with 80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d not found: ID does not exist" containerID="80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.610666 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d"} err="failed to get container status \"80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d\": rpc error: code = NotFound desc = could not find container \"80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d\": container with ID starting with 80b9052d25792ce6a18f665ff8fff6961473777fc4c5504b325e8a2e3508644d not found: ID does not exist" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.610692 4743 scope.go:117] "RemoveContainer" containerID="6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997" Nov 22 08:43:52 crc kubenswrapper[4743]: E1122 08:43:52.610993 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997\": container with ID starting with 6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997 not found: ID does not exist" containerID="6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.611025 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997"} 
err="failed to get container status \"6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997\": rpc error: code = NotFound desc = could not find container \"6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997\": container with ID starting with 6176cf609ed6f918583a2612d81ac5a036ea5a0cf84ecc242dccc4943af69997 not found: ID does not exist" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.639062 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-scripts\") pod \"9e98e6ad-ce29-4f91-868c-975859995174\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.639127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-log-httpd\") pod \"9e98e6ad-ce29-4f91-868c-975859995174\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.639156 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsbtf\" (UniqueName: \"kubernetes.io/projected/9e98e6ad-ce29-4f91-868c-975859995174-kube-api-access-tsbtf\") pod \"9e98e6ad-ce29-4f91-868c-975859995174\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.639200 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-sg-core-conf-yaml\") pod \"9e98e6ad-ce29-4f91-868c-975859995174\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.639300 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-combined-ca-bundle\") pod \"9e98e6ad-ce29-4f91-868c-975859995174\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.639377 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-run-httpd\") pod \"9e98e6ad-ce29-4f91-868c-975859995174\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.639422 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-config-data\") pod \"9e98e6ad-ce29-4f91-868c-975859995174\" (UID: \"9e98e6ad-ce29-4f91-868c-975859995174\") " Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.639744 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9e98e6ad-ce29-4f91-868c-975859995174" (UID: "9e98e6ad-ce29-4f91-868c-975859995174"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.639900 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.640202 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9e98e6ad-ce29-4f91-868c-975859995174" (UID: "9e98e6ad-ce29-4f91-868c-975859995174"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.645763 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-scripts" (OuterVolumeSpecName: "scripts") pod "9e98e6ad-ce29-4f91-868c-975859995174" (UID: "9e98e6ad-ce29-4f91-868c-975859995174"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.646602 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e98e6ad-ce29-4f91-868c-975859995174-kube-api-access-tsbtf" (OuterVolumeSpecName: "kube-api-access-tsbtf") pod "9e98e6ad-ce29-4f91-868c-975859995174" (UID: "9e98e6ad-ce29-4f91-868c-975859995174"). InnerVolumeSpecName "kube-api-access-tsbtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.671936 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9e98e6ad-ce29-4f91-868c-975859995174" (UID: "9e98e6ad-ce29-4f91-868c-975859995174"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.723275 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e98e6ad-ce29-4f91-868c-975859995174" (UID: "9e98e6ad-ce29-4f91-868c-975859995174"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.743792 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.743858 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9e98e6ad-ce29-4f91-868c-975859995174-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.743877 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsbtf\" (UniqueName: \"kubernetes.io/projected/9e98e6ad-ce29-4f91-868c-975859995174-kube-api-access-tsbtf\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.743895 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.743937 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.751467 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-config-data" (OuterVolumeSpecName: "config-data") pod "9e98e6ad-ce29-4f91-868c-975859995174" (UID: "9e98e6ad-ce29-4f91-868c-975859995174"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.762758 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.762809 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.793767 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.807764 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.846354 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e98e6ad-ce29-4f91-868c-975859995174-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.855652 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.863316 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.882858 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:52 crc kubenswrapper[4743]: E1122 08:43:52.883248 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="ceilometer-central-agent" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.883264 4743 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="ceilometer-central-agent" Nov 22 08:43:52 crc kubenswrapper[4743]: E1122 08:43:52.883295 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="proxy-httpd" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.883301 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="proxy-httpd" Nov 22 08:43:52 crc kubenswrapper[4743]: E1122 08:43:52.883319 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="ceilometer-notification-agent" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.883325 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="ceilometer-notification-agent" Nov 22 08:43:52 crc kubenswrapper[4743]: E1122 08:43:52.883344 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="sg-core" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.883353 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="sg-core" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.883511 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="proxy-httpd" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.883523 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="ceilometer-central-agent" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.883536 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="ceilometer-notification-agent" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.883550 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e98e6ad-ce29-4f91-868c-975859995174" containerName="sg-core" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.885144 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.890014 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.890323 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 08:43:52 crc kubenswrapper[4743]: I1122 08:43:52.893227 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.048934 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-scripts\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.049161 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fhqp\" (UniqueName: \"kubernetes.io/projected/b7610922-7bb6-4198-bc73-50dc7a220848-kube-api-access-6fhqp\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.049291 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-config-data\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.049389 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-run-httpd\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.049490 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.049604 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-log-httpd\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.049722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.151114 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-scripts\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.151364 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fhqp\" (UniqueName: \"kubernetes.io/projected/b7610922-7bb6-4198-bc73-50dc7a220848-kube-api-access-6fhqp\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.151484 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-config-data\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.151620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-run-httpd\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.151702 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.151803 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-log-httpd\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.151898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.152418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-log-httpd\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.154879 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-run-httpd\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.156778 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.157407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-config-data\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.158128 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.161426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-scripts\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.163612 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e98e6ad-ce29-4f91-868c-975859995174" path="/var/lib/kubelet/pods/9e98e6ad-ce29-4f91-868c-975859995174/volumes" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.172201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fhqp\" (UniqueName: \"kubernetes.io/projected/b7610922-7bb6-4198-bc73-50dc7a220848-kube-api-access-6fhqp\") pod \"ceilometer-0\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.219015 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.531847 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.531886 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.675414 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:43:53 crc kubenswrapper[4743]: W1122 08:43:53.678522 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7610922_7bb6_4198_bc73_50dc7a220848.slice/crio-5522dc0c9dcf1c1c84e9f2a8c4d01b7fdf7e54a35d3fc9257e431671f265e3d3 WatchSource:0}: Error finding container 5522dc0c9dcf1c1c84e9f2a8c4d01b7fdf7e54a35d3fc9257e431671f265e3d3: Status 404 returned error can't find the container with id 5522dc0c9dcf1c1c84e9f2a8c4d01b7fdf7e54a35d3fc9257e431671f265e3d3 Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.762371 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.762420 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.790771 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 08:43:53 crc kubenswrapper[4743]: I1122 08:43:53.812213 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 08:43:54 crc kubenswrapper[4743]: I1122 08:43:54.554009 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7610922-7bb6-4198-bc73-50dc7a220848","Type":"ContainerStarted","Data":"5522dc0c9dcf1c1c84e9f2a8c4d01b7fdf7e54a35d3fc9257e431671f265e3d3"} Nov 22 08:43:54 crc kubenswrapper[4743]: I1122 08:43:54.554411 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-external-api-0" Nov 22 08:43:54 crc kubenswrapper[4743]: I1122 08:43:54.554426 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 08:43:55 crc kubenswrapper[4743]: I1122 08:43:55.442409 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:55 crc kubenswrapper[4743]: I1122 08:43:55.447500 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 08:43:55 crc kubenswrapper[4743]: I1122 08:43:55.582254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7610922-7bb6-4198-bc73-50dc7a220848","Type":"ContainerStarted","Data":"b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462"} Nov 22 08:43:55 crc kubenswrapper[4743]: I1122 08:43:55.584276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7610922-7bb6-4198-bc73-50dc7a220848","Type":"ContainerStarted","Data":"fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e"} Nov 22 08:43:56 crc kubenswrapper[4743]: I1122 08:43:56.592288 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7610922-7bb6-4198-bc73-50dc7a220848","Type":"ContainerStarted","Data":"bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe"} Nov 22 08:43:56 crc kubenswrapper[4743]: I1122 08:43:56.596108 4743 generic.go:334] "Generic (PLEG): container finished" podID="041f321c-a19a-46ba-83e0-5934dd806565" containerID="83164b2e658bb9ac77208bcdab8d7ea5bcddd9ddb221ef2bb7c6d22ed509bf07" exitCode=0 Nov 22 08:43:56 crc kubenswrapper[4743]: I1122 08:43:56.596202 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z54x2" event={"ID":"041f321c-a19a-46ba-83e0-5934dd806565","Type":"ContainerDied","Data":"83164b2e658bb9ac77208bcdab8d7ea5bcddd9ddb221ef2bb7c6d22ed509bf07"} Nov 22 08:43:56 crc kubenswrapper[4743]: I1122 08:43:56.596330 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 08:43:56 crc kubenswrapper[4743]: I1122 08:43:56.596418 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 08:43:56 crc kubenswrapper[4743]: I1122 08:43:56.608399 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 08:43:56 crc kubenswrapper[4743]: I1122 08:43:56.723155 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 08:43:57 crc kubenswrapper[4743]: I1122 08:43:57.605398 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7610922-7bb6-4198-bc73-50dc7a220848","Type":"ContainerStarted","Data":"f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4"} Nov 22 08:43:57 crc kubenswrapper[4743]: I1122 08:43:57.606087 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 08:43:57 crc kubenswrapper[4743]: I1122 08:43:57.653092 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.099492067 podStartE2EDuration="5.653068753s" podCreationTimestamp="2025-11-22 08:43:52 +0000 UTC" firstStartedPulling="2025-11-22 08:43:53.680657695 +0000 UTC m=+1307.387018747" 
lastFinishedPulling="2025-11-22 08:43:57.234234381 +0000 UTC m=+1310.940595433" observedRunningTime="2025-11-22 08:43:57.626220798 +0000 UTC m=+1311.332581850" watchObservedRunningTime="2025-11-22 08:43:57.653068753 +0000 UTC m=+1311.359429795" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.043082 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.144132 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-scripts\") pod \"041f321c-a19a-46ba-83e0-5934dd806565\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.144722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-config-data\") pod \"041f321c-a19a-46ba-83e0-5934dd806565\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.144755 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7zpr\" (UniqueName: \"kubernetes.io/projected/041f321c-a19a-46ba-83e0-5934dd806565-kube-api-access-v7zpr\") pod \"041f321c-a19a-46ba-83e0-5934dd806565\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.144810 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-combined-ca-bundle\") pod \"041f321c-a19a-46ba-83e0-5934dd806565\" (UID: \"041f321c-a19a-46ba-83e0-5934dd806565\") " Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.151590 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/041f321c-a19a-46ba-83e0-5934dd806565-kube-api-access-v7zpr" (OuterVolumeSpecName: "kube-api-access-v7zpr") pod "041f321c-a19a-46ba-83e0-5934dd806565" (UID: "041f321c-a19a-46ba-83e0-5934dd806565"). InnerVolumeSpecName "kube-api-access-v7zpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.153438 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-scripts" (OuterVolumeSpecName: "scripts") pod "041f321c-a19a-46ba-83e0-5934dd806565" (UID: "041f321c-a19a-46ba-83e0-5934dd806565"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.179837 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-config-data" (OuterVolumeSpecName: "config-data") pod "041f321c-a19a-46ba-83e0-5934dd806565" (UID: "041f321c-a19a-46ba-83e0-5934dd806565"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.182869 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "041f321c-a19a-46ba-83e0-5934dd806565" (UID: "041f321c-a19a-46ba-83e0-5934dd806565"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.245787 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.245826 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7zpr\" (UniqueName: \"kubernetes.io/projected/041f321c-a19a-46ba-83e0-5934dd806565-kube-api-access-v7zpr\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.245842 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.245852 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/041f321c-a19a-46ba-83e0-5934dd806565-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.616351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-z54x2" event={"ID":"041f321c-a19a-46ba-83e0-5934dd806565","Type":"ContainerDied","Data":"600b1cf30d654205deca311afcce5ee1406bb8b0718b2a43287436daaaaf92ed"} Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.616735 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="600b1cf30d654205deca311afcce5ee1406bb8b0718b2a43287436daaaaf92ed" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.616433 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-z54x2" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.703114 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 08:43:58 crc kubenswrapper[4743]: E1122 08:43:58.703628 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="041f321c-a19a-46ba-83e0-5934dd806565" containerName="nova-cell0-conductor-db-sync" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.703650 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="041f321c-a19a-46ba-83e0-5934dd806565" containerName="nova-cell0-conductor-db-sync" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.703888 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="041f321c-a19a-46ba-83e0-5934dd806565" containerName="nova-cell0-conductor-db-sync" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.704670 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.708109 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hx2pq" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.710610 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.734069 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.857017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcjdh\" (UniqueName: \"kubernetes.io/projected/500679c5-1691-4831-b5ec-3c6cce19c503-kube-api-access-mcjdh\") pod \"nova-cell0-conductor-0\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.857211 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.857382 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.959632 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.959877 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcjdh\" (UniqueName: \"kubernetes.io/projected/500679c5-1691-4831-b5ec-3c6cce19c503-kube-api-access-mcjdh\") pod \"nova-cell0-conductor-0\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.959970 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.979488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.980900 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:58 crc kubenswrapper[4743]: I1122 08:43:58.982941 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcjdh\" (UniqueName: \"kubernetes.io/projected/500679c5-1691-4831-b5ec-3c6cce19c503-kube-api-access-mcjdh\") pod \"nova-cell0-conductor-0\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:59 crc kubenswrapper[4743]: I1122 08:43:59.020356 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 08:43:59 crc kubenswrapper[4743]: I1122 08:43:59.496281 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 08:43:59 crc kubenswrapper[4743]: W1122 08:43:59.501496 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod500679c5_1691_4831_b5ec_3c6cce19c503.slice/crio-a20de782e28e0e8379c6797d783a4046209389085a5974006ff7b1c59c9dc05c WatchSource:0}: Error finding container a20de782e28e0e8379c6797d783a4046209389085a5974006ff7b1c59c9dc05c: Status 404 returned error can't find the container with id a20de782e28e0e8379c6797d783a4046209389085a5974006ff7b1c59c9dc05c Nov 22 08:43:59 crc kubenswrapper[4743]: I1122 08:43:59.624249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"500679c5-1691-4831-b5ec-3c6cce19c503","Type":"ContainerStarted","Data":"a20de782e28e0e8379c6797d783a4046209389085a5974006ff7b1c59c9dc05c"} Nov 22 08:44:00 crc kubenswrapper[4743]: I1122 08:44:00.634106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"500679c5-1691-4831-b5ec-3c6cce19c503","Type":"ContainerStarted","Data":"aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8"} Nov 22 08:44:00 crc kubenswrapper[4743]: I1122 08:44:00.634458 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 22 08:44:00 crc kubenswrapper[4743]: I1122 08:44:00.660660 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.660634814 podStartE2EDuration="2.660634814s" podCreationTimestamp="2025-11-22 08:43:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:00.650565163 +0000 UTC m=+1314.356926235" watchObservedRunningTime="2025-11-22 08:44:00.660634814 +0000 UTC m=+1314.366995896" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.048773 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.487362 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-8hnw7"] Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.488835 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.500886 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.501123 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.501765 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8hnw7"] Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.564258 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-config-data\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.564333 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.564371 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-scripts\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.564414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87jpn\" (UniqueName: \"kubernetes.io/projected/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-kube-api-access-87jpn\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.668763 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-config-data\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.668870 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.668947 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-scripts\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.669054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87jpn\" (UniqueName: 
\"kubernetes.io/projected/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-kube-api-access-87jpn\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.677643 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.683648 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-scripts\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.694565 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-config-data\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.697302 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.698985 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.707791 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.714375 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87jpn\" (UniqueName: \"kubernetes.io/projected/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-kube-api-access-87jpn\") pod \"nova-cell0-cell-mapping-8hnw7\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.726778 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.743825 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.763802 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.774937 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.775860 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-logs\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.775900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.775923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtblt\" (UniqueName: \"kubernetes.io/projected/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-kube-api-access-vtblt\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.775977 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.776003 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgbm7\" (UniqueName: \"kubernetes.io/projected/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-kube-api-access-tgbm7\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.776047 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-config-data\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.776099 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-config-data\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.776118 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-logs\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.792209 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:04 crc 
kubenswrapper[4743]: I1122 08:44:04.830303 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.831457 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.845675 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.851352 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.853750 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.881139 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-config-data\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.881469 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-config-data\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.881601 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-logs\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.881710 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-logs\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.881808 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.881915 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.882020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtblt\" (UniqueName: \"kubernetes.io/projected/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-kube-api-access-vtblt\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.882150 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.882313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.882413 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzvw4\" (UniqueName: \"kubernetes.io/projected/b4191582-06b4-46bb-be20-3f027173e83d-kube-api-access-mzvw4\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.882521 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgbm7\" (UniqueName: \"kubernetes.io/projected/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-kube-api-access-tgbm7\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.883348 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-logs\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.884248 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-logs\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.922047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.923080 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-config-data\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.923685 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.924292 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-config-data\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.928217 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 
08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.947326 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.955348 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.985430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtblt\" (UniqueName: \"kubernetes.io/projected/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-kube-api-access-vtblt\") pod \"nova-metadata-0\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " pod="openstack/nova-metadata-0" Nov 22 08:44:04 crc kubenswrapper[4743]: I1122 08:44:04.985548 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.002263 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzvw4\" (UniqueName: \"kubernetes.io/projected/b4191582-06b4-46bb-be20-3f027173e83d-kube-api-access-mzvw4\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.003629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.014221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.031169 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgbm7\" (UniqueName: \"kubernetes.io/projected/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-kube-api-access-tgbm7\") pod \"nova-api-0\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " pod="openstack/nova-api-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.038931 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.050927 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.054926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzvw4\" (UniqueName: \"kubernetes.io/projected/b4191582-06b4-46bb-be20-3f027173e83d-kube-api-access-mzvw4\") pod \"nova-cell1-novncproxy-0\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.071568 4743 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-dvstw"] Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.090947 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-dvstw"] Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.091063 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.106289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-config\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.106338 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.106370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.106397 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.106426 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6lsm\" (UniqueName: \"kubernetes.io/projected/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-kube-api-access-r6lsm\") pod \"nova-scheduler-0\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.106571 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-svc\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.106766 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-config-data\") pod \"nova-scheduler-0\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.107094 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p85l6\" (UniqueName: \"kubernetes.io/projected/1c8fd004-0cbe-4a32-87cf-d199a7f39716-kube-api-access-p85l6\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: 
\"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.107195 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.120857 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.146508 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.208596 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p85l6\" (UniqueName: \"kubernetes.io/projected/1c8fd004-0cbe-4a32-87cf-d199a7f39716-kube-api-access-p85l6\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.208666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.208716 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-config\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.208752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.208785 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.208815 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.208846 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6lsm\" (UniqueName: \"kubernetes.io/projected/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-kube-api-access-r6lsm\") pod \"nova-scheduler-0\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:05 crc 
kubenswrapper[4743]: I1122 08:44:05.208880 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-svc\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.208909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-config-data\") pod \"nova-scheduler-0\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.210632 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.211447 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.212140 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-config\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.213032 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-svc\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.214781 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-config-data\") pod \"nova-scheduler-0\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.216489 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.221515 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.245004 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p85l6\" (UniqueName: 
\"kubernetes.io/projected/1c8fd004-0cbe-4a32-87cf-d199a7f39716-kube-api-access-p85l6\") pod \"dnsmasq-dns-bccf8f775-dvstw\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") " pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.245711 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6lsm\" (UniqueName: \"kubernetes.io/projected/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-kube-api-access-r6lsm\") pod \"nova-scheduler-0\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.282783 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.372334 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.413504 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.457208 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8hnw7"] Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.538275 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hccj5"] Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.542346 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.547774 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.548462 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.550342 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hccj5"] Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.705852 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8hnw7" event={"ID":"2dfecaa0-8299-4d91-a2eb-11ddb19e029d","Type":"ContainerStarted","Data":"370da40fca8ff59e35b79c480f5472d417be5c4a2e3bef0fe5b83260d30d331b"} Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.717982 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-config-data\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.718069 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.718246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-scripts\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.718364 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9h9p\" (UniqueName: \"kubernetes.io/projected/58260e6d-177b-49c5-beac-c516036341a4-kube-api-access-z9h9p\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.747288 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.792765 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.837279 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-config-data\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.837358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.837391 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-scripts\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.837422 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9h9p\" (UniqueName: \"kubernetes.io/projected/58260e6d-177b-49c5-beac-c516036341a4-kube-api-access-z9h9p\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.842986 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-scripts\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.868272 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9h9p\" (UniqueName: \"kubernetes.io/projected/58260e6d-177b-49c5-beac-c516036341a4-kube-api-access-z9h9p\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.878661 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-config-data\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:05 crc kubenswrapper[4743]: I1122 08:44:05.885182 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hccj5\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.175197 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.208509 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.221651 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-dvstw"] Nov 22 08:44:06 crc kubenswrapper[4743]: W1122 08:44:06.228818 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4191582_06b4_46bb_be20_3f027173e83d.slice/crio-6c483e64e144b6c9d168d04f0e5089100ce7f1f5336ceeef739e2a0678a5de29 WatchSource:0}: Error finding container 6c483e64e144b6c9d168d04f0e5089100ce7f1f5336ceeef739e2a0678a5de29: Status 404 returned error can't find the container with id 6c483e64e144b6c9d168d04f0e5089100ce7f1f5336ceeef739e2a0678a5de29 Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.231973 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.682890 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hccj5"] Nov 22 08:44:06 crc kubenswrapper[4743]: W1122 08:44:06.691129 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58260e6d_177b_49c5_beac_c516036341a4.slice/crio-9db0181351e64b4055ca99c885aa3dab6cd2940004da3b5f5e6ea0e9cc30b414 WatchSource:0}: Error finding container 9db0181351e64b4055ca99c885aa3dab6cd2940004da3b5f5e6ea0e9cc30b414: Status 404 returned error can't find the container with id 9db0181351e64b4055ca99c885aa3dab6cd2940004da3b5f5e6ea0e9cc30b414 Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.722140 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4191582-06b4-46bb-be20-3f027173e83d","Type":"ContainerStarted","Data":"6c483e64e144b6c9d168d04f0e5089100ce7f1f5336ceeef739e2a0678a5de29"} Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.728692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hccj5" event={"ID":"58260e6d-177b-49c5-beac-c516036341a4","Type":"ContainerStarted","Data":"9db0181351e64b4055ca99c885aa3dab6cd2940004da3b5f5e6ea0e9cc30b414"} Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.730465 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9ef09b2-ae5a-4889-ab43-0d63d1536c21","Type":"ContainerStarted","Data":"c13b60022daed7ca40c776da2e92dc5903c4f096980e07d6e5dfa7f6ee77100f"} Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.733454 4743 generic.go:334] "Generic 
(PLEG): container finished" podID="1c8fd004-0cbe-4a32-87cf-d199a7f39716" containerID="d7c8d3c8e57a43c68920b28d1a4096bd93ba2a85d4e96a9e8a3c68b721f19852" exitCode=0 Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.733531 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" event={"ID":"1c8fd004-0cbe-4a32-87cf-d199a7f39716","Type":"ContainerDied","Data":"d7c8d3c8e57a43c68920b28d1a4096bd93ba2a85d4e96a9e8a3c68b721f19852"} Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.733556 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" event={"ID":"1c8fd004-0cbe-4a32-87cf-d199a7f39716","Type":"ContainerStarted","Data":"bea7ef7891652427b440b67220d6bc7805f0bde9438c6f42cfede0a7aab3e18d"} Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.734724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72121226-9fe0-4ce0-ac86-85b9b8efa8d1","Type":"ContainerStarted","Data":"42fe87118a680e23194f9791def6e1cf2c47d6765d22de0415967b175d84b600"} Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.740204 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8hnw7" event={"ID":"2dfecaa0-8299-4d91-a2eb-11ddb19e029d","Type":"ContainerStarted","Data":"e06fe6ab41a32e8d8623a2af9e10dd42dc4c4ab2b15a0a739bcf375a4c618b9c"} Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.743259 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c950bf-5b72-4fb9-8216-3778fe25ccfc","Type":"ContainerStarted","Data":"e0186e487af28e27f27ef1ac21e1722ba00046db6e30c09ca643248fa7b5c813"} Nov 22 08:44:06 crc kubenswrapper[4743]: I1122 08:44:06.796323 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-8hnw7" podStartSLOduration=2.796298314 podStartE2EDuration="2.796298314s" podCreationTimestamp="2025-11-22 08:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:06.772479896 +0000 UTC m=+1320.478840948" watchObservedRunningTime="2025-11-22 08:44:06.796298314 +0000 UTC m=+1320.502659366" Nov 22 08:44:07 crc kubenswrapper[4743]: I1122 08:44:07.759046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hccj5" event={"ID":"58260e6d-177b-49c5-beac-c516036341a4","Type":"ContainerStarted","Data":"7305244bd79cd85c2c92eab84566fc7d97bcd7fde2ff9a55e6572d0e121cf472"} Nov 22 08:44:07 crc kubenswrapper[4743]: I1122 08:44:07.765183 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" event={"ID":"1c8fd004-0cbe-4a32-87cf-d199a7f39716","Type":"ContainerStarted","Data":"a57450da07eb6f8247c0be6c280bf1d2154596fb64d6bf9c5fc0fba236baa1c3"} Nov 22 08:44:07 crc kubenswrapper[4743]: I1122 08:44:07.765286 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:07 crc kubenswrapper[4743]: I1122 08:44:07.784091 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hccj5" podStartSLOduration=2.784071505 podStartE2EDuration="2.784071505s" podCreationTimestamp="2025-11-22 08:44:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:07.779281667 
+0000 UTC m=+1321.485642719" watchObservedRunningTime="2025-11-22 08:44:07.784071505 +0000 UTC m=+1321.490432557" Nov 22 08:44:07 crc kubenswrapper[4743]: I1122 08:44:07.808455 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" podStartSLOduration=3.8084361380000002 podStartE2EDuration="3.808436138s" podCreationTimestamp="2025-11-22 08:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:07.800726656 +0000 UTC m=+1321.507087708" watchObservedRunningTime="2025-11-22 08:44:07.808436138 +0000 UTC m=+1321.514797190" Nov 22 08:44:08 crc kubenswrapper[4743]: I1122 08:44:08.995236 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:09 crc kubenswrapper[4743]: I1122 08:44:09.027715 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.808468 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4191582-06b4-46bb-be20-3f027173e83d","Type":"ContainerStarted","Data":"1e3e8fa38a54921afda25d5c5650246d86727b8836380c7c504af6b09ba4a0eb"} Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.808615 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b4191582-06b4-46bb-be20-3f027173e83d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://1e3e8fa38a54921afda25d5c5650246d86727b8836380c7c504af6b09ba4a0eb" gracePeriod=30 Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.811216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9ef09b2-ae5a-4889-ab43-0d63d1536c21","Type":"ContainerStarted","Data":"25714f0496337b391de1d8a2df60321bbed0b8223ec05bc6098ab9c41802a112"} Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.811265 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9ef09b2-ae5a-4889-ab43-0d63d1536c21","Type":"ContainerStarted","Data":"53636fc04a07b1ef06a3b8a8f46eb01c425e508cf05d43628793582ed3944470"} Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.812510 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72121226-9fe0-4ce0-ac86-85b9b8efa8d1","Type":"ContainerStarted","Data":"706d640758ae0b79ba22d6bc344c0dda6f6db4c83772b0db77ecc54979a29466"} Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.817858 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c950bf-5b72-4fb9-8216-3778fe25ccfc","Type":"ContainerStarted","Data":"492e5207af6a9e1731d67cadd169d49e5cc582d7148b44eb82c6f929c24d0bef"} Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.817919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c950bf-5b72-4fb9-8216-3778fe25ccfc","Type":"ContainerStarted","Data":"abd0ccc6b9e4a1d10ea110f193ff900ff785aa1a3e8d40270a0c0cd69c76d9fc"} Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.817961 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" containerName="nova-metadata-log" containerID="cri-o://abd0ccc6b9e4a1d10ea110f193ff900ff785aa1a3e8d40270a0c0cd69c76d9fc" gracePeriod=30 Nov 22 08:44:10 
crc kubenswrapper[4743]: I1122 08:44:10.818022 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" containerName="nova-metadata-metadata" containerID="cri-o://492e5207af6a9e1731d67cadd169d49e5cc582d7148b44eb82c6f929c24d0bef" gracePeriod=30 Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.826360 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.280567667 podStartE2EDuration="6.826337797s" podCreationTimestamp="2025-11-22 08:44:04 +0000 UTC" firstStartedPulling="2025-11-22 08:44:06.240163326 +0000 UTC m=+1319.946524378" lastFinishedPulling="2025-11-22 08:44:09.785933456 +0000 UTC m=+1323.492294508" observedRunningTime="2025-11-22 08:44:10.826173942 +0000 UTC m=+1324.532535004" watchObservedRunningTime="2025-11-22 08:44:10.826337797 +0000 UTC m=+1324.532698849" Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.844755 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.346525491 podStartE2EDuration="6.844734368s" podCreationTimestamp="2025-11-22 08:44:04 +0000 UTC" firstStartedPulling="2025-11-22 08:44:06.240122755 +0000 UTC m=+1319.946483797" lastFinishedPulling="2025-11-22 08:44:09.738331622 +0000 UTC m=+1323.444692674" observedRunningTime="2025-11-22 08:44:10.841424722 +0000 UTC m=+1324.547785774" watchObservedRunningTime="2025-11-22 08:44:10.844734368 +0000 UTC m=+1324.551095450" Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.877790 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.944028239 podStartE2EDuration="6.877752571s" podCreationTimestamp="2025-11-22 08:44:04 +0000 UTC" firstStartedPulling="2025-11-22 08:44:05.807150323 +0000 UTC m=+1319.513511375" lastFinishedPulling="2025-11-22 08:44:09.740874655 +0000 UTC m=+1323.447235707" observedRunningTime="2025-11-22 08:44:10.861873593 +0000 UTC m=+1324.568234645" watchObservedRunningTime="2025-11-22 08:44:10.877752571 +0000 UTC m=+1324.584113643" Nov 22 08:44:10 crc kubenswrapper[4743]: I1122 08:44:10.896986 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.944236945 podStartE2EDuration="6.896962116s" podCreationTimestamp="2025-11-22 08:44:04 +0000 UTC" firstStartedPulling="2025-11-22 08:44:05.786165847 +0000 UTC m=+1319.492526899" lastFinishedPulling="2025-11-22 08:44:09.738891018 +0000 UTC m=+1323.445252070" observedRunningTime="2025-11-22 08:44:10.888725158 +0000 UTC m=+1324.595086210" watchObservedRunningTime="2025-11-22 08:44:10.896962116 +0000 UTC m=+1324.603323168" Nov 22 08:44:11 crc kubenswrapper[4743]: I1122 08:44:11.828718 4743 generic.go:334] "Generic (PLEG): container finished" podID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" containerID="abd0ccc6b9e4a1d10ea110f193ff900ff785aa1a3e8d40270a0c0cd69c76d9fc" exitCode=143 Nov 22 08:44:11 crc kubenswrapper[4743]: I1122 08:44:11.829120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c950bf-5b72-4fb9-8216-3778fe25ccfc","Type":"ContainerDied","Data":"abd0ccc6b9e4a1d10ea110f193ff900ff785aa1a3e8d40270a0c0cd69c76d9fc"} Nov 22 08:44:12 crc kubenswrapper[4743]: I1122 08:44:12.862646 4743 generic.go:334] "Generic (PLEG): container finished" podID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" 
containerID="492e5207af6a9e1731d67cadd169d49e5cc582d7148b44eb82c6f929c24d0bef" exitCode=0 Nov 22 08:44:12 crc kubenswrapper[4743]: I1122 08:44:12.862956 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c950bf-5b72-4fb9-8216-3778fe25ccfc","Type":"ContainerDied","Data":"492e5207af6a9e1731d67cadd169d49e5cc582d7148b44eb82c6f929c24d0bef"} Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.357053 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.515314 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-logs\") pod \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.515542 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-config-data\") pod \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.515784 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtblt\" (UniqueName: \"kubernetes.io/projected/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-kube-api-access-vtblt\") pod \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.515855 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-combined-ca-bundle\") pod \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\" (UID: \"b6c950bf-5b72-4fb9-8216-3778fe25ccfc\") " Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.516553 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-logs" (OuterVolumeSpecName: "logs") pod "b6c950bf-5b72-4fb9-8216-3778fe25ccfc" (UID: "b6c950bf-5b72-4fb9-8216-3778fe25ccfc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.516915 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.521808 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-kube-api-access-vtblt" (OuterVolumeSpecName: "kube-api-access-vtblt") pod "b6c950bf-5b72-4fb9-8216-3778fe25ccfc" (UID: "b6c950bf-5b72-4fb9-8216-3778fe25ccfc"). InnerVolumeSpecName "kube-api-access-vtblt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.548251 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-config-data" (OuterVolumeSpecName: "config-data") pod "b6c950bf-5b72-4fb9-8216-3778fe25ccfc" (UID: "b6c950bf-5b72-4fb9-8216-3778fe25ccfc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.551763 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6c950bf-5b72-4fb9-8216-3778fe25ccfc" (UID: "b6c950bf-5b72-4fb9-8216-3778fe25ccfc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.618800 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtblt\" (UniqueName: \"kubernetes.io/projected/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-kube-api-access-vtblt\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.618839 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.618851 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6c950bf-5b72-4fb9-8216-3778fe25ccfc-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.886894 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6c950bf-5b72-4fb9-8216-3778fe25ccfc","Type":"ContainerDied","Data":"e0186e487af28e27f27ef1ac21e1722ba00046db6e30c09ca643248fa7b5c813"} Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.887150 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.887232 4743 scope.go:117] "RemoveContainer" containerID="492e5207af6a9e1731d67cadd169d49e5cc582d7148b44eb82c6f929c24d0bef" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.909344 4743 scope.go:117] "RemoveContainer" containerID="abd0ccc6b9e4a1d10ea110f193ff900ff785aa1a3e8d40270a0c0cd69c76d9fc" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.924540 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.936037 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.945741 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:14 crc kubenswrapper[4743]: E1122 08:44:14.946141 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" containerName="nova-metadata-metadata" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.946158 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" containerName="nova-metadata-metadata" Nov 22 08:44:14 crc kubenswrapper[4743]: E1122 08:44:14.946182 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" containerName="nova-metadata-log" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.946188 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" containerName="nova-metadata-log" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.946339 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" containerName="nova-metadata-metadata" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.946368 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" containerName="nova-metadata-log" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.948153 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.960082 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.960350 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 08:44:14 crc kubenswrapper[4743]: I1122 08:44:14.968839 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.122113 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.122179 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.132341 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-config-data\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.132511 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.132700 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb6lz\" (UniqueName: \"kubernetes.io/projected/3f9969c6-194e-44e7-b178-0a02830a7299-kube-api-access-hb6lz\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.132760 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9969c6-194e-44e7-b178-0a02830a7299-logs\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.132984 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.165685 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c950bf-5b72-4fb9-8216-3778fe25ccfc" path="/var/lib/kubelet/pods/b6c950bf-5b72-4fb9-8216-3778fe25ccfc/volumes" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.235050 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.235125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-config-data\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.235178 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.235221 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb6lz\" (UniqueName: \"kubernetes.io/projected/3f9969c6-194e-44e7-b178-0a02830a7299-kube-api-access-hb6lz\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.235241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9969c6-194e-44e7-b178-0a02830a7299-logs\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.235703 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9969c6-194e-44e7-b178-0a02830a7299-logs\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.240043 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-config-data\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.240727 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.244799 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.253535 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb6lz\" (UniqueName: \"kubernetes.io/projected/3f9969c6-194e-44e7-b178-0a02830a7299-kube-api-access-hb6lz\") pod \"nova-metadata-0\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 
08:44:15.279664 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.283620 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.373319 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.373511 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.410515 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.414893 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.487686 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w9dgz"] Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.487922 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" podUID="8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" containerName="dnsmasq-dns" containerID="cri-o://e9e305800baf94abf462f598104fa32c5ed7dcf8670598fe185ffc0786bbcc6a" gracePeriod=10 Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.796065 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:15 crc kubenswrapper[4743]: W1122 08:44:15.806216 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f9969c6_194e_44e7_b178_0a02830a7299.slice/crio-446cd75e49b36957ec049d9461fa254749b7dd49c7a3065c42b6892ea16826be WatchSource:0}: Error finding container 446cd75e49b36957ec049d9461fa254749b7dd49c7a3065c42b6892ea16826be: Status 404 returned error can't find the container with id 446cd75e49b36957ec049d9461fa254749b7dd49c7a3065c42b6892ea16826be Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.908547 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9969c6-194e-44e7-b178-0a02830a7299","Type":"ContainerStarted","Data":"446cd75e49b36957ec049d9461fa254749b7dd49c7a3065c42b6892ea16826be"} Nov 22 08:44:15 crc kubenswrapper[4743]: I1122 08:44:15.941421 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 08:44:16 crc kubenswrapper[4743]: I1122 08:44:16.203781 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 08:44:16 crc kubenswrapper[4743]: I1122 08:44:16.203782 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.181:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 08:44:16 crc kubenswrapper[4743]: I1122 08:44:16.919563 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" containerID="e9e305800baf94abf462f598104fa32c5ed7dcf8670598fe185ffc0786bbcc6a" exitCode=0 Nov 22 08:44:16 crc kubenswrapper[4743]: I1122 08:44:16.919612 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" event={"ID":"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe","Type":"ContainerDied","Data":"e9e305800baf94abf462f598104fa32c5ed7dcf8670598fe185ffc0786bbcc6a"} Nov 22 08:44:18 crc kubenswrapper[4743]: I1122 08:44:18.130837 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" podUID="8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: connect: connection refused" Nov 22 08:44:19 crc kubenswrapper[4743]: I1122 08:44:19.956351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9969c6-194e-44e7-b178-0a02830a7299","Type":"ContainerStarted","Data":"51b56e86e68f034956857d02e70ab7b8c255f8da35225eb79d7c716d381caf60"} Nov 22 08:44:21 crc kubenswrapper[4743]: I1122 08:44:21.907426 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.005864 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" event={"ID":"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe","Type":"ContainerDied","Data":"31c6b04926ef4af3fc66bf22ce89d3b4aac4b2a07bc7856bc783fa19ba616a81"} Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.005895 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-w9dgz" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.005935 4743 scope.go:117] "RemoveContainer" containerID="e9e305800baf94abf462f598104fa32c5ed7dcf8670598fe185ffc0786bbcc6a" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.008882 4743 generic.go:334] "Generic (PLEG): container finished" podID="2dfecaa0-8299-4d91-a2eb-11ddb19e029d" containerID="e06fe6ab41a32e8d8623a2af9e10dd42dc4c4ab2b15a0a739bcf375a4c618b9c" exitCode=0 Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.008945 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8hnw7" event={"ID":"2dfecaa0-8299-4d91-a2eb-11ddb19e029d","Type":"ContainerDied","Data":"e06fe6ab41a32e8d8623a2af9e10dd42dc4c4ab2b15a0a739bcf375a4c618b9c"} Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.084958 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-swift-storage-0\") pod \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.085077 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tdfd\" (UniqueName: \"kubernetes.io/projected/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-kube-api-access-5tdfd\") pod \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.085160 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-svc\") pod \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\" (UID: 
\"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.085258 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-nb\") pod \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.085293 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-config\") pod \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.085319 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-sb\") pod \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\" (UID: \"8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe\") " Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.091781 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-kube-api-access-5tdfd" (OuterVolumeSpecName: "kube-api-access-5tdfd") pod "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" (UID: "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe"). InnerVolumeSpecName "kube-api-access-5tdfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.130773 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-config" (OuterVolumeSpecName: "config") pod "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" (UID: "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.134418 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" (UID: "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.138867 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" (UID: "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.150929 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" (UID: "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.180328 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" (UID: "8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.187245 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.187288 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.187301 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.187313 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.187326 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.187339 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tdfd\" (UniqueName: \"kubernetes.io/projected/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe-kube-api-access-5tdfd\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.339907 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w9dgz"] Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.347147 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-w9dgz"] Nov 22 08:44:22 crc kubenswrapper[4743]: I1122 08:44:22.626771 4743 scope.go:117] "RemoveContainer" containerID="a20fdbb0f842e51ec330438640c16b55c9539f6c5fb956a8d31a2287c2f59d62" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.166518 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" path="/var/lib/kubelet/pods/8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe/volumes" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.225480 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.517537 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.718813 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-scripts\") pod \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.718911 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87jpn\" (UniqueName: \"kubernetes.io/projected/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-kube-api-access-87jpn\") pod \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.719119 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-config-data\") pod \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.719169 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-combined-ca-bundle\") pod \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\" (UID: \"2dfecaa0-8299-4d91-a2eb-11ddb19e029d\") " Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.727071 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-kube-api-access-87jpn" (OuterVolumeSpecName: "kube-api-access-87jpn") pod "2dfecaa0-8299-4d91-a2eb-11ddb19e029d" (UID: "2dfecaa0-8299-4d91-a2eb-11ddb19e029d"). InnerVolumeSpecName "kube-api-access-87jpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.727721 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-scripts" (OuterVolumeSpecName: "scripts") pod "2dfecaa0-8299-4d91-a2eb-11ddb19e029d" (UID: "2dfecaa0-8299-4d91-a2eb-11ddb19e029d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.748540 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dfecaa0-8299-4d91-a2eb-11ddb19e029d" (UID: "2dfecaa0-8299-4d91-a2eb-11ddb19e029d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.759296 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-config-data" (OuterVolumeSpecName: "config-data") pod "2dfecaa0-8299-4d91-a2eb-11ddb19e029d" (UID: "2dfecaa0-8299-4d91-a2eb-11ddb19e029d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.822086 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.822121 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87jpn\" (UniqueName: \"kubernetes.io/projected/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-kube-api-access-87jpn\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.822139 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:23 crc kubenswrapper[4743]: I1122 08:44:23.822152 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dfecaa0-8299-4d91-a2eb-11ddb19e029d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:24 crc kubenswrapper[4743]: I1122 08:44:24.034914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8hnw7" event={"ID":"2dfecaa0-8299-4d91-a2eb-11ddb19e029d","Type":"ContainerDied","Data":"370da40fca8ff59e35b79c480f5472d417be5c4a2e3bef0fe5b83260d30d331b"} Nov 22 08:44:24 crc kubenswrapper[4743]: I1122 08:44:24.034959 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="370da40fca8ff59e35b79c480f5472d417be5c4a2e3bef0fe5b83260d30d331b" Nov 22 08:44:24 crc kubenswrapper[4743]: I1122 08:44:24.035019 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8hnw7" Nov 22 08:44:24 crc kubenswrapper[4743]: I1122 08:44:24.234486 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:44:24 crc kubenswrapper[4743]: I1122 08:44:24.234771 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="72121226-9fe0-4ce0-ac86-85b9b8efa8d1" containerName="nova-scheduler-scheduler" containerID="cri-o://706d640758ae0b79ba22d6bc344c0dda6f6db4c83772b0db77ecc54979a29466" gracePeriod=30 Nov 22 08:44:24 crc kubenswrapper[4743]: I1122 08:44:24.249536 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:44:24 crc kubenswrapper[4743]: I1122 08:44:24.249818 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerName="nova-api-log" containerID="cri-o://53636fc04a07b1ef06a3b8a8f46eb01c425e508cf05d43628793582ed3944470" gracePeriod=30 Nov 22 08:44:24 crc kubenswrapper[4743]: I1122 08:44:24.249978 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerName="nova-api-api" containerID="cri-o://25714f0496337b391de1d8a2df60321bbed0b8223ec05bc6098ab9c41802a112" gracePeriod=30 Nov 22 08:44:24 crc kubenswrapper[4743]: I1122 08:44:24.282963 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:25 crc kubenswrapper[4743]: E1122 08:44:25.376731 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="706d640758ae0b79ba22d6bc344c0dda6f6db4c83772b0db77ecc54979a29466" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 08:44:25 crc kubenswrapper[4743]: E1122 08:44:25.381597 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="706d640758ae0b79ba22d6bc344c0dda6f6db4c83772b0db77ecc54979a29466" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 08:44:25 crc kubenswrapper[4743]: E1122 08:44:25.383237 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="706d640758ae0b79ba22d6bc344c0dda6f6db4c83772b0db77ecc54979a29466" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 08:44:25 crc kubenswrapper[4743]: E1122 08:44:25.383316 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="72121226-9fe0-4ce0-ac86-85b9b8efa8d1" containerName="nova-scheduler-scheduler" Nov 22 08:44:26 crc kubenswrapper[4743]: I1122 08:44:26.055130 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9969c6-194e-44e7-b178-0a02830a7299","Type":"ContainerStarted","Data":"f048b11cd4338e8b749a00410627b046a8527d30b480f82a5f7d5ae76a4d63f6"} Nov 22 08:44:26 crc kubenswrapper[4743]: I1122 08:44:26.057470 4743 generic.go:334] "Generic (PLEG): container finished" podID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerID="53636fc04a07b1ef06a3b8a8f46eb01c425e508cf05d43628793582ed3944470" exitCode=143 Nov 22 08:44:26 crc kubenswrapper[4743]: I1122 08:44:26.057507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9ef09b2-ae5a-4889-ab43-0d63d1536c21","Type":"ContainerDied","Data":"53636fc04a07b1ef06a3b8a8f46eb01c425e508cf05d43628793582ed3944470"} Nov 22 08:44:27 crc kubenswrapper[4743]: I1122 08:44:27.070655 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f9969c6-194e-44e7-b178-0a02830a7299" containerName="nova-metadata-log" containerID="cri-o://51b56e86e68f034956857d02e70ab7b8c255f8da35225eb79d7c716d381caf60" gracePeriod=30 Nov 22 08:44:27 crc kubenswrapper[4743]: I1122 08:44:27.070691 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3f9969c6-194e-44e7-b178-0a02830a7299" containerName="nova-metadata-metadata" containerID="cri-o://f048b11cd4338e8b749a00410627b046a8527d30b480f82a5f7d5ae76a4d63f6" gracePeriod=30 Nov 22 08:44:27 crc kubenswrapper[4743]: I1122 08:44:27.094525 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=13.094502311 podStartE2EDuration="13.094502311s" podCreationTimestamp="2025-11-22 08:44:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:27.087655104 +0000 UTC m=+1340.794016176" watchObservedRunningTime="2025-11-22 08:44:27.094502311 +0000 UTC m=+1340.800863363" Nov 22 08:44:27 crc kubenswrapper[4743]: I1122 08:44:27.163880 4743 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 08:44:27 crc kubenswrapper[4743]: I1122 08:44:27.164096 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1bc16799-92e0-45f0-a46d-770ef95eefa6" containerName="kube-state-metrics" containerID="cri-o://fde0e2caee1ca419e2f4d1b5ee9c79dd7c36dbd3603ce5038a379f25b387d1f8" gracePeriod=30 Nov 22 08:44:28 crc kubenswrapper[4743]: I1122 08:44:28.080081 4743 generic.go:334] "Generic (PLEG): container finished" podID="1bc16799-92e0-45f0-a46d-770ef95eefa6" containerID="fde0e2caee1ca419e2f4d1b5ee9c79dd7c36dbd3603ce5038a379f25b387d1f8" exitCode=2 Nov 22 08:44:28 crc kubenswrapper[4743]: I1122 08:44:28.080171 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1bc16799-92e0-45f0-a46d-770ef95eefa6","Type":"ContainerDied","Data":"fde0e2caee1ca419e2f4d1b5ee9c79dd7c36dbd3603ce5038a379f25b387d1f8"} Nov 22 08:44:28 crc kubenswrapper[4743]: I1122 08:44:28.081857 4743 generic.go:334] "Generic (PLEG): container finished" podID="3f9969c6-194e-44e7-b178-0a02830a7299" containerID="51b56e86e68f034956857d02e70ab7b8c255f8da35225eb79d7c716d381caf60" exitCode=143 Nov 22 08:44:28 crc kubenswrapper[4743]: I1122 08:44:28.081921 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9969c6-194e-44e7-b178-0a02830a7299","Type":"ContainerDied","Data":"51b56e86e68f034956857d02e70ab7b8c255f8da35225eb79d7c716d381caf60"} Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.093998 4743 generic.go:334] "Generic (PLEG): container finished" podID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerID="25714f0496337b391de1d8a2df60321bbed0b8223ec05bc6098ab9c41802a112" exitCode=0 Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.094330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9ef09b2-ae5a-4889-ab43-0d63d1536c21","Type":"ContainerDied","Data":"25714f0496337b391de1d8a2df60321bbed0b8223ec05bc6098ab9c41802a112"} Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.095924 4743 generic.go:334] "Generic (PLEG): container finished" podID="3f9969c6-194e-44e7-b178-0a02830a7299" containerID="f048b11cd4338e8b749a00410627b046a8527d30b480f82a5f7d5ae76a4d63f6" exitCode=0 Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.095967 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9969c6-194e-44e7-b178-0a02830a7299","Type":"ContainerDied","Data":"f048b11cd4338e8b749a00410627b046a8527d30b480f82a5f7d5ae76a4d63f6"} Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.097168 4743 generic.go:334] "Generic (PLEG): container finished" podID="72121226-9fe0-4ce0-ac86-85b9b8efa8d1" containerID="706d640758ae0b79ba22d6bc344c0dda6f6db4c83772b0db77ecc54979a29466" exitCode=0 Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.097201 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72121226-9fe0-4ce0-ac86-85b9b8efa8d1","Type":"ContainerDied","Data":"706d640758ae0b79ba22d6bc344c0dda6f6db4c83772b0db77ecc54979a29466"} Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.280878 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.429351 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75v6v\" (UniqueName: \"kubernetes.io/projected/1bc16799-92e0-45f0-a46d-770ef95eefa6-kube-api-access-75v6v\") pod \"1bc16799-92e0-45f0-a46d-770ef95eefa6\" (UID: \"1bc16799-92e0-45f0-a46d-770ef95eefa6\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.437117 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bc16799-92e0-45f0-a46d-770ef95eefa6-kube-api-access-75v6v" (OuterVolumeSpecName: "kube-api-access-75v6v") pod "1bc16799-92e0-45f0-a46d-770ef95eefa6" (UID: "1bc16799-92e0-45f0-a46d-770ef95eefa6"). InnerVolumeSpecName "kube-api-access-75v6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.524062 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.531313 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75v6v\" (UniqueName: \"kubernetes.io/projected/1bc16799-92e0-45f0-a46d-770ef95eefa6-kube-api-access-75v6v\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.632126 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-combined-ca-bundle\") pod \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.632231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-logs\") pod \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.632379 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-config-data\") pod \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.632455 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgbm7\" (UniqueName: \"kubernetes.io/projected/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-kube-api-access-tgbm7\") pod \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\" (UID: \"e9ef09b2-ae5a-4889-ab43-0d63d1536c21\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.633309 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-logs" (OuterVolumeSpecName: "logs") pod "e9ef09b2-ae5a-4889-ab43-0d63d1536c21" (UID: "e9ef09b2-ae5a-4889-ab43-0d63d1536c21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.638706 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-kube-api-access-tgbm7" (OuterVolumeSpecName: "kube-api-access-tgbm7") pod "e9ef09b2-ae5a-4889-ab43-0d63d1536c21" (UID: "e9ef09b2-ae5a-4889-ab43-0d63d1536c21"). 
InnerVolumeSpecName "kube-api-access-tgbm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.678158 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.679545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-config-data" (OuterVolumeSpecName: "config-data") pod "e9ef09b2-ae5a-4889-ab43-0d63d1536c21" (UID: "e9ef09b2-ae5a-4889-ab43-0d63d1536c21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.680789 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9ef09b2-ae5a-4889-ab43-0d63d1536c21" (UID: "e9ef09b2-ae5a-4889-ab43-0d63d1536c21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.689015 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.738634 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.738685 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.738699 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.738711 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgbm7\" (UniqueName: \"kubernetes.io/projected/e9ef09b2-ae5a-4889-ab43-0d63d1536c21-kube-api-access-tgbm7\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.840429 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-combined-ca-bundle\") pod \"3f9969c6-194e-44e7-b178-0a02830a7299\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.840513 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-nova-metadata-tls-certs\") pod \"3f9969c6-194e-44e7-b178-0a02830a7299\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.840571 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9969c6-194e-44e7-b178-0a02830a7299-logs\") pod \"3f9969c6-194e-44e7-b178-0a02830a7299\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.840725 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-combined-ca-bundle\") pod \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.840769 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb6lz\" (UniqueName: \"kubernetes.io/projected/3f9969c6-194e-44e7-b178-0a02830a7299-kube-api-access-hb6lz\") pod \"3f9969c6-194e-44e7-b178-0a02830a7299\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.840811 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6lsm\" (UniqueName: \"kubernetes.io/projected/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-kube-api-access-r6lsm\") pod \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.840842 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-config-data\") pod \"3f9969c6-194e-44e7-b178-0a02830a7299\" (UID: \"3f9969c6-194e-44e7-b178-0a02830a7299\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.840864 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-config-data\") pod \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\" (UID: \"72121226-9fe0-4ce0-ac86-85b9b8efa8d1\") " Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.841895 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f9969c6-194e-44e7-b178-0a02830a7299-logs" (OuterVolumeSpecName: "logs") pod "3f9969c6-194e-44e7-b178-0a02830a7299" (UID: "3f9969c6-194e-44e7-b178-0a02830a7299"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.845159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-kube-api-access-r6lsm" (OuterVolumeSpecName: "kube-api-access-r6lsm") pod "72121226-9fe0-4ce0-ac86-85b9b8efa8d1" (UID: "72121226-9fe0-4ce0-ac86-85b9b8efa8d1"). InnerVolumeSpecName "kube-api-access-r6lsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.845714 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9969c6-194e-44e7-b178-0a02830a7299-kube-api-access-hb6lz" (OuterVolumeSpecName: "kube-api-access-hb6lz") pod "3f9969c6-194e-44e7-b178-0a02830a7299" (UID: "3f9969c6-194e-44e7-b178-0a02830a7299"). InnerVolumeSpecName "kube-api-access-hb6lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.867398 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72121226-9fe0-4ce0-ac86-85b9b8efa8d1" (UID: "72121226-9fe0-4ce0-ac86-85b9b8efa8d1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.868072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f9969c6-194e-44e7-b178-0a02830a7299" (UID: "3f9969c6-194e-44e7-b178-0a02830a7299"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.870368 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-config-data" (OuterVolumeSpecName: "config-data") pod "3f9969c6-194e-44e7-b178-0a02830a7299" (UID: "3f9969c6-194e-44e7-b178-0a02830a7299"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.872513 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-config-data" (OuterVolumeSpecName: "config-data") pod "72121226-9fe0-4ce0-ac86-85b9b8efa8d1" (UID: "72121226-9fe0-4ce0-ac86-85b9b8efa8d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.887155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3f9969c6-194e-44e7-b178-0a02830a7299" (UID: "3f9969c6-194e-44e7-b178-0a02830a7299"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.943921 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.943969 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.943986 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f9969c6-194e-44e7-b178-0a02830a7299-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.943999 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.944010 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb6lz\" (UniqueName: \"kubernetes.io/projected/3f9969c6-194e-44e7-b178-0a02830a7299-kube-api-access-hb6lz\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.944022 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6lsm\" (UniqueName: \"kubernetes.io/projected/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-kube-api-access-r6lsm\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.944033 4743 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72121226-9fe0-4ce0-ac86-85b9b8efa8d1-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:29 crc kubenswrapper[4743]: I1122 08:44:29.944044 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f9969c6-194e-44e7-b178-0a02830a7299-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.107856 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3f9969c6-194e-44e7-b178-0a02830a7299","Type":"ContainerDied","Data":"446cd75e49b36957ec049d9461fa254749b7dd49c7a3065c42b6892ea16826be"} Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.107875 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.107937 4743 scope.go:117] "RemoveContainer" containerID="f048b11cd4338e8b749a00410627b046a8527d30b480f82a5f7d5ae76a4d63f6" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.110650 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72121226-9fe0-4ce0-ac86-85b9b8efa8d1","Type":"ContainerDied","Data":"42fe87118a680e23194f9791def6e1cf2c47d6765d22de0415967b175d84b600"} Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.110665 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.112749 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.113053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1bc16799-92e0-45f0-a46d-770ef95eefa6","Type":"ContainerDied","Data":"94d25dcea6f7143c72e0e8b063dead3d2fb7b47423c5b2687150391f6f7f6f97"} Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.115233 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e9ef09b2-ae5a-4889-ab43-0d63d1536c21","Type":"ContainerDied","Data":"c13b60022daed7ca40c776da2e92dc5903c4f096980e07d6e5dfa7f6ee77100f"} Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.115294 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.146907 4743 scope.go:117] "RemoveContainer" containerID="51b56e86e68f034956857d02e70ab7b8c255f8da35225eb79d7c716d381caf60" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.149687 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.164310 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.208075 4743 scope.go:117] "RemoveContainer" containerID="706d640758ae0b79ba22d6bc344c0dda6f6db4c83772b0db77ecc54979a29466" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.263699 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.288729 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.302709 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.309209 4743 scope.go:117] "RemoveContainer" containerID="fde0e2caee1ca419e2f4d1b5ee9c79dd7c36dbd3603ce5038a379f25b387d1f8" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.312717 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: E1122 08:44:30.313145 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72121226-9fe0-4ce0-ac86-85b9b8efa8d1" containerName="nova-scheduler-scheduler" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313172 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="72121226-9fe0-4ce0-ac86-85b9b8efa8d1" containerName="nova-scheduler-scheduler" Nov 22 08:44:30 crc kubenswrapper[4743]: E1122 08:44:30.313191 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" containerName="dnsmasq-dns" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313200 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" containerName="dnsmasq-dns" Nov 22 08:44:30 crc kubenswrapper[4743]: E1122 08:44:30.313217 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerName="nova-api-log" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313225 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerName="nova-api-log" Nov 22 08:44:30 crc kubenswrapper[4743]: E1122 08:44:30.313244 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9969c6-194e-44e7-b178-0a02830a7299" containerName="nova-metadata-log" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313252 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9969c6-194e-44e7-b178-0a02830a7299" containerName="nova-metadata-log" Nov 22 08:44:30 crc kubenswrapper[4743]: E1122 08:44:30.313274 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerName="nova-api-api" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313282 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerName="nova-api-api" Nov 22 08:44:30 crc kubenswrapper[4743]: E1122 08:44:30.313297 4743 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bc16799-92e0-45f0-a46d-770ef95eefa6" containerName="kube-state-metrics" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313305 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc16799-92e0-45f0-a46d-770ef95eefa6" containerName="kube-state-metrics" Nov 22 08:44:30 crc kubenswrapper[4743]: E1122 08:44:30.313315 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" containerName="init" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313323 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" containerName="init" Nov 22 08:44:30 crc kubenswrapper[4743]: E1122 08:44:30.313337 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dfecaa0-8299-4d91-a2eb-11ddb19e029d" containerName="nova-manage" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313345 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dfecaa0-8299-4d91-a2eb-11ddb19e029d" containerName="nova-manage" Nov 22 08:44:30 crc kubenswrapper[4743]: E1122 08:44:30.313358 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9969c6-194e-44e7-b178-0a02830a7299" containerName="nova-metadata-metadata" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313366 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9969c6-194e-44e7-b178-0a02830a7299" containerName="nova-metadata-metadata" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313677 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bc16799-92e0-45f0-a46d-770ef95eefa6" containerName="kube-state-metrics" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313698 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9969c6-194e-44e7-b178-0a02830a7299" containerName="nova-metadata-log" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313710 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="72121226-9fe0-4ce0-ac86-85b9b8efa8d1" containerName="nova-scheduler-scheduler" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313727 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9969c6-194e-44e7-b178-0a02830a7299" containerName="nova-metadata-metadata" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313738 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1642c8-4b5d-4c1d-ade3-a958a0a52dbe" containerName="dnsmasq-dns" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313746 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerName="nova-api-api" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313771 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" containerName="nova-api-log" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.313785 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dfecaa0-8299-4d91-a2eb-11ddb19e029d" containerName="nova-manage" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.315136 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.317477 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.317800 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.323647 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.339090 4743 scope.go:117] "RemoveContainer" containerID="25714f0496337b391de1d8a2df60321bbed0b8223ec05bc6098ab9c41802a112" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.343921 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.357348 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.359237 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.361683 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.363828 4743 scope.go:117] "RemoveContainer" containerID="53636fc04a07b1ef06a3b8a8f46eb01c425e508cf05d43628793582ed3944470" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.370777 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.372299 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.379287 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.384352 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.400525 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.411966 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.423628 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.433130 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.434448 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.437863 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.437921 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.443718 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.456707 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtqp8\" (UniqueName: \"kubernetes.io/projected/6837b2a8-dfb5-4277-87f4-483200d1ae93-kube-api-access-mtqp8\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.456909 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-config-data\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.457018 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.457066 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6837b2a8-dfb5-4277-87f4-483200d1ae93-logs\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.457243 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.529529 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.530046 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="sg-core" containerID="cri-o://bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe" gracePeriod=30 Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.530181 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="ceilometer-notification-agent" containerID="cri-o://b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462" gracePeriod=30 Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.530067 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" 
containerName="proxy-httpd" containerID="cri-o://f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4" gracePeriod=30 Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.529906 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="ceilometer-central-agent" containerID="cri-o://fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e" gracePeriod=30 Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559242 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtqp8\" (UniqueName: \"kubernetes.io/projected/6837b2a8-dfb5-4277-87f4-483200d1ae93-kube-api-access-mtqp8\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rpj7\" (UniqueName: \"kubernetes.io/projected/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-api-access-5rpj7\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559352 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9da353e0-8b25-44b3-8c96-6cc4355615ec-logs\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559377 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559404 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-config-data\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559626 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-config-data\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559656 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6h84f\" (UniqueName: \"kubernetes.io/projected/e7ef7cdc-68f1-4031-a5ef-66a910c50764-kube-api-access-6h84f\") pod \"nova-scheduler-0\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") " pod="openstack/nova-scheduler-0" Nov 22 
08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559689 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559705 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6837b2a8-dfb5-4277-87f4-483200d1ae93-logs\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-config-data\") pod \"nova-scheduler-0\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559742 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559761 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfkxb\" (UniqueName: \"kubernetes.io/projected/9da353e0-8b25-44b3-8c96-6cc4355615ec-kube-api-access-tfkxb\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559792 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559816 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.559840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.560596 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6837b2a8-dfb5-4277-87f4-483200d1ae93-logs\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.566340 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.566375 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.566610 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-config-data\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.578389 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtqp8\" (UniqueName: \"kubernetes.io/projected/6837b2a8-dfb5-4277-87f4-483200d1ae93-kube-api-access-mtqp8\") pod \"nova-metadata-0\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.643561 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.661069 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.661131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rpj7\" (UniqueName: \"kubernetes.io/projected/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-api-access-5rpj7\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.662146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9da353e0-8b25-44b3-8c96-6cc4355615ec-logs\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.662205 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.662303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-config-data\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.662357 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6h84f\" (UniqueName: \"kubernetes.io/projected/e7ef7cdc-68f1-4031-a5ef-66a910c50764-kube-api-access-6h84f\") pod 
\"nova-scheduler-0\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.662420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-config-data\") pod \"nova-scheduler-0\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.662453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.662487 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfkxb\" (UniqueName: \"kubernetes.io/projected/9da353e0-8b25-44b3-8c96-6cc4355615ec-kube-api-access-tfkxb\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.662552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.662639 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.663310 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9da353e0-8b25-44b3-8c96-6cc4355615ec-logs\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.665985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.667250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-config-data\") pod \"nova-scheduler-0\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.667359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.668125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.668542 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.674769 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-config-data\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.677830 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.678415 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6h84f\" (UniqueName: \"kubernetes.io/projected/e7ef7cdc-68f1-4031-a5ef-66a910c50764-kube-api-access-6h84f\") pod \"nova-scheduler-0\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") " pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.681882 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rpj7\" (UniqueName: \"kubernetes.io/projected/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-api-access-5rpj7\") pod \"kube-state-metrics-0\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " pod="openstack/kube-state-metrics-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.682014 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfkxb\" (UniqueName: \"kubernetes.io/projected/9da353e0-8b25-44b3-8c96-6cc4355615ec-kube-api-access-tfkxb\") pod \"nova-api-0\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.693094 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.704397 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 08:44:30 crc kubenswrapper[4743]: I1122 08:44:30.759108 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.133411 4743 generic.go:334] "Generic (PLEG): container finished" podID="b7610922-7bb6-4198-bc73-50dc7a220848" containerID="f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4" exitCode=0 Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.133468 4743 generic.go:334] "Generic (PLEG): container finished" podID="b7610922-7bb6-4198-bc73-50dc7a220848" containerID="bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe" exitCode=2 Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.133478 4743 generic.go:334] "Generic (PLEG): container finished" podID="b7610922-7bb6-4198-bc73-50dc7a220848" containerID="fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e" exitCode=0 Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.133516 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7610922-7bb6-4198-bc73-50dc7a220848","Type":"ContainerDied","Data":"f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4"} Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.133559 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7610922-7bb6-4198-bc73-50dc7a220848","Type":"ContainerDied","Data":"bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe"} Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.133593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7610922-7bb6-4198-bc73-50dc7a220848","Type":"ContainerDied","Data":"fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e"} Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.165155 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc16799-92e0-45f0-a46d-770ef95eefa6" path="/var/lib/kubelet/pods/1bc16799-92e0-45f0-a46d-770ef95eefa6/volumes" Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.166224 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9969c6-194e-44e7-b178-0a02830a7299" path="/var/lib/kubelet/pods/3f9969c6-194e-44e7-b178-0a02830a7299/volumes" Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.166966 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72121226-9fe0-4ce0-ac86-85b9b8efa8d1" path="/var/lib/kubelet/pods/72121226-9fe0-4ce0-ac86-85b9b8efa8d1/volumes" Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.168218 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ef09b2-ae5a-4889-ab43-0d63d1536c21" path="/var/lib/kubelet/pods/e9ef09b2-ae5a-4889-ab43-0d63d1536c21/volumes" Nov 22 08:44:31 crc kubenswrapper[4743]: W1122 08:44:31.206562 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6837b2a8_dfb5_4277_87f4_483200d1ae93.slice/crio-e00470aee988549c68eb3a6103d0e38d26e77a855268221a4925e5293b8785ae WatchSource:0}: Error finding container e00470aee988549c68eb3a6103d0e38d26e77a855268221a4925e5293b8785ae: Status 404 returned error can't find the container with id e00470aee988549c68eb3a6103d0e38d26e77a855268221a4925e5293b8785ae Nov 22 08:44:31 crc kubenswrapper[4743]: I1122 08:44:31.211733 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:44:32 crc kubenswrapper[4743]: I1122 08:44:32.137162 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-scheduler-0"] Nov 22 08:44:32 crc kubenswrapper[4743]: I1122 08:44:32.155863 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6837b2a8-dfb5-4277-87f4-483200d1ae93","Type":"ContainerStarted","Data":"e00470aee988549c68eb3a6103d0e38d26e77a855268221a4925e5293b8785ae"} Nov 22 08:44:32 crc kubenswrapper[4743]: I1122 08:44:32.159262 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:44:32 crc kubenswrapper[4743]: I1122 08:44:32.169114 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 08:44:32 crc kubenswrapper[4743]: I1122 08:44:32.938965 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.107394 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-scripts\") pod \"b7610922-7bb6-4198-bc73-50dc7a220848\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.107713 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-log-httpd\") pod \"b7610922-7bb6-4198-bc73-50dc7a220848\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.107788 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fhqp\" (UniqueName: \"kubernetes.io/projected/b7610922-7bb6-4198-bc73-50dc7a220848-kube-api-access-6fhqp\") pod \"b7610922-7bb6-4198-bc73-50dc7a220848\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.107823 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-run-httpd\") pod \"b7610922-7bb6-4198-bc73-50dc7a220848\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.107869 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-combined-ca-bundle\") pod \"b7610922-7bb6-4198-bc73-50dc7a220848\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.107982 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-sg-core-conf-yaml\") pod \"b7610922-7bb6-4198-bc73-50dc7a220848\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.108001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-config-data\") pod \"b7610922-7bb6-4198-bc73-50dc7a220848\" (UID: \"b7610922-7bb6-4198-bc73-50dc7a220848\") " Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.108321 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"b7610922-7bb6-4198-bc73-50dc7a220848" (UID: "b7610922-7bb6-4198-bc73-50dc7a220848"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.108447 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.108468 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b7610922-7bb6-4198-bc73-50dc7a220848" (UID: "b7610922-7bb6-4198-bc73-50dc7a220848"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.114211 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-scripts" (OuterVolumeSpecName: "scripts") pod "b7610922-7bb6-4198-bc73-50dc7a220848" (UID: "b7610922-7bb6-4198-bc73-50dc7a220848"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.114371 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7610922-7bb6-4198-bc73-50dc7a220848-kube-api-access-6fhqp" (OuterVolumeSpecName: "kube-api-access-6fhqp") pod "b7610922-7bb6-4198-bc73-50dc7a220848" (UID: "b7610922-7bb6-4198-bc73-50dc7a220848"). InnerVolumeSpecName "kube-api-access-6fhqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.146880 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b7610922-7bb6-4198-bc73-50dc7a220848" (UID: "b7610922-7bb6-4198-bc73-50dc7a220848"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.189069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9da353e0-8b25-44b3-8c96-6cc4355615ec","Type":"ContainerStarted","Data":"9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5"} Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.189125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9da353e0-8b25-44b3-8c96-6cc4355615ec","Type":"ContainerStarted","Data":"4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249"} Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.189142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9da353e0-8b25-44b3-8c96-6cc4355615ec","Type":"ContainerStarted","Data":"9bde9edbaf428c4a403d7d6c49804ae33626fc4759e7aa51fa2e5fce98715abd"} Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.193444 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3a93a60-b315-4de2-96d7-d23c9cedbc9c","Type":"ContainerStarted","Data":"9599f4fa0992155906bf350a593ed3aa398353ce4ff4f43f444ac8fe006585ac"} Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.198101 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6837b2a8-dfb5-4277-87f4-483200d1ae93","Type":"ContainerStarted","Data":"80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d"} Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.198153 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6837b2a8-dfb5-4277-87f4-483200d1ae93","Type":"ContainerStarted","Data":"c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8"} Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.212289 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.212355 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.212382 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7610922-7bb6-4198-bc73-50dc7a220848-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.212397 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fhqp\" (UniqueName: \"kubernetes.io/projected/b7610922-7bb6-4198-bc73-50dc7a220848-kube-api-access-6fhqp\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.212907 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.212889603 podStartE2EDuration="3.212889603s" podCreationTimestamp="2025-11-22 08:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:33.211605806 +0000 UTC m=+1346.917966878" watchObservedRunningTime="2025-11-22 08:44:33.212889603 +0000 UTC m=+1346.919250655" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.213083 4743 generic.go:334] 
"Generic (PLEG): container finished" podID="b7610922-7bb6-4198-bc73-50dc7a220848" containerID="b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462" exitCode=0 Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.213167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7610922-7bb6-4198-bc73-50dc7a220848","Type":"ContainerDied","Data":"b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462"} Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.213237 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b7610922-7bb6-4198-bc73-50dc7a220848","Type":"ContainerDied","Data":"5522dc0c9dcf1c1c84e9f2a8c4d01b7fdf7e54a35d3fc9257e431671f265e3d3"} Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.213182 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.213263 4743 scope.go:117] "RemoveContainer" containerID="f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.228492 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7ef7cdc-68f1-4031-a5ef-66a910c50764","Type":"ContainerStarted","Data":"38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a"} Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.228549 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7ef7cdc-68f1-4031-a5ef-66a910c50764","Type":"ContainerStarted","Data":"9d9c461cd97b5c37192ca3c98c9d58f016c91ceb2d9f2433d3cbf903b42f2eaf"} Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.245510 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.2454866940000002 podStartE2EDuration="3.245486694s" podCreationTimestamp="2025-11-22 08:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:33.242287422 +0000 UTC m=+1346.948648474" watchObservedRunningTime="2025-11-22 08:44:33.245486694 +0000 UTC m=+1346.951847746" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.248432 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7610922-7bb6-4198-bc73-50dc7a220848" (UID: "b7610922-7bb6-4198-bc73-50dc7a220848"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.276029 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.2760121460000002 podStartE2EDuration="3.276012146s" podCreationTimestamp="2025-11-22 08:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:33.275036048 +0000 UTC m=+1346.981397100" watchObservedRunningTime="2025-11-22 08:44:33.276012146 +0000 UTC m=+1346.982373198" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.316970 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.324822 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-config-data" (OuterVolumeSpecName: "config-data") pod "b7610922-7bb6-4198-bc73-50dc7a220848" (UID: "b7610922-7bb6-4198-bc73-50dc7a220848"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.383821 4743 scope.go:117] "RemoveContainer" containerID="bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.402086 4743 scope.go:117] "RemoveContainer" containerID="b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.417813 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7610922-7bb6-4198-bc73-50dc7a220848-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.420670 4743 scope.go:117] "RemoveContainer" containerID="fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.446478 4743 scope.go:117] "RemoveContainer" containerID="f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4" Nov 22 08:44:33 crc kubenswrapper[4743]: E1122 08:44:33.446987 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4\": container with ID starting with f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4 not found: ID does not exist" containerID="f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.447019 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4"} err="failed to get container status \"f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4\": rpc error: code = NotFound desc = could not find container \"f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4\": container with ID starting with f8afecc6d559a631ce4d1203ed064572b88c947065e8dde17fcb11cea928f9b4 not found: ID does not exist" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.447038 4743 scope.go:117] "RemoveContainer" containerID="bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe" Nov 22 08:44:33 crc 
kubenswrapper[4743]: E1122 08:44:33.447336 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe\": container with ID starting with bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe not found: ID does not exist" containerID="bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.447360 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe"} err="failed to get container status \"bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe\": rpc error: code = NotFound desc = could not find container \"bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe\": container with ID starting with bd79634602f1af3900093531cf48d07b16adac6a3895bca4341de05c38b7c5fe not found: ID does not exist" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.447372 4743 scope.go:117] "RemoveContainer" containerID="b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462" Nov 22 08:44:33 crc kubenswrapper[4743]: E1122 08:44:33.447943 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462\": container with ID starting with b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462 not found: ID does not exist" containerID="b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.447988 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462"} err="failed to get container status \"b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462\": rpc error: code = NotFound desc = could not find container \"b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462\": container with ID starting with b7241677a54b2870c197cc0ef0a4b553fe2f9c8b0b2ae3ad71ac051d330f8462 not found: ID does not exist" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.448014 4743 scope.go:117] "RemoveContainer" containerID="fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e" Nov 22 08:44:33 crc kubenswrapper[4743]: E1122 08:44:33.448331 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e\": container with ID starting with fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e not found: ID does not exist" containerID="fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.448372 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e"} err="failed to get container status \"fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e\": rpc error: code = NotFound desc = could not find container \"fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e\": container with ID starting with fb344aee5621da67abc2f6eacd659b556907459507ba9fa21da6e4872629b96e not found: ID does not exist" Nov 22 08:44:33 crc kubenswrapper[4743]: 
I1122 08:44:33.553042 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.563192 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.572759 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:44:33 crc kubenswrapper[4743]: E1122 08:44:33.573143 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="sg-core" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.573157 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="sg-core" Nov 22 08:44:33 crc kubenswrapper[4743]: E1122 08:44:33.573166 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="proxy-httpd" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.573174 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="proxy-httpd" Nov 22 08:44:33 crc kubenswrapper[4743]: E1122 08:44:33.573182 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="ceilometer-notification-agent" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.573188 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="ceilometer-notification-agent" Nov 22 08:44:33 crc kubenswrapper[4743]: E1122 08:44:33.573212 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="ceilometer-central-agent" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.573219 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="ceilometer-central-agent" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.573428 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="ceilometer-central-agent" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.573444 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="sg-core" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.573458 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="ceilometer-notification-agent" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.573470 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" containerName="proxy-httpd" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.575383 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.578142 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.584354 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.584823 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.586290 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.619853 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-config-data\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.619944 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.619974 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6rgx\" (UniqueName: \"kubernetes.io/projected/7076577a-0e3f-484b-9d48-f78906d78cc1-kube-api-access-j6rgx\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.620067 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.620088 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-run-httpd\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.620115 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-scripts\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.620176 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.620225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-log-httpd\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.721497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-config-data\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.721615 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.721642 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6rgx\" (UniqueName: \"kubernetes.io/projected/7076577a-0e3f-484b-9d48-f78906d78cc1-kube-api-access-j6rgx\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.721659 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.721677 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-run-httpd\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.721707 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-scripts\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.721724 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.721750 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-log-httpd\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.722141 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-log-httpd\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.722328 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-run-httpd\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.726352 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-scripts\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.726419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.727707 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.729188 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.729905 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-config-data\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.743140 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6rgx\" (UniqueName: \"kubernetes.io/projected/7076577a-0e3f-484b-9d48-f78906d78cc1-kube-api-access-j6rgx\") pod \"ceilometer-0\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " pod="openstack/ceilometer-0" Nov 22 08:44:33 crc kubenswrapper[4743]: I1122 08:44:33.952568 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:44:34 crc kubenswrapper[4743]: I1122 08:44:34.263793 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3a93a60-b315-4de2-96d7-d23c9cedbc9c","Type":"ContainerStarted","Data":"4f25a424241601e91504248ab884e7a0f9860edb39f6eab7afdb79fa3b729315"} Nov 22 08:44:34 crc kubenswrapper[4743]: I1122 08:44:34.264142 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 22 08:44:34 crc kubenswrapper[4743]: I1122 08:44:34.289836 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.405219226 podStartE2EDuration="4.289799498s" podCreationTimestamp="2025-11-22 08:44:30 +0000 UTC" firstStartedPulling="2025-11-22 08:44:32.150333733 +0000 UTC m=+1345.856694785" lastFinishedPulling="2025-11-22 08:44:33.034914005 +0000 UTC m=+1346.741275057" observedRunningTime="2025-11-22 08:44:34.286094251 +0000 UTC m=+1347.992455293" watchObservedRunningTime="2025-11-22 08:44:34.289799498 +0000 UTC m=+1347.996160550" Nov 22 08:44:34 crc kubenswrapper[4743]: I1122 08:44:34.430938 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:44:34 crc kubenswrapper[4743]: W1122 08:44:34.433594 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7076577a_0e3f_484b_9d48_f78906d78cc1.slice/crio-01993ffda116adfdd93d39df45cac5793e8c86b1ebc6c984aca6b2f606910495 WatchSource:0}: Error finding container 01993ffda116adfdd93d39df45cac5793e8c86b1ebc6c984aca6b2f606910495: Status 404 returned error can't find the container with id 01993ffda116adfdd93d39df45cac5793e8c86b1ebc6c984aca6b2f606910495 Nov 22 08:44:35 crc kubenswrapper[4743]: I1122 08:44:35.162347 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7610922-7bb6-4198-bc73-50dc7a220848" path="/var/lib/kubelet/pods/b7610922-7bb6-4198-bc73-50dc7a220848/volumes" Nov 22 08:44:35 crc kubenswrapper[4743]: I1122 08:44:35.302404 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7076577a-0e3f-484b-9d48-f78906d78cc1","Type":"ContainerStarted","Data":"ac32c7763a81911759a477ad5743b1b4ede99ef9dd4a726e749a1912728ba2d5"} Nov 22 08:44:35 crc kubenswrapper[4743]: I1122 08:44:35.302474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7076577a-0e3f-484b-9d48-f78906d78cc1","Type":"ContainerStarted","Data":"01993ffda116adfdd93d39df45cac5793e8c86b1ebc6c984aca6b2f606910495"} Nov 22 08:44:35 crc kubenswrapper[4743]: I1122 08:44:35.644293 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 08:44:35 crc kubenswrapper[4743]: I1122 08:44:35.644640 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 08:44:35 crc kubenswrapper[4743]: I1122 08:44:35.705038 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 08:44:36 crc kubenswrapper[4743]: I1122 08:44:36.314089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7076577a-0e3f-484b-9d48-f78906d78cc1","Type":"ContainerStarted","Data":"fc25e983573a9a2569ae4fd74d3c05be9053d7d1ad8b1f4b9f46313df480252e"} Nov 22 08:44:37 crc kubenswrapper[4743]: I1122 08:44:37.329981 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7076577a-0e3f-484b-9d48-f78906d78cc1","Type":"ContainerStarted","Data":"28a6904297dec16ce94f2372b39a0c2033316e889c1bf7c59c59f90df6dd3ae9"} Nov 22 08:44:38 crc kubenswrapper[4743]: I1122 08:44:38.341713 4743 generic.go:334] "Generic (PLEG): container finished" podID="58260e6d-177b-49c5-beac-c516036341a4" containerID="7305244bd79cd85c2c92eab84566fc7d97bcd7fde2ff9a55e6572d0e121cf472" exitCode=0 Nov 22 08:44:38 crc kubenswrapper[4743]: I1122 08:44:38.341794 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hccj5" event={"ID":"58260e6d-177b-49c5-beac-c516036341a4","Type":"ContainerDied","Data":"7305244bd79cd85c2c92eab84566fc7d97bcd7fde2ff9a55e6572d0e121cf472"} Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.354327 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7076577a-0e3f-484b-9d48-f78906d78cc1","Type":"ContainerStarted","Data":"8a3be35c394732ea4b86e2b914ad48c7083da21fb23ec5c34b37027eeb564201"} Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.697518 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.716241 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.224857786 podStartE2EDuration="6.7162236s" podCreationTimestamp="2025-11-22 08:44:33 +0000 UTC" firstStartedPulling="2025-11-22 08:44:34.436407981 +0000 UTC m=+1348.142769043" lastFinishedPulling="2025-11-22 08:44:38.927773805 +0000 UTC m=+1352.634134857" observedRunningTime="2025-11-22 08:44:39.375811601 +0000 UTC m=+1353.082172653" watchObservedRunningTime="2025-11-22 08:44:39.7162236 +0000 UTC m=+1353.422584652" Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.843492 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-config-data\") pod \"58260e6d-177b-49c5-beac-c516036341a4\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.843657 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9h9p\" (UniqueName: \"kubernetes.io/projected/58260e6d-177b-49c5-beac-c516036341a4-kube-api-access-z9h9p\") pod \"58260e6d-177b-49c5-beac-c516036341a4\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.843695 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-combined-ca-bundle\") pod \"58260e6d-177b-49c5-beac-c516036341a4\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.843754 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-scripts\") pod \"58260e6d-177b-49c5-beac-c516036341a4\" (UID: \"58260e6d-177b-49c5-beac-c516036341a4\") " Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.850786 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58260e6d-177b-49c5-beac-c516036341a4-kube-api-access-z9h9p" 
(OuterVolumeSpecName: "kube-api-access-z9h9p") pod "58260e6d-177b-49c5-beac-c516036341a4" (UID: "58260e6d-177b-49c5-beac-c516036341a4"). InnerVolumeSpecName "kube-api-access-z9h9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.850877 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-scripts" (OuterVolumeSpecName: "scripts") pod "58260e6d-177b-49c5-beac-c516036341a4" (UID: "58260e6d-177b-49c5-beac-c516036341a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.872646 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58260e6d-177b-49c5-beac-c516036341a4" (UID: "58260e6d-177b-49c5-beac-c516036341a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.879306 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-config-data" (OuterVolumeSpecName: "config-data") pod "58260e6d-177b-49c5-beac-c516036341a4" (UID: "58260e6d-177b-49c5-beac-c516036341a4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.946226 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.946272 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9h9p\" (UniqueName: \"kubernetes.io/projected/58260e6d-177b-49c5-beac-c516036341a4-kube-api-access-z9h9p\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.946287 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:39 crc kubenswrapper[4743]: I1122 08:44:39.946297 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58260e6d-177b-49c5-beac-c516036341a4-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.365121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hccj5" event={"ID":"58260e6d-177b-49c5-beac-c516036341a4","Type":"ContainerDied","Data":"9db0181351e64b4055ca99c885aa3dab6cd2940004da3b5f5e6ea0e9cc30b414"} Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.365460 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9db0181351e64b4055ca99c885aa3dab6cd2940004da3b5f5e6ea0e9cc30b414" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.365167 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hccj5" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.366152 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.424119 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 08:44:40 crc kubenswrapper[4743]: E1122 08:44:40.424559 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58260e6d-177b-49c5-beac-c516036341a4" containerName="nova-cell1-conductor-db-sync" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.424593 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="58260e6d-177b-49c5-beac-c516036341a4" containerName="nova-cell1-conductor-db-sync" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.424818 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="58260e6d-177b-49c5-beac-c516036341a4" containerName="nova-cell1-conductor-db-sync" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.425492 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.428172 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.441221 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.558093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.558167 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9vdx\" (UniqueName: \"kubernetes.io/projected/11c59cd3-7ee4-43f3-83ce-9d22824473d7-kube-api-access-m9vdx\") pod \"nova-cell1-conductor-0\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.558246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.645677 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.645723 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.659752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.659839 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m9vdx\" (UniqueName: \"kubernetes.io/projected/11c59cd3-7ee4-43f3-83ce-9d22824473d7-kube-api-access-m9vdx\") pod \"nova-cell1-conductor-0\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.659901 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.671775 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.672350 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.688953 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9vdx\" (UniqueName: \"kubernetes.io/projected/11c59cd3-7ee4-43f3-83ce-9d22824473d7-kube-api-access-m9vdx\") pod \"nova-cell1-conductor-0\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.694506 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.694550 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.705064 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.744954 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.746290 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 08:44:40 crc kubenswrapper[4743]: I1122 08:44:40.808767 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.325204 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.378051 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"11c59cd3-7ee4-43f3-83ce-9d22824473d7","Type":"ContainerStarted","Data":"92caa2e13cbb5fceb5c72acb9106cffb671d3aaa272061ee6717337df6bb7392"} Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.380814 4743 generic.go:334] "Generic (PLEG): container finished" podID="b4191582-06b4-46bb-be20-3f027173e83d" containerID="1e3e8fa38a54921afda25d5c5650246d86727b8836380c7c504af6b09ba4a0eb" exitCode=137 Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.380908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4191582-06b4-46bb-be20-3f027173e83d","Type":"ContainerDied","Data":"1e3e8fa38a54921afda25d5c5650246d86727b8836380c7c504af6b09ba4a0eb"} Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.380950 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b4191582-06b4-46bb-be20-3f027173e83d","Type":"ContainerDied","Data":"6c483e64e144b6c9d168d04f0e5089100ce7f1f5336ceeef739e2a0678a5de29"} Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.380967 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c483e64e144b6c9d168d04f0e5089100ce7f1f5336ceeef739e2a0678a5de29" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.420267 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.521362 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.661806 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.662070 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.692378 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-combined-ca-bundle\") pod \"b4191582-06b4-46bb-be20-3f027173e83d\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.692648 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzvw4\" (UniqueName: \"kubernetes.io/projected/b4191582-06b4-46bb-be20-3f027173e83d-kube-api-access-mzvw4\") pod \"b4191582-06b4-46bb-be20-3f027173e83d\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.692714 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-config-data\") pod \"b4191582-06b4-46bb-be20-3f027173e83d\" (UID: \"b4191582-06b4-46bb-be20-3f027173e83d\") " Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.697628 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4191582-06b4-46bb-be20-3f027173e83d-kube-api-access-mzvw4" (OuterVolumeSpecName: "kube-api-access-mzvw4") pod "b4191582-06b4-46bb-be20-3f027173e83d" (UID: "b4191582-06b4-46bb-be20-3f027173e83d"). InnerVolumeSpecName "kube-api-access-mzvw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.725127 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4191582-06b4-46bb-be20-3f027173e83d" (UID: "b4191582-06b4-46bb-be20-3f027173e83d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.765699 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-config-data" (OuterVolumeSpecName: "config-data") pod "b4191582-06b4-46bb-be20-3f027173e83d" (UID: "b4191582-06b4-46bb-be20-3f027173e83d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.777759 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.778044 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.794720 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.794753 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzvw4\" (UniqueName: \"kubernetes.io/projected/b4191582-06b4-46bb-be20-3f027173e83d-kube-api-access-mzvw4\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:41 crc kubenswrapper[4743]: I1122 08:44:41.794763 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4191582-06b4-46bb-be20-3f027173e83d-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.391109 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"11c59cd3-7ee4-43f3-83ce-9d22824473d7","Type":"ContainerStarted","Data":"fd43f52e71d508747b99d25448ea2492e1fc68d783ff2250c3357c3281ede81e"} Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.391168 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.391698 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.426882 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.426854677 podStartE2EDuration="2.426854677s" podCreationTimestamp="2025-11-22 08:44:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:42.415672234 +0000 UTC m=+1356.122033286" watchObservedRunningTime="2025-11-22 08:44:42.426854677 +0000 UTC m=+1356.133215729" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.447696 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.456601 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.468770 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:44:42 crc kubenswrapper[4743]: E1122 08:44:42.469161 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4191582-06b4-46bb-be20-3f027173e83d" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.469179 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4191582-06b4-46bb-be20-3f027173e83d" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.469369 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4191582-06b4-46bb-be20-3f027173e83d" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.469992 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.473635 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.474964 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.475138 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.497370 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.611055 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.611145 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.611192 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.611262 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk94d\" (UniqueName: \"kubernetes.io/projected/b7be7b8b-96eb-40fb-98b2-bc33e2154343-kube-api-access-mk94d\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.611380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.713091 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.713187 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 
08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.713777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk94d\" (UniqueName: \"kubernetes.io/projected/b7be7b8b-96eb-40fb-98b2-bc33e2154343-kube-api-access-mk94d\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.713861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.713944 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.719467 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.720150 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.721241 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.721792 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.735095 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk94d\" (UniqueName: \"kubernetes.io/projected/b7be7b8b-96eb-40fb-98b2-bc33e2154343-kube-api-access-mk94d\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:42 crc kubenswrapper[4743]: I1122 08:44:42.793045 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:43 crc kubenswrapper[4743]: I1122 08:44:43.165372 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4191582-06b4-46bb-be20-3f027173e83d" path="/var/lib/kubelet/pods/b4191582-06b4-46bb-be20-3f027173e83d/volumes" Nov 22 08:44:43 crc kubenswrapper[4743]: I1122 08:44:43.304095 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:44:43 crc kubenswrapper[4743]: W1122 08:44:43.314016 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7be7b8b_96eb_40fb_98b2_bc33e2154343.slice/crio-a516b410fb650b2b8b6e948a05b7feaddc7628bdd3f6fb42e7660e03ec892467 WatchSource:0}: Error finding container a516b410fb650b2b8b6e948a05b7feaddc7628bdd3f6fb42e7660e03ec892467: Status 404 returned error can't find the container with id a516b410fb650b2b8b6e948a05b7feaddc7628bdd3f6fb42e7660e03ec892467 Nov 22 08:44:43 crc kubenswrapper[4743]: I1122 08:44:43.405291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7be7b8b-96eb-40fb-98b2-bc33e2154343","Type":"ContainerStarted","Data":"a516b410fb650b2b8b6e948a05b7feaddc7628bdd3f6fb42e7660e03ec892467"} Nov 22 08:44:44 crc kubenswrapper[4743]: I1122 08:44:44.415225 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7be7b8b-96eb-40fb-98b2-bc33e2154343","Type":"ContainerStarted","Data":"d5f358c51f3837120bf2786591f156a51d70cdabcf793d05895bd486bf90bd29"} Nov 22 08:44:47 crc kubenswrapper[4743]: I1122 08:44:47.793646 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:50 crc kubenswrapper[4743]: I1122 08:44:50.649777 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 08:44:50 crc kubenswrapper[4743]: I1122 08:44:50.650400 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 08:44:50 crc kubenswrapper[4743]: I1122 08:44:50.656067 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 08:44:50 crc kubenswrapper[4743]: I1122 08:44:50.656727 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 08:44:50 crc kubenswrapper[4743]: I1122 08:44:50.669549 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=8.669524593 podStartE2EDuration="8.669524593s" podCreationTimestamp="2025-11-22 08:44:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:44.431003984 +0000 UTC m=+1358.137365036" watchObservedRunningTime="2025-11-22 08:44:50.669524593 +0000 UTC m=+1364.375885665" Nov 22 08:44:50 crc kubenswrapper[4743]: I1122 08:44:50.703520 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 08:44:50 crc kubenswrapper[4743]: I1122 08:44:50.704218 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 08:44:50 crc kubenswrapper[4743]: I1122 08:44:50.710358 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 08:44:50 crc 
kubenswrapper[4743]: I1122 08:44:50.711942 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 08:44:50 crc kubenswrapper[4743]: I1122 08:44:50.778049 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.473403 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.477466 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.677370 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-zqvbz"] Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.679289 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.701736 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-zqvbz"] Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.775671 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnkgq\" (UniqueName: \"kubernetes.io/projected/aab079ae-b574-40f3-8df0-7deff1356e09-kube-api-access-tnkgq\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.775714 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-config\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.775760 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.775787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.775862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.775899 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " 
pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.877325 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnkgq\" (UniqueName: \"kubernetes.io/projected/aab079ae-b574-40f3-8df0-7deff1356e09-kube-api-access-tnkgq\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.877615 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-config\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.877790 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.878074 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.878272 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.878426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.878705 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.878752 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-config\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.878973 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.879169 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.879183 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:51 crc kubenswrapper[4743]: I1122 08:44:51.916650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnkgq\" (UniqueName: \"kubernetes.io/projected/aab079ae-b574-40f3-8df0-7deff1356e09-kube-api-access-tnkgq\") pod \"dnsmasq-dns-cd5cbd7b9-zqvbz\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:52 crc kubenswrapper[4743]: I1122 08:44:51.999983 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:52 crc kubenswrapper[4743]: I1122 08:44:52.611824 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-zqvbz"] Nov 22 08:44:52 crc kubenswrapper[4743]: I1122 08:44:52.794148 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:52 crc kubenswrapper[4743]: I1122 08:44:52.817405 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.491094 4743 generic.go:334] "Generic (PLEG): container finished" podID="aab079ae-b574-40f3-8df0-7deff1356e09" containerID="c84bf01830ea8518e6ca660ac284815461a856f5cc89ce3d25184993d73472c4" exitCode=0 Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.492838 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" event={"ID":"aab079ae-b574-40f3-8df0-7deff1356e09","Type":"ContainerDied","Data":"c84bf01830ea8518e6ca660ac284815461a856f5cc89ce3d25184993d73472c4"} Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.492954 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" event={"ID":"aab079ae-b574-40f3-8df0-7deff1356e09","Type":"ContainerStarted","Data":"cc5f1d9b8e5ad4aaa231f5aea0ea23e9d8319e0b69a165218b9b2b15bffe5323"} Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.531025 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.706363 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-hkskb"] Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.712290 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.714709 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.714887 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.716944 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hkskb"] Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.822022 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-config-data\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.822138 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmldv\" (UniqueName: \"kubernetes.io/projected/529310a2-d134-4138-a99f-9146e71f32eb-kube-api-access-cmldv\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.822207 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.822233 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-scripts\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.924238 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-config-data\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.924706 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmldv\" (UniqueName: \"kubernetes.io/projected/529310a2-d134-4138-a99f-9146e71f32eb-kube-api-access-cmldv\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.924754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.924783 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-scripts\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.931514 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-config-data\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.932793 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.932826 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-scripts\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:53 crc kubenswrapper[4743]: I1122 08:44:53.944615 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmldv\" (UniqueName: \"kubernetes.io/projected/529310a2-d134-4138-a99f-9146e71f32eb-kube-api-access-cmldv\") pod \"nova-cell1-cell-mapping-hkskb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") " pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.045059 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hkskb" Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.221742 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.222427 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="ceilometer-central-agent" containerID="cri-o://ac32c7763a81911759a477ad5743b1b4ede99ef9dd4a726e749a1912728ba2d5" gracePeriod=30 Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.223446 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="sg-core" containerID="cri-o://28a6904297dec16ce94f2372b39a0c2033316e889c1bf7c59c59f90df6dd3ae9" gracePeriod=30 Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.223610 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="proxy-httpd" containerID="cri-o://8a3be35c394732ea4b86e2b914ad48c7083da21fb23ec5c34b37027eeb564201" gracePeriod=30 Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.223682 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="ceilometer-notification-agent" containerID="cri-o://fc25e983573a9a2569ae4fd74d3c05be9053d7d1ad8b1f4b9f46313df480252e" gracePeriod=30 Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.336458 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.192:3000/\": read tcp 10.217.0.2:43936->10.217.0.192:3000: read: connection reset by peer" Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.501800 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" event={"ID":"aab079ae-b574-40f3-8df0-7deff1356e09","Type":"ContainerStarted","Data":"b976e4375d0ed0f34734cc0cfaefc7f462784997b77abe216f7a1ac9164a5a03"} Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.502825 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.505116 4743 generic.go:334] "Generic (PLEG): container finished" podID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerID="8a3be35c394732ea4b86e2b914ad48c7083da21fb23ec5c34b37027eeb564201" exitCode=0 Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.505151 4743 generic.go:334] "Generic (PLEG): container finished" podID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerID="28a6904297dec16ce94f2372b39a0c2033316e889c1bf7c59c59f90df6dd3ae9" exitCode=2 Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.505165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7076577a-0e3f-484b-9d48-f78906d78cc1","Type":"ContainerDied","Data":"8a3be35c394732ea4b86e2b914ad48c7083da21fb23ec5c34b37027eeb564201"} Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.505195 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7076577a-0e3f-484b-9d48-f78906d78cc1","Type":"ContainerDied","Data":"28a6904297dec16ce94f2372b39a0c2033316e889c1bf7c59c59f90df6dd3ae9"} Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.527422 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" podStartSLOduration=3.527407904 podStartE2EDuration="3.527407904s" podCreationTimestamp="2025-11-22 08:44:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:54.525103208 +0000 UTC m=+1368.231464260" watchObservedRunningTime="2025-11-22 08:44:54.527407904 +0000 UTC m=+1368.233768956" Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.546536 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-hkskb"] Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.619706 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.619929 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerName="nova-api-log" containerID="cri-o://4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249" gracePeriod=30 Nov 22 08:44:54 crc kubenswrapper[4743]: I1122 08:44:54.620314 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerName="nova-api-api" containerID="cri-o://9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5" gracePeriod=30 Nov 22 08:44:55 crc kubenswrapper[4743]: I1122 08:44:55.520133 4743 generic.go:334] "Generic (PLEG): container finished" podID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerID="ac32c7763a81911759a477ad5743b1b4ede99ef9dd4a726e749a1912728ba2d5" exitCode=0 Nov 22 08:44:55 crc kubenswrapper[4743]: I1122 08:44:55.520222 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7076577a-0e3f-484b-9d48-f78906d78cc1","Type":"ContainerDied","Data":"ac32c7763a81911759a477ad5743b1b4ede99ef9dd4a726e749a1912728ba2d5"} Nov 22 08:44:55 crc kubenswrapper[4743]: I1122 08:44:55.523752 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hkskb" event={"ID":"529310a2-d134-4138-a99f-9146e71f32eb","Type":"ContainerStarted","Data":"bd056c88a18e8d56a5bd78a0aae1c581bd242d465d94e8b629291f01de7a8525"} Nov 22 08:44:55 crc kubenswrapper[4743]: I1122 08:44:55.523799 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hkskb" event={"ID":"529310a2-d134-4138-a99f-9146e71f32eb","Type":"ContainerStarted","Data":"a1215fe85db004fd22ba5820e31254cfc578851a5d646c3efb42cbc5ccc5cd68"} Nov 22 08:44:55 crc kubenswrapper[4743]: I1122 08:44:55.527466 4743 generic.go:334] "Generic (PLEG): container finished" podID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerID="4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249" exitCode=143 Nov 22 08:44:55 crc kubenswrapper[4743]: I1122 08:44:55.527560 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9da353e0-8b25-44b3-8c96-6cc4355615ec","Type":"ContainerDied","Data":"4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249"} Nov 22 08:44:55 crc kubenswrapper[4743]: I1122 08:44:55.539266 4743 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-hkskb" podStartSLOduration=2.539249772 podStartE2EDuration="2.539249772s" podCreationTimestamp="2025-11-22 08:44:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:44:55.535850014 +0000 UTC m=+1369.242211066" watchObservedRunningTime="2025-11-22 08:44:55.539249772 +0000 UTC m=+1369.245610824" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.549437 4743 generic.go:334] "Generic (PLEG): container finished" podID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerID="fc25e983573a9a2569ae4fd74d3c05be9053d7d1ad8b1f4b9f46313df480252e" exitCode=0 Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.549496 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7076577a-0e3f-484b-9d48-f78906d78cc1","Type":"ContainerDied","Data":"fc25e983573a9a2569ae4fd74d3c05be9053d7d1ad8b1f4b9f46313df480252e"} Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.663834 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.838170 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-sg-core-conf-yaml\") pod \"7076577a-0e3f-484b-9d48-f78906d78cc1\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.838256 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6rgx\" (UniqueName: \"kubernetes.io/projected/7076577a-0e3f-484b-9d48-f78906d78cc1-kube-api-access-j6rgx\") pod \"7076577a-0e3f-484b-9d48-f78906d78cc1\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.838283 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-combined-ca-bundle\") pod \"7076577a-0e3f-484b-9d48-f78906d78cc1\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.838310 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-config-data\") pod \"7076577a-0e3f-484b-9d48-f78906d78cc1\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.838356 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-ceilometer-tls-certs\") pod \"7076577a-0e3f-484b-9d48-f78906d78cc1\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.838388 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-run-httpd\") pod \"7076577a-0e3f-484b-9d48-f78906d78cc1\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.838466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-log-httpd\") pod \"7076577a-0e3f-484b-9d48-f78906d78cc1\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.838487 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-scripts\") pod \"7076577a-0e3f-484b-9d48-f78906d78cc1\" (UID: \"7076577a-0e3f-484b-9d48-f78906d78cc1\") " Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.844724 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7076577a-0e3f-484b-9d48-f78906d78cc1" (UID: "7076577a-0e3f-484b-9d48-f78906d78cc1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.845433 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7076577a-0e3f-484b-9d48-f78906d78cc1" (UID: "7076577a-0e3f-484b-9d48-f78906d78cc1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.847332 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-scripts" (OuterVolumeSpecName: "scripts") pod "7076577a-0e3f-484b-9d48-f78906d78cc1" (UID: "7076577a-0e3f-484b-9d48-f78906d78cc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.852229 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7076577a-0e3f-484b-9d48-f78906d78cc1-kube-api-access-j6rgx" (OuterVolumeSpecName: "kube-api-access-j6rgx") pod "7076577a-0e3f-484b-9d48-f78906d78cc1" (UID: "7076577a-0e3f-484b-9d48-f78906d78cc1"). InnerVolumeSpecName "kube-api-access-j6rgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.891363 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7076577a-0e3f-484b-9d48-f78906d78cc1" (UID: "7076577a-0e3f-484b-9d48-f78906d78cc1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.904951 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7076577a-0e3f-484b-9d48-f78906d78cc1" (UID: "7076577a-0e3f-484b-9d48-f78906d78cc1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.934881 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7076577a-0e3f-484b-9d48-f78906d78cc1" (UID: "7076577a-0e3f-484b-9d48-f78906d78cc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.942125 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6rgx\" (UniqueName: \"kubernetes.io/projected/7076577a-0e3f-484b-9d48-f78906d78cc1-kube-api-access-j6rgx\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.942162 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.942175 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.942190 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.942203 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7076577a-0e3f-484b-9d48-f78906d78cc1-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.942215 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.942229 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:57 crc kubenswrapper[4743]: I1122 08:44:57.963776 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-config-data" (OuterVolumeSpecName: "config-data") pod "7076577a-0e3f-484b-9d48-f78906d78cc1" (UID: "7076577a-0e3f-484b-9d48-f78906d78cc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.044953 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7076577a-0e3f-484b-9d48-f78906d78cc1-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.261658 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.457148 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-combined-ca-bundle\") pod \"9da353e0-8b25-44b3-8c96-6cc4355615ec\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.457863 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfkxb\" (UniqueName: \"kubernetes.io/projected/9da353e0-8b25-44b3-8c96-6cc4355615ec-kube-api-access-tfkxb\") pod \"9da353e0-8b25-44b3-8c96-6cc4355615ec\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.458333 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9da353e0-8b25-44b3-8c96-6cc4355615ec-logs\") pod \"9da353e0-8b25-44b3-8c96-6cc4355615ec\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.458382 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-config-data\") pod \"9da353e0-8b25-44b3-8c96-6cc4355615ec\" (UID: \"9da353e0-8b25-44b3-8c96-6cc4355615ec\") " Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.459002 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9da353e0-8b25-44b3-8c96-6cc4355615ec-logs" (OuterVolumeSpecName: "logs") pod "9da353e0-8b25-44b3-8c96-6cc4355615ec" (UID: "9da353e0-8b25-44b3-8c96-6cc4355615ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.471471 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9da353e0-8b25-44b3-8c96-6cc4355615ec-kube-api-access-tfkxb" (OuterVolumeSpecName: "kube-api-access-tfkxb") pod "9da353e0-8b25-44b3-8c96-6cc4355615ec" (UID: "9da353e0-8b25-44b3-8c96-6cc4355615ec"). InnerVolumeSpecName "kube-api-access-tfkxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.487715 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9da353e0-8b25-44b3-8c96-6cc4355615ec" (UID: "9da353e0-8b25-44b3-8c96-6cc4355615ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.495009 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-config-data" (OuterVolumeSpecName: "config-data") pod "9da353e0-8b25-44b3-8c96-6cc4355615ec" (UID: "9da353e0-8b25-44b3-8c96-6cc4355615ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.560440 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfkxb\" (UniqueName: \"kubernetes.io/projected/9da353e0-8b25-44b3-8c96-6cc4355615ec-kube-api-access-tfkxb\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.560480 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9da353e0-8b25-44b3-8c96-6cc4355615ec-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.560492 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.560527 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9da353e0-8b25-44b3-8c96-6cc4355615ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.564776 4743 generic.go:334] "Generic (PLEG): container finished" podID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerID="9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5" exitCode=0 Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.564872 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.564889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9da353e0-8b25-44b3-8c96-6cc4355615ec","Type":"ContainerDied","Data":"9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5"} Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.564925 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9da353e0-8b25-44b3-8c96-6cc4355615ec","Type":"ContainerDied","Data":"9bde9edbaf428c4a403d7d6c49804ae33626fc4759e7aa51fa2e5fce98715abd"} Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.564943 4743 scope.go:117] "RemoveContainer" containerID="9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5" Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.570274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7076577a-0e3f-484b-9d48-f78906d78cc1","Type":"ContainerDied","Data":"01993ffda116adfdd93d39df45cac5793e8c86b1ebc6c984aca6b2f606910495"} Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.570335 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.615036 4743 scope.go:117] "RemoveContainer" containerID="4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.625630 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.635004 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.644976 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 22 08:44:58 crc kubenswrapper[4743]: E1122 08:44:58.645387 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="ceilometer-central-agent"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645403 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="ceilometer-central-agent"
Nov 22 08:44:58 crc kubenswrapper[4743]: E1122 08:44:58.645422 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerName="nova-api-log"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645429 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerName="nova-api-log"
Nov 22 08:44:58 crc kubenswrapper[4743]: E1122 08:44:58.645439 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="proxy-httpd"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645445 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="proxy-httpd"
Nov 22 08:44:58 crc kubenswrapper[4743]: E1122 08:44:58.645463 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerName="nova-api-api"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645470 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerName="nova-api-api"
Nov 22 08:44:58 crc kubenswrapper[4743]: E1122 08:44:58.645483 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="ceilometer-notification-agent"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645491 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="ceilometer-notification-agent"
Nov 22 08:44:58 crc kubenswrapper[4743]: E1122 08:44:58.645502 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="sg-core"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645508 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="sg-core"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645705 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="sg-core"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645720 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerName="nova-api-api"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645731 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="proxy-httpd"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645744 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="ceilometer-central-agent"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645755 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" containerName="ceilometer-notification-agent"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.645766 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" containerName="nova-api-log"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.646748 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.651759 4743 scope.go:117] "RemoveContainer" containerID="9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.652410 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.652539 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Nov 22 08:44:58 crc kubenswrapper[4743]: E1122 08:44:58.652557 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5\": container with ID starting with 9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5 not found: ID does not exist" containerID="9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.652623 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5"} err="failed to get container status \"9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5\": rpc error: code = NotFound desc = could not find container \"9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5\": container with ID starting with 9df617822b848f67ce454a1c72492046c2a1ab30d709b83b778880a2f613b8e5 not found: ID does not exist"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.652657 4743 scope.go:117] "RemoveContainer" containerID="4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.652412 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 22 08:44:58 crc kubenswrapper[4743]: E1122 08:44:58.653392 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249\": container with ID starting with 4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249 not found: ID does not exist" containerID="4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.653420 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249"} err="failed to get container status \"4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249\": rpc error: code = NotFound desc = could not find container \"4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249\": container with ID starting with 4c72513b6bf700976318bea05eb11923c3c2ac0c873198dcdbc0ff16dc7e9249 not found: ID does not exist"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.653436 4743 scope.go:117] "RemoveContainer" containerID="8a3be35c394732ea4b86e2b914ad48c7083da21fb23ec5c34b37027eeb564201"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.656635 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.665546 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.693247 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.710992 4743 scope.go:117] "RemoveContainer" containerID="28a6904297dec16ce94f2372b39a0c2033316e889c1bf7c59c59f90df6dd3ae9"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.711522 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.714157 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.717093 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.717350 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.717491 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.728685 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.744984 4743 scope.go:117] "RemoveContainer" containerID="fc25e983573a9a2569ae4fd74d3c05be9053d7d1ad8b1f4b9f46313df480252e"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.765990 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.766037 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.766094 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-public-tls-certs\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.766113 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvj29\" (UniqueName: \"kubernetes.io/projected/d83da6b0-e31f-4af6-934b-bfa046b49d20-kube-api-access-jvj29\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.766358 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-config-data\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.766523 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d83da6b0-e31f-4af6-934b-bfa046b49d20-logs\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.767948 4743 scope.go:117] "RemoveContainer" containerID="ac32c7763a81911759a477ad5743b1b4ede99ef9dd4a726e749a1912728ba2d5"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868466 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-public-tls-certs\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868518 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvj29\" (UniqueName: \"kubernetes.io/projected/d83da6b0-e31f-4af6-934b-bfa046b49d20-kube-api-access-jvj29\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-log-httpd\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdlbj\" (UniqueName: \"kubernetes.io/projected/58b7a46d-98c7-4e9e-94df-80d359fd68c7-kube-api-access-wdlbj\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868664 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-scripts\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868682 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-config-data\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d83da6b0-e31f-4af6-934b-bfa046b49d20-logs\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868777 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-run-httpd\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868810 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868880 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.868913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-config-data\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.869894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d83da6b0-e31f-4af6-934b-bfa046b49d20-logs\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.873338 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-public-tls-certs\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.873339 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.875273 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-config-data\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.878816 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.885825 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvj29\" (UniqueName: \"kubernetes.io/projected/d83da6b0-e31f-4af6-934b-bfa046b49d20-kube-api-access-jvj29\") pod \"nova-api-0\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") " pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.970739 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-run-httpd\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.970815 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.970864 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.970905 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-config-data\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.970986 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-log-httpd\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.971042 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdlbj\" (UniqueName: \"kubernetes.io/projected/58b7a46d-98c7-4e9e-94df-80d359fd68c7-kube-api-access-wdlbj\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.971082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-scripts\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.971109 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.971394 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-run-httpd\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.971800 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-log-httpd\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.977837 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-config-data\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.977992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-scripts\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.978731 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.979148 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.980182 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.986220 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 08:44:58 crc kubenswrapper[4743]: I1122 08:44:58.989404 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdlbj\" (UniqueName: \"kubernetes.io/projected/58b7a46d-98c7-4e9e-94df-80d359fd68c7-kube-api-access-wdlbj\") pod \"ceilometer-0\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " pod="openstack/ceilometer-0"
Nov 22 08:44:59 crc kubenswrapper[4743]: I1122 08:44:59.041818 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Nov 22 08:44:59 crc kubenswrapper[4743]: I1122 08:44:59.177724 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7076577a-0e3f-484b-9d48-f78906d78cc1" path="/var/lib/kubelet/pods/7076577a-0e3f-484b-9d48-f78906d78cc1/volumes"
Nov 22 08:44:59 crc kubenswrapper[4743]: I1122 08:44:59.178537 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9da353e0-8b25-44b3-8c96-6cc4355615ec" path="/var/lib/kubelet/pods/9da353e0-8b25-44b3-8c96-6cc4355615ec/volumes"
Nov 22 08:44:59 crc kubenswrapper[4743]: I1122 08:44:59.451156 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 08:44:59 crc kubenswrapper[4743]: I1122 08:44:59.581426 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Nov 22 08:44:59 crc kubenswrapper[4743]: I1122 08:44:59.583015 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d83da6b0-e31f-4af6-934b-bfa046b49d20","Type":"ContainerStarted","Data":"ff477c7fdd64c30529b506eb8b453644fb3b7eac4716a35f9ead2ae890768f89"}
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.148188 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"]
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.149708 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.152764 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.152966 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.170423 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"]
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.294692 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd54a052-ecbb-4344-9f1e-323a7fbf034b-secret-volume\") pod \"collect-profiles-29396685-s586z\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.294775 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd54a052-ecbb-4344-9f1e-323a7fbf034b-config-volume\") pod \"collect-profiles-29396685-s586z\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.294849 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq7ld\" (UniqueName: \"kubernetes.io/projected/bd54a052-ecbb-4344-9f1e-323a7fbf034b-kube-api-access-cq7ld\") pod \"collect-profiles-29396685-s586z\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.397960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd54a052-ecbb-4344-9f1e-323a7fbf034b-secret-volume\") pod \"collect-profiles-29396685-s586z\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.398511 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd54a052-ecbb-4344-9f1e-323a7fbf034b-config-volume\") pod \"collect-profiles-29396685-s586z\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.398739 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq7ld\" (UniqueName: \"kubernetes.io/projected/bd54a052-ecbb-4344-9f1e-323a7fbf034b-kube-api-access-cq7ld\") pod \"collect-profiles-29396685-s586z\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.399626 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd54a052-ecbb-4344-9f1e-323a7fbf034b-config-volume\") pod \"collect-profiles-29396685-s586z\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.402372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd54a052-ecbb-4344-9f1e-323a7fbf034b-secret-volume\") pod \"collect-profiles-29396685-s586z\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.419388 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq7ld\" (UniqueName: \"kubernetes.io/projected/bd54a052-ecbb-4344-9f1e-323a7fbf034b-kube-api-access-cq7ld\") pod \"collect-profiles-29396685-s586z\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.471743 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.600132 4743 generic.go:334] "Generic (PLEG): container finished" podID="529310a2-d134-4138-a99f-9146e71f32eb" containerID="bd056c88a18e8d56a5bd78a0aae1c581bd242d465d94e8b629291f01de7a8525" exitCode=0
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.600249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hkskb" event={"ID":"529310a2-d134-4138-a99f-9146e71f32eb","Type":"ContainerDied","Data":"bd056c88a18e8d56a5bd78a0aae1c581bd242d465d94e8b629291f01de7a8525"}
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.603313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58b7a46d-98c7-4e9e-94df-80d359fd68c7","Type":"ContainerStarted","Data":"d286017c65a937c25c9d60c902e5c3e3d08c52047202b26fc6832bb9d0b90d7b"}
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.603519 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58b7a46d-98c7-4e9e-94df-80d359fd68c7","Type":"ContainerStarted","Data":"89b4182dfa6e79f4534d0c5ef9b9ec4a209bc42eddfd081856572c80729ad686"}
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.612239 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d83da6b0-e31f-4af6-934b-bfa046b49d20","Type":"ContainerStarted","Data":"f6a47a6042cdcc435e43f807a46da87bc4a2b5807e85c840896d368eabd6740d"}
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.612293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d83da6b0-e31f-4af6-934b-bfa046b49d20","Type":"ContainerStarted","Data":"f437f9d019295d0f78778e5e7033bf6f20cb2b01e6b07ff80c52b784e7ef5faa"}
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.657827 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.657804325 podStartE2EDuration="2.657804325s" podCreationTimestamp="2025-11-22 08:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:45:00.639120077 +0000 UTC m=+1374.345481129" watchObservedRunningTime="2025-11-22 08:45:00.657804325 +0000 UTC m=+1374.364165377"
Nov 22 08:45:00 crc kubenswrapper[4743]: I1122 08:45:00.930728 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"]
Nov 22 08:45:01 crc kubenswrapper[4743]: I1122 08:45:01.251666 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 08:45:01 crc kubenswrapper[4743]: I1122 08:45:01.251984 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 08:45:01 crc kubenswrapper[4743]: I1122 08:45:01.629478 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58b7a46d-98c7-4e9e-94df-80d359fd68c7","Type":"ContainerStarted","Data":"0fc045e12216ec39ccb7f9f0e77c526341966c30dccbf1e54a1b50e33279ef52"}
Nov 22 08:45:01 crc kubenswrapper[4743]: I1122 08:45:01.629541 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58b7a46d-98c7-4e9e-94df-80d359fd68c7","Type":"ContainerStarted","Data":"37fbdedccc446a670cae08eb9632f568f7a7eccdeada9e4e6146ebf59a36b2e3"}
Nov 22 08:45:01 crc kubenswrapper[4743]: I1122 08:45:01.632514 4743 generic.go:334] "Generic (PLEG): container finished" podID="bd54a052-ecbb-4344-9f1e-323a7fbf034b" containerID="7659ae12fad0147c8b735d5bd5d19689a44c9c8031ab0169048089c6125a5fe6" exitCode=0
Nov 22 08:45:01 crc kubenswrapper[4743]: I1122 08:45:01.632608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z" event={"ID":"bd54a052-ecbb-4344-9f1e-323a7fbf034b","Type":"ContainerDied","Data":"7659ae12fad0147c8b735d5bd5d19689a44c9c8031ab0169048089c6125a5fe6"}
Nov 22 08:45:01 crc kubenswrapper[4743]: I1122 08:45:01.632763 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z" event={"ID":"bd54a052-ecbb-4344-9f1e-323a7fbf034b","Type":"ContainerStarted","Data":"229579252cca2904015b5c99e1fc5e7496da4c143f0e4cb2dc505e336bf0164e"}
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.001724 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz"
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.103477 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-dvstw"]
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.104262 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" podUID="1c8fd004-0cbe-4a32-87cf-d199a7f39716" containerName="dnsmasq-dns" containerID="cri-o://a57450da07eb6f8247c0be6c280bf1d2154596fb64d6bf9c5fc0fba236baa1c3" gracePeriod=10
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.241506 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hkskb"
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.338678 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-combined-ca-bundle\") pod \"529310a2-d134-4138-a99f-9146e71f32eb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") "
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.338741 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-config-data\") pod \"529310a2-d134-4138-a99f-9146e71f32eb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") "
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.338801 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmldv\" (UniqueName: \"kubernetes.io/projected/529310a2-d134-4138-a99f-9146e71f32eb-kube-api-access-cmldv\") pod \"529310a2-d134-4138-a99f-9146e71f32eb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") "
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.338851 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-scripts\") pod \"529310a2-d134-4138-a99f-9146e71f32eb\" (UID: \"529310a2-d134-4138-a99f-9146e71f32eb\") "
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.347732 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/529310a2-d134-4138-a99f-9146e71f32eb-kube-api-access-cmldv" (OuterVolumeSpecName: "kube-api-access-cmldv") pod "529310a2-d134-4138-a99f-9146e71f32eb" (UID: "529310a2-d134-4138-a99f-9146e71f32eb"). InnerVolumeSpecName "kube-api-access-cmldv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.358680 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-scripts" (OuterVolumeSpecName: "scripts") pod "529310a2-d134-4138-a99f-9146e71f32eb" (UID: "529310a2-d134-4138-a99f-9146e71f32eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.376937 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "529310a2-d134-4138-a99f-9146e71f32eb" (UID: "529310a2-d134-4138-a99f-9146e71f32eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.398929 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-config-data" (OuterVolumeSpecName: "config-data") pod "529310a2-d134-4138-a99f-9146e71f32eb" (UID: "529310a2-d134-4138-a99f-9146e71f32eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.444730 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.444779 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.444794 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmldv\" (UniqueName: \"kubernetes.io/projected/529310a2-d134-4138-a99f-9146e71f32eb-kube-api-access-cmldv\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.444808 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/529310a2-d134-4138-a99f-9146e71f32eb-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.648664 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-hkskb" event={"ID":"529310a2-d134-4138-a99f-9146e71f32eb","Type":"ContainerDied","Data":"a1215fe85db004fd22ba5820e31254cfc578851a5d646c3efb42cbc5ccc5cd68"}
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.648753 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1215fe85db004fd22ba5820e31254cfc578851a5d646c3efb42cbc5ccc5cd68"
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.648922 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-hkskb"
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.659030 4743 generic.go:334] "Generic (PLEG): container finished" podID="1c8fd004-0cbe-4a32-87cf-d199a7f39716" containerID="a57450da07eb6f8247c0be6c280bf1d2154596fb64d6bf9c5fc0fba236baa1c3" exitCode=0
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.659303 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" event={"ID":"1c8fd004-0cbe-4a32-87cf-d199a7f39716","Type":"ContainerDied","Data":"a57450da07eb6f8247c0be6c280bf1d2154596fb64d6bf9c5fc0fba236baa1c3"}
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.707194 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-dvstw"
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.808212 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.808449 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d83da6b0-e31f-4af6-934b-bfa046b49d20" containerName="nova-api-log" containerID="cri-o://f437f9d019295d0f78778e5e7033bf6f20cb2b01e6b07ff80c52b784e7ef5faa" gracePeriod=30
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.808596 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d83da6b0-e31f-4af6-934b-bfa046b49d20" containerName="nova-api-api" containerID="cri-o://f6a47a6042cdcc435e43f807a46da87bc4a2b5807e85c840896d368eabd6740d" gracePeriod=30
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.828073 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.828319 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e7ef7cdc-68f1-4031-a5ef-66a910c50764" containerName="nova-scheduler-scheduler" containerID="cri-o://38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a" gracePeriod=30
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.845460 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.848665 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-metadata" containerID="cri-o://80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d" gracePeriod=30
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.848627 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-log" containerID="cri-o://c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8" gracePeriod=30
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.856964 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-nb\") pod \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") "
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.857019 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-swift-storage-0\") pod \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") "
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.857127 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p85l6\" (UniqueName: \"kubernetes.io/projected/1c8fd004-0cbe-4a32-87cf-d199a7f39716-kube-api-access-p85l6\") pod \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") "
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.857298 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-config\") pod \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") "
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.857321 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-svc\") pod \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") "
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.857363 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-sb\") pod \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\" (UID: \"1c8fd004-0cbe-4a32-87cf-d199a7f39716\") "
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.864778 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c8fd004-0cbe-4a32-87cf-d199a7f39716-kube-api-access-p85l6" (OuterVolumeSpecName: "kube-api-access-p85l6") pod "1c8fd004-0cbe-4a32-87cf-d199a7f39716" (UID: "1c8fd004-0cbe-4a32-87cf-d199a7f39716"). InnerVolumeSpecName "kube-api-access-p85l6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.959683 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p85l6\" (UniqueName: \"kubernetes.io/projected/1c8fd004-0cbe-4a32-87cf-d199a7f39716-kube-api-access-p85l6\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:02 crc kubenswrapper[4743]: I1122 08:45:02.999513 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1c8fd004-0cbe-4a32-87cf-d199a7f39716" (UID: "1c8fd004-0cbe-4a32-87cf-d199a7f39716"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.022225 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1c8fd004-0cbe-4a32-87cf-d199a7f39716" (UID: "1c8fd004-0cbe-4a32-87cf-d199a7f39716"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.039975 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1c8fd004-0cbe-4a32-87cf-d199a7f39716" (UID: "1c8fd004-0cbe-4a32-87cf-d199a7f39716"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.040787 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.049983 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1c8fd004-0cbe-4a32-87cf-d199a7f39716" (UID: "1c8fd004-0cbe-4a32-87cf-d199a7f39716"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.061328 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.061369 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.061381 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.061393 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.061993 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-config" (OuterVolumeSpecName: "config") pod "1c8fd004-0cbe-4a32-87cf-d199a7f39716" (UID: "1c8fd004-0cbe-4a32-87cf-d199a7f39716"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.162411 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd54a052-ecbb-4344-9f1e-323a7fbf034b-config-volume\") pod \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") "
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.162602 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq7ld\" (UniqueName: \"kubernetes.io/projected/bd54a052-ecbb-4344-9f1e-323a7fbf034b-kube-api-access-cq7ld\") pod \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") "
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.162766 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd54a052-ecbb-4344-9f1e-323a7fbf034b-secret-volume\") pod \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\" (UID: \"bd54a052-ecbb-4344-9f1e-323a7fbf034b\") "
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.163495 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c8fd004-0cbe-4a32-87cf-d199a7f39716-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.164602 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd54a052-ecbb-4344-9f1e-323a7fbf034b-config-volume" (OuterVolumeSpecName: "config-volume") pod "bd54a052-ecbb-4344-9f1e-323a7fbf034b" (UID: "bd54a052-ecbb-4344-9f1e-323a7fbf034b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.171487 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd54a052-ecbb-4344-9f1e-323a7fbf034b-kube-api-access-cq7ld" (OuterVolumeSpecName: "kube-api-access-cq7ld") pod "bd54a052-ecbb-4344-9f1e-323a7fbf034b" (UID: "bd54a052-ecbb-4344-9f1e-323a7fbf034b"). InnerVolumeSpecName "kube-api-access-cq7ld". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.175394 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd54a052-ecbb-4344-9f1e-323a7fbf034b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bd54a052-ecbb-4344-9f1e-323a7fbf034b" (UID: "bd54a052-ecbb-4344-9f1e-323a7fbf034b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.265200 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bd54a052-ecbb-4344-9f1e-323a7fbf034b-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.265234 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bd54a052-ecbb-4344-9f1e-323a7fbf034b-config-volume\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.265246 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq7ld\" (UniqueName: \"kubernetes.io/projected/bd54a052-ecbb-4344-9f1e-323a7fbf034b-kube-api-access-cq7ld\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.669621 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.669644 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z" event={"ID":"bd54a052-ecbb-4344-9f1e-323a7fbf034b","Type":"ContainerDied","Data":"229579252cca2904015b5c99e1fc5e7496da4c143f0e4cb2dc505e336bf0164e"}
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.670080 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="229579252cca2904015b5c99e1fc5e7496da4c143f0e4cb2dc505e336bf0164e"
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.672518 4743 generic.go:334] "Generic (PLEG): container finished" podID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerID="c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8" exitCode=143
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.672602 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6837b2a8-dfb5-4277-87f4-483200d1ae93","Type":"ContainerDied","Data":"c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8"}
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.674755 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-dvstw" event={"ID":"1c8fd004-0cbe-4a32-87cf-d199a7f39716","Type":"ContainerDied","Data":"bea7ef7891652427b440b67220d6bc7805f0bde9438c6f42cfede0a7aab3e18d"}
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.674782 4743 scope.go:117] "RemoveContainer" containerID="a57450da07eb6f8247c0be6c280bf1d2154596fb64d6bf9c5fc0fba236baa1c3"
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.674912 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-dvstw"
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.683995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58b7a46d-98c7-4e9e-94df-80d359fd68c7","Type":"ContainerStarted","Data":"3a8d852d3f689474ae890d019ad247d38067084ae22ca4aca5bee80eb525bd90"}
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.684639 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.689365 4743 generic.go:334] "Generic (PLEG): container finished" podID="d83da6b0-e31f-4af6-934b-bfa046b49d20" containerID="f6a47a6042cdcc435e43f807a46da87bc4a2b5807e85c840896d368eabd6740d" exitCode=0
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.689390 4743 generic.go:334] "Generic (PLEG): container finished" podID="d83da6b0-e31f-4af6-934b-bfa046b49d20" containerID="f437f9d019295d0f78778e5e7033bf6f20cb2b01e6b07ff80c52b784e7ef5faa" exitCode=143
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.689407 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d83da6b0-e31f-4af6-934b-bfa046b49d20","Type":"ContainerDied","Data":"f6a47a6042cdcc435e43f807a46da87bc4a2b5807e85c840896d368eabd6740d"}
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.689426 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d83da6b0-e31f-4af6-934b-bfa046b49d20","Type":"ContainerDied","Data":"f437f9d019295d0f78778e5e7033bf6f20cb2b01e6b07ff80c52b784e7ef5faa"}
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.689439 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d83da6b0-e31f-4af6-934b-bfa046b49d20","Type":"ContainerDied","Data":"ff477c7fdd64c30529b506eb8b453644fb3b7eac4716a35f9ead2ae890768f89"}
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.689450 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff477c7fdd64c30529b506eb8b453644fb3b7eac4716a35f9ead2ae890768f89"
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.788400 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.800761 4743 scope.go:117] "RemoveContainer" containerID="d7c8d3c8e57a43c68920b28d1a4096bd93ba2a85d4e96a9e8a3c68b721f19852"
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.811013 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.488040262 podStartE2EDuration="5.810991602s" podCreationTimestamp="2025-11-22 08:44:58 +0000 UTC" firstStartedPulling="2025-11-22 08:44:59.589491809 +0000 UTC m=+1373.295852861" lastFinishedPulling="2025-11-22 08:45:02.912443149 +0000 UTC m=+1376.618804201" observedRunningTime="2025-11-22 08:45:03.708063365 +0000 UTC m=+1377.414424417" watchObservedRunningTime="2025-11-22 08:45:03.810991602 +0000 UTC m=+1377.517352654"
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.835758 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-dvstw"]
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.856785 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-dvstw"]
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.879589 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-combined-ca-bundle\") pod \"d83da6b0-e31f-4af6-934b-bfa046b49d20\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") "
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.879671 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvj29\" (UniqueName: \"kubernetes.io/projected/d83da6b0-e31f-4af6-934b-bfa046b49d20-kube-api-access-jvj29\") pod \"d83da6b0-e31f-4af6-934b-bfa046b49d20\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") "
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.879720 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-internal-tls-certs\") pod \"d83da6b0-e31f-4af6-934b-bfa046b49d20\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") "
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.879823 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d83da6b0-e31f-4af6-934b-bfa046b49d20-logs\") pod \"d83da6b0-e31f-4af6-934b-bfa046b49d20\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") "
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.879876 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-config-data\") pod \"d83da6b0-e31f-4af6-934b-bfa046b49d20\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") "
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.879898 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-public-tls-certs\") pod \"d83da6b0-e31f-4af6-934b-bfa046b49d20\" (UID: \"d83da6b0-e31f-4af6-934b-bfa046b49d20\") "
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.880377 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d83da6b0-e31f-4af6-934b-bfa046b49d20-logs" (OuterVolumeSpecName: "logs") pod "d83da6b0-e31f-4af6-934b-bfa046b49d20" (UID: "d83da6b0-e31f-4af6-934b-bfa046b49d20"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.885223 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d83da6b0-e31f-4af6-934b-bfa046b49d20-kube-api-access-jvj29" (OuterVolumeSpecName: "kube-api-access-jvj29") pod "d83da6b0-e31f-4af6-934b-bfa046b49d20" (UID: "d83da6b0-e31f-4af6-934b-bfa046b49d20"). InnerVolumeSpecName "kube-api-access-jvj29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.914152 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d83da6b0-e31f-4af6-934b-bfa046b49d20" (UID: "d83da6b0-e31f-4af6-934b-bfa046b49d20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.931898 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-config-data" (OuterVolumeSpecName: "config-data") pod "d83da6b0-e31f-4af6-934b-bfa046b49d20" (UID: "d83da6b0-e31f-4af6-934b-bfa046b49d20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.934478 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d83da6b0-e31f-4af6-934b-bfa046b49d20" (UID: "d83da6b0-e31f-4af6-934b-bfa046b49d20"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.958323 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d83da6b0-e31f-4af6-934b-bfa046b49d20" (UID: "d83da6b0-e31f-4af6-934b-bfa046b49d20"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.981647 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d83da6b0-e31f-4af6-934b-bfa046b49d20-logs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.981681 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.981691 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.981702 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.981711 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvj29\" (UniqueName: \"kubernetes.io/projected/d83da6b0-e31f-4af6-934b-bfa046b49d20-kube-api-access-jvj29\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:03 crc kubenswrapper[4743]: I1122 08:45:03.981719 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d83da6b0-e31f-4af6-934b-bfa046b49d20-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.545113 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.698722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6h84f\" (UniqueName: \"kubernetes.io/projected/e7ef7cdc-68f1-4031-a5ef-66a910c50764-kube-api-access-6h84f\") pod \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") "
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.698938 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-combined-ca-bundle\") pod \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") "
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.699038 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-config-data\") pod \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\" (UID: \"e7ef7cdc-68f1-4031-a5ef-66a910c50764\") "
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.702307 4743 generic.go:334] "Generic (PLEG): container finished" podID="e7ef7cdc-68f1-4031-a5ef-66a910c50764" containerID="38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a" exitCode=0
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.702465 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.704298 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7ef7cdc-68f1-4031-a5ef-66a910c50764","Type":"ContainerDied","Data":"38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a"}
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.704606 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e7ef7cdc-68f1-4031-a5ef-66a910c50764","Type":"ContainerDied","Data":"9d9c461cd97b5c37192ca3c98c9d58f016c91ceb2d9f2433d3cbf903b42f2eaf"}
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.704632 4743 scope.go:117] "RemoveContainer" containerID="38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a"
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.704717 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ef7cdc-68f1-4031-a5ef-66a910c50764-kube-api-access-6h84f" (OuterVolumeSpecName: "kube-api-access-6h84f") pod "e7ef7cdc-68f1-4031-a5ef-66a910c50764" (UID: "e7ef7cdc-68f1-4031-a5ef-66a910c50764"). InnerVolumeSpecName "kube-api-access-6h84f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.704807 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.733712 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-config-data" (OuterVolumeSpecName: "config-data") pod "e7ef7cdc-68f1-4031-a5ef-66a910c50764" (UID: "e7ef7cdc-68f1-4031-a5ef-66a910c50764"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.751688 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7ef7cdc-68f1-4031-a5ef-66a910c50764" (UID: "e7ef7cdc-68f1-4031-a5ef-66a910c50764"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.801298 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.801326 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7ef7cdc-68f1-4031-a5ef-66a910c50764-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.801334 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6h84f\" (UniqueName: \"kubernetes.io/projected/e7ef7cdc-68f1-4031-a5ef-66a910c50764-kube-api-access-6h84f\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.822490 4743 scope.go:117] "RemoveContainer" containerID="38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a" Nov 22 08:45:04 crc kubenswrapper[4743]: E1122 08:45:04.823035 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a\": container with ID starting with 38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a not found: ID does not exist" containerID="38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.823150 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a"} err="failed to get container status \"38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a\": rpc error: code = NotFound desc = could not find container \"38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a\": container with ID starting with 38a720f9d4d8e72bf1d620cc73a358a9afa7ad05f1f4720b314ae5ce250d059a not found: ID does not exist" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.825624 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.839960 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.849366 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 08:45:04 crc kubenswrapper[4743]: E1122 08:45:04.849832 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d83da6b0-e31f-4af6-934b-bfa046b49d20" containerName="nova-api-api" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.849861 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d83da6b0-e31f-4af6-934b-bfa046b49d20" containerName="nova-api-api" Nov 22 08:45:04 crc kubenswrapper[4743]: E1122 08:45:04.849884 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d83da6b0-e31f-4af6-934b-bfa046b49d20" containerName="nova-api-log" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.849895 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d83da6b0-e31f-4af6-934b-bfa046b49d20" containerName="nova-api-log" Nov 22 08:45:04 crc kubenswrapper[4743]: E1122 08:45:04.849926 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8fd004-0cbe-4a32-87cf-d199a7f39716" containerName="init" Nov 22 08:45:04 crc 
kubenswrapper[4743]: I1122 08:45:04.849934 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8fd004-0cbe-4a32-87cf-d199a7f39716" containerName="init" Nov 22 08:45:04 crc kubenswrapper[4743]: E1122 08:45:04.849941 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ef7cdc-68f1-4031-a5ef-66a910c50764" containerName="nova-scheduler-scheduler" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.849949 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ef7cdc-68f1-4031-a5ef-66a910c50764" containerName="nova-scheduler-scheduler" Nov 22 08:45:04 crc kubenswrapper[4743]: E1122 08:45:04.849975 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="529310a2-d134-4138-a99f-9146e71f32eb" containerName="nova-manage" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.849983 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="529310a2-d134-4138-a99f-9146e71f32eb" containerName="nova-manage" Nov 22 08:45:04 crc kubenswrapper[4743]: E1122 08:45:04.849994 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd54a052-ecbb-4344-9f1e-323a7fbf034b" containerName="collect-profiles" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.850001 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd54a052-ecbb-4344-9f1e-323a7fbf034b" containerName="collect-profiles" Nov 22 08:45:04 crc kubenswrapper[4743]: E1122 08:45:04.850010 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c8fd004-0cbe-4a32-87cf-d199a7f39716" containerName="dnsmasq-dns" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.850017 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c8fd004-0cbe-4a32-87cf-d199a7f39716" containerName="dnsmasq-dns" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.850233 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ef7cdc-68f1-4031-a5ef-66a910c50764" containerName="nova-scheduler-scheduler" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.850248 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c8fd004-0cbe-4a32-87cf-d199a7f39716" containerName="dnsmasq-dns" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.850262 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd54a052-ecbb-4344-9f1e-323a7fbf034b" containerName="collect-profiles" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.850287 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="529310a2-d134-4138-a99f-9146e71f32eb" containerName="nova-manage" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.850309 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d83da6b0-e31f-4af6-934b-bfa046b49d20" containerName="nova-api-log" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.850319 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d83da6b0-e31f-4af6-934b-bfa046b49d20" containerName="nova-api-api" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.859932 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.861303 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.863988 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.864032 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Nov 22 08:45:04 crc kubenswrapper[4743]: I1122 08:45:04.864157 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.004211 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng59n\" (UniqueName: \"kubernetes.io/projected/db905ec2-675e-48ea-a051-ed3d78c35797-kube-api-access-ng59n\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.004265 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-internal-tls-certs\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.004608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-config-data\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.004868 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-public-tls-certs\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.004923 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db905ec2-675e-48ea-a051-ed3d78c35797-logs\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.004978 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.037347 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.054766 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.094270 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.099785 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.102340 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.106674 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-config-data\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.106829 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-public-tls-certs\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.106873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db905ec2-675e-48ea-a051-ed3d78c35797-logs\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.106907 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.107525 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng59n\" (UniqueName: \"kubernetes.io/projected/db905ec2-675e-48ea-a051-ed3d78c35797-kube-api-access-ng59n\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.107423 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db905ec2-675e-48ea-a051-ed3d78c35797-logs\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.107566 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-internal-tls-certs\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.107708 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.110993 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-public-tls-certs\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.114847 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-config-data\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 
08:45:05.115833 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.111603 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-internal-tls-certs\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.125264 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng59n\" (UniqueName: \"kubernetes.io/projected/db905ec2-675e-48ea-a051-ed3d78c35797-kube-api-access-ng59n\") pod \"nova-api-0\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.161261 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c8fd004-0cbe-4a32-87cf-d199a7f39716" path="/var/lib/kubelet/pods/1c8fd004-0cbe-4a32-87cf-d199a7f39716/volumes" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.162078 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d83da6b0-e31f-4af6-934b-bfa046b49d20" path="/var/lib/kubelet/pods/d83da6b0-e31f-4af6-934b-bfa046b49d20/volumes" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.162732 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7ef7cdc-68f1-4031-a5ef-66a910c50764" path="/var/lib/kubelet/pods/e7ef7cdc-68f1-4031-a5ef-66a910c50764/volumes" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.178822 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.209859 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.210072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-config-data\") pod \"nova-scheduler-0\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.210274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7p5b\" (UniqueName: \"kubernetes.io/projected/da8955a2-6deb-440c-97e3-f2420aa5fae8-kube-api-access-r7p5b\") pod \"nova-scheduler-0\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.311837 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-config-data\") pod \"nova-scheduler-0\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.311954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7p5b\" (UniqueName: \"kubernetes.io/projected/da8955a2-6deb-440c-97e3-f2420aa5fae8-kube-api-access-r7p5b\") pod \"nova-scheduler-0\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.312023 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.317251 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-config-data\") pod \"nova-scheduler-0\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.322121 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.334473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7p5b\" (UniqueName: \"kubernetes.io/projected/da8955a2-6deb-440c-97e3-f2420aa5fae8-kube-api-access-r7p5b\") pod \"nova-scheduler-0\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.420075 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.668564 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:45:05 crc kubenswrapper[4743]: W1122 08:45:05.676306 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb905ec2_675e_48ea_a051_ed3d78c35797.slice/crio-223b383c0ac8b350c1df0ab0af665d4f99d8904987d1963f07e7f334e3a37d46 WatchSource:0}: Error finding container 223b383c0ac8b350c1df0ab0af665d4f99d8904987d1963f07e7f334e3a37d46: Status 404 returned error can't find the container with id 223b383c0ac8b350c1df0ab0af665d4f99d8904987d1963f07e7f334e3a37d46 Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.717715 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db905ec2-675e-48ea-a051-ed3d78c35797","Type":"ContainerStarted","Data":"223b383c0ac8b350c1df0ab0af665d4f99d8904987d1963f07e7f334e3a37d46"} Nov 22 08:45:05 crc kubenswrapper[4743]: W1122 08:45:05.882534 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda8955a2_6deb_440c_97e3_f2420aa5fae8.slice/crio-66d5cb532dcc036751a45bee87df23976c1ef63228dbdb8d9b4db6f138df1e7a WatchSource:0}: Error finding container 66d5cb532dcc036751a45bee87df23976c1ef63228dbdb8d9b4db6f138df1e7a: Status 404 returned error can't find the container with id 66d5cb532dcc036751a45bee87df23976c1ef63228dbdb8d9b4db6f138df1e7a Nov 22 08:45:05 crc kubenswrapper[4743]: I1122 08:45:05.882666 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.017524 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": read tcp 10.217.0.2:50760->10.217.0.188:8775: read: connection reset by peer" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.017531 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.188:8775/\": read tcp 10.217.0.2:50764->10.217.0.188:8775: read: connection reset by peer" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.473784 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.543101 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-combined-ca-bundle\") pod \"6837b2a8-dfb5-4277-87f4-483200d1ae93\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.543253 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6837b2a8-dfb5-4277-87f4-483200d1ae93-logs\") pod \"6837b2a8-dfb5-4277-87f4-483200d1ae93\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.543898 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-nova-metadata-tls-certs\") pod \"6837b2a8-dfb5-4277-87f4-483200d1ae93\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.543922 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-config-data\") pod \"6837b2a8-dfb5-4277-87f4-483200d1ae93\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.543949 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtqp8\" (UniqueName: \"kubernetes.io/projected/6837b2a8-dfb5-4277-87f4-483200d1ae93-kube-api-access-mtqp8\") pod \"6837b2a8-dfb5-4277-87f4-483200d1ae93\" (UID: \"6837b2a8-dfb5-4277-87f4-483200d1ae93\") " Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.544751 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6837b2a8-dfb5-4277-87f4-483200d1ae93-logs" (OuterVolumeSpecName: "logs") pod "6837b2a8-dfb5-4277-87f4-483200d1ae93" (UID: "6837b2a8-dfb5-4277-87f4-483200d1ae93"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.566614 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6837b2a8-dfb5-4277-87f4-483200d1ae93-kube-api-access-mtqp8" (OuterVolumeSpecName: "kube-api-access-mtqp8") pod "6837b2a8-dfb5-4277-87f4-483200d1ae93" (UID: "6837b2a8-dfb5-4277-87f4-483200d1ae93"). InnerVolumeSpecName "kube-api-access-mtqp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.579462 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-config-data" (OuterVolumeSpecName: "config-data") pod "6837b2a8-dfb5-4277-87f4-483200d1ae93" (UID: "6837b2a8-dfb5-4277-87f4-483200d1ae93"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.596808 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6837b2a8-dfb5-4277-87f4-483200d1ae93" (UID: "6837b2a8-dfb5-4277-87f4-483200d1ae93"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.618089 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6837b2a8-dfb5-4277-87f4-483200d1ae93" (UID: "6837b2a8-dfb5-4277-87f4-483200d1ae93"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.646764 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.646813 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6837b2a8-dfb5-4277-87f4-483200d1ae93-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.646827 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.646841 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6837b2a8-dfb5-4277-87f4-483200d1ae93-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.646853 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtqp8\" (UniqueName: \"kubernetes.io/projected/6837b2a8-dfb5-4277-87f4-483200d1ae93-kube-api-access-mtqp8\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.729297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da8955a2-6deb-440c-97e3-f2420aa5fae8","Type":"ContainerStarted","Data":"a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722"} Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.729341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da8955a2-6deb-440c-97e3-f2420aa5fae8","Type":"ContainerStarted","Data":"66d5cb532dcc036751a45bee87df23976c1ef63228dbdb8d9b4db6f138df1e7a"} Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.731373 4743 generic.go:334] "Generic (PLEG): container finished" podID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerID="80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d" exitCode=0 Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.731417 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6837b2a8-dfb5-4277-87f4-483200d1ae93","Type":"ContainerDied","Data":"80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d"} Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.731436 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6837b2a8-dfb5-4277-87f4-483200d1ae93","Type":"ContainerDied","Data":"e00470aee988549c68eb3a6103d0e38d26e77a855268221a4925e5293b8785ae"} Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.731452 4743 scope.go:117] "RemoveContainer" containerID="80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 
08:45:06.731558 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.735641 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db905ec2-675e-48ea-a051-ed3d78c35797","Type":"ContainerStarted","Data":"1cfb29dd7e0a21897754c302d7a14c2ab839c36f149cccd25dabc107f11f9bed"} Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.735675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db905ec2-675e-48ea-a051-ed3d78c35797","Type":"ContainerStarted","Data":"928c4075312b7232f7da07b77b7db3aff8ceda2f473954fe371d1b407a38a03a"} Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.755322 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.755303648 podStartE2EDuration="1.755303648s" podCreationTimestamp="2025-11-22 08:45:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:45:06.751312443 +0000 UTC m=+1380.457673525" watchObservedRunningTime="2025-11-22 08:45:06.755303648 +0000 UTC m=+1380.461664700" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.766826 4743 scope.go:117] "RemoveContainer" containerID="c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.800556 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.800535912 podStartE2EDuration="2.800535912s" podCreationTimestamp="2025-11-22 08:45:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:45:06.783045738 +0000 UTC m=+1380.489406820" watchObservedRunningTime="2025-11-22 08:45:06.800535912 +0000 UTC m=+1380.506896964" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.812648 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.820480 4743 scope.go:117] "RemoveContainer" containerID="80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d" Nov 22 08:45:06 crc kubenswrapper[4743]: E1122 08:45:06.821011 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d\": container with ID starting with 80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d not found: ID does not exist" containerID="80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.821064 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d"} err="failed to get container status \"80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d\": rpc error: code = NotFound desc = could not find container \"80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d\": container with ID starting with 80fd92c5924a19c64e297bad00dd3020de1b9eb02369c8a7a7503b2d0c38ab0d not found: ID does not exist" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.821082 4743 scope.go:117] "RemoveContainer" 
containerID="c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8" Nov 22 08:45:06 crc kubenswrapper[4743]: E1122 08:45:06.821387 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8\": container with ID starting with c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8 not found: ID does not exist" containerID="c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.821406 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8"} err="failed to get container status \"c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8\": rpc error: code = NotFound desc = could not find container \"c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8\": container with ID starting with c8ea9f58f249d98854293f112948d2ed6136af2514c6a7e31b851e831b3068e8 not found: ID does not exist" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.826280 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.864643 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:45:06 crc kubenswrapper[4743]: E1122 08:45:06.865209 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-log" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.865232 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-log" Nov 22 08:45:06 crc kubenswrapper[4743]: E1122 08:45:06.865275 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-metadata" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.865282 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-metadata" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.865609 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-metadata" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.865641 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" containerName="nova-metadata-log" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.867089 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.870852 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.872980 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.873177 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.958306 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.958435 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-config-data\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.958488 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g68lk\" (UniqueName: \"kubernetes.io/projected/2b24dd85-d686-4fb0-be74-7aca0b03255c-kube-api-access-g68lk\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.958513 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b24dd85-d686-4fb0-be74-7aca0b03255c-logs\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:06 crc kubenswrapper[4743]: I1122 08:45:06.958533 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.060148 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g68lk\" (UniqueName: \"kubernetes.io/projected/2b24dd85-d686-4fb0-be74-7aca0b03255c-kube-api-access-g68lk\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.060198 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b24dd85-d686-4fb0-be74-7aca0b03255c-logs\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.060219 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " 
pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.060273 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.060381 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-config-data\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.060808 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b24dd85-d686-4fb0-be74-7aca0b03255c-logs\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.064743 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.065051 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-config-data\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.065498 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.080421 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g68lk\" (UniqueName: \"kubernetes.io/projected/2b24dd85-d686-4fb0-be74-7aca0b03255c-kube-api-access-g68lk\") pod \"nova-metadata-0\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.162085 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6837b2a8-dfb5-4277-87f4-483200d1ae93" path="/var/lib/kubelet/pods/6837b2a8-dfb5-4277-87f4-483200d1ae93/volumes" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.200486 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.640128 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:45:07 crc kubenswrapper[4743]: W1122 08:45:07.644259 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b24dd85_d686_4fb0_be74_7aca0b03255c.slice/crio-8f61063094b3f711e1b7cccbf81e7095d03423e00dea436112bcaae08b1a86c9 WatchSource:0}: Error finding container 8f61063094b3f711e1b7cccbf81e7095d03423e00dea436112bcaae08b1a86c9: Status 404 returned error can't find the container with id 8f61063094b3f711e1b7cccbf81e7095d03423e00dea436112bcaae08b1a86c9 Nov 22 08:45:07 crc kubenswrapper[4743]: I1122 08:45:07.747333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b24dd85-d686-4fb0-be74-7aca0b03255c","Type":"ContainerStarted","Data":"8f61063094b3f711e1b7cccbf81e7095d03423e00dea436112bcaae08b1a86c9"} Nov 22 08:45:08 crc kubenswrapper[4743]: I1122 08:45:08.760885 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b24dd85-d686-4fb0-be74-7aca0b03255c","Type":"ContainerStarted","Data":"31240114f37ac66a6ac0ee75966656d89b98f5b65714a820dfeae421393b0b13"} Nov 22 08:45:08 crc kubenswrapper[4743]: I1122 08:45:08.761371 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b24dd85-d686-4fb0-be74-7aca0b03255c","Type":"ContainerStarted","Data":"4b24ccee2f20c0c9bff9ed0577c5b13a5e4c322c8c14f5cae7487c6ed9272a36"} Nov 22 08:45:08 crc kubenswrapper[4743]: I1122 08:45:08.793289 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.793269197 podStartE2EDuration="2.793269197s" podCreationTimestamp="2025-11-22 08:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:45:08.787742918 +0000 UTC m=+1382.494103990" watchObservedRunningTime="2025-11-22 08:45:08.793269197 +0000 UTC m=+1382.499630249" Nov 22 08:45:10 crc kubenswrapper[4743]: I1122 08:45:10.420513 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 08:45:12 crc kubenswrapper[4743]: I1122 08:45:12.200730 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 08:45:12 crc kubenswrapper[4743]: I1122 08:45:12.201042 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 08:45:15 crc kubenswrapper[4743]: I1122 08:45:15.179530 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 08:45:15 crc kubenswrapper[4743]: I1122 08:45:15.179926 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 08:45:15 crc kubenswrapper[4743]: I1122 08:45:15.420303 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 08:45:15 crc kubenswrapper[4743]: I1122 08:45:15.478162 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 08:45:15 crc kubenswrapper[4743]: I1122 08:45:15.858687 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 
22 08:45:16 crc kubenswrapper[4743]: I1122 08:45:16.196059 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 08:45:16 crc kubenswrapper[4743]: I1122 08:45:16.196065 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.200:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 08:45:17 crc kubenswrapper[4743]: I1122 08:45:17.200749 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 08:45:17 crc kubenswrapper[4743]: I1122 08:45:17.200835 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 08:45:18 crc kubenswrapper[4743]: I1122 08:45:18.215831 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 08:45:18 crc kubenswrapper[4743]: I1122 08:45:18.216274 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Nov 22 08:45:25 crc kubenswrapper[4743]: I1122 08:45:25.185357 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 08:45:25 crc kubenswrapper[4743]: I1122 08:45:25.187262 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 08:45:25 crc kubenswrapper[4743]: I1122 08:45:25.187629 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 08:45:25 crc kubenswrapper[4743]: I1122 08:45:25.187688 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 08:45:25 crc kubenswrapper[4743]: I1122 08:45:25.194860 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 08:45:25 crc kubenswrapper[4743]: I1122 08:45:25.195289 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 08:45:27 crc kubenswrapper[4743]: I1122 08:45:27.207261 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 08:45:27 crc kubenswrapper[4743]: I1122 08:45:27.207748 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 08:45:27 crc kubenswrapper[4743]: I1122 08:45:27.213144 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 08:45:27 crc kubenswrapper[4743]: I1122 08:45:27.213775 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 08:45:29 crc kubenswrapper[4743]: I1122 08:45:29.049519 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 08:45:31 crc kubenswrapper[4743]: I1122 08:45:31.241587 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:45:31 crc kubenswrapper[4743]: I1122 08:45:31.241955 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.351906 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.352810 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="5987ad61-2878-4efc-98ca-ea29b123f26e" containerName="openstackclient" containerID="cri-o://46baaf42142233869f49a3ed3725aeb263cb2291e27ce1211801aed7212ad955" gracePeriod=2 Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.360075 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.628031 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.684310 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.685069 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerName="openstack-network-exporter" containerID="cri-o://01419810dd33722ab918d36dfaf000ac018b6b763b7e90647fea3f7eed2c7509" gracePeriod=300 Nov 22 08:45:50 crc kubenswrapper[4743]: E1122 08:45:50.697104 4743 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 22 08:45:50 crc kubenswrapper[4743]: E1122 08:45:50.697200 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data podName:ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1 nodeName:}" failed. No retries permitted until 2025-11-22 08:45:51.197172251 +0000 UTC m=+1424.903533383 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data") pod "rabbitmq-cell1-server-0" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1") : configmap "rabbitmq-cell1-config-data" not found Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.800327 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder89a7-account-delete-gjvvg"] Nov 22 08:45:50 crc kubenswrapper[4743]: E1122 08:45:50.812289 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5987ad61-2878-4efc-98ca-ea29b123f26e" containerName="openstackclient" Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.812326 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5987ad61-2878-4efc-98ca-ea29b123f26e" containerName="openstackclient" Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.815046 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5987ad61-2878-4efc-98ca-ea29b123f26e" containerName="openstackclient" Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.824408 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder89a7-account-delete-gjvvg" Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.906835 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7936e330-2138-4624-b319-902f6a4941ec-operator-scripts\") pod \"cinder89a7-account-delete-gjvvg\" (UID: \"7936e330-2138-4624-b319-902f6a4941ec\") " pod="openstack/cinder89a7-account-delete-gjvvg" Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.906942 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-459l2\" (UniqueName: \"kubernetes.io/projected/7936e330-2138-4624-b319-902f6a4941ec-kube-api-access-459l2\") pod \"cinder89a7-account-delete-gjvvg\" (UID: \"7936e330-2138-4624-b319-902f6a4941ec\") " pod="openstack/cinder89a7-account-delete-gjvvg" Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.957836 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder89a7-account-delete-gjvvg"] Nov 22 08:45:50 crc kubenswrapper[4743]: I1122 08:45:50.995074 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.005986 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement0984-account-delete-82zvj"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.007425 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement0984-account-delete-82zvj" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.015011 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7936e330-2138-4624-b319-902f6a4941ec-operator-scripts\") pod \"cinder89a7-account-delete-gjvvg\" (UID: \"7936e330-2138-4624-b319-902f6a4941ec\") " pod="openstack/cinder89a7-account-delete-gjvvg" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.015125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-459l2\" (UniqueName: \"kubernetes.io/projected/7936e330-2138-4624-b319-902f6a4941ec-kube-api-access-459l2\") pod \"cinder89a7-account-delete-gjvvg\" (UID: \"7936e330-2138-4624-b319-902f6a4941ec\") " pod="openstack/cinder89a7-account-delete-gjvvg" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.016325 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7936e330-2138-4624-b319-902f6a4941ec-operator-scripts\") pod \"cinder89a7-account-delete-gjvvg\" (UID: \"7936e330-2138-4624-b319-902f6a4941ec\") " pod="openstack/cinder89a7-account-delete-gjvvg" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.017810 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican675b-account-delete-bdj7v"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.019291 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican675b-account-delete-bdj7v" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.076768 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.077073 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="ovn-northd" containerID="cri-o://44e22b0e556cf479c4ab148fe02b8b602f8d6a658164bfd210e15bbe9a5c9282" gracePeriod=30 Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.077851 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="openstack-network-exporter" containerID="cri-o://1f420d1e2699e276d82c94d18dd411a4b04324712350d57b7ef8e6cdb952414a" gracePeriod=30 Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.109624 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican675b-account-delete-bdj7v"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.121157 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5310c975-ef7b-4161-ab2e-5ee94b709f9d-operator-scripts\") pod \"barbican675b-account-delete-bdj7v\" (UID: \"5310c975-ef7b-4161-ab2e-5ee94b709f9d\") " pod="openstack/barbican675b-account-delete-bdj7v" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.121241 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts\") pod \"placement0984-account-delete-82zvj\" (UID: \"9375da2b-3776-4c32-8afd-d1ed7b22b308\") " pod="openstack/placement0984-account-delete-82zvj" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 
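The durationBeforeRetry values in the nestedpendingoperations errors above (500ms here, growing to 1s and 2s further down) show the kubelet backing off exponentially between MountVolume.SetUp attempts while the referenced ConfigMaps are missing. A minimal Go sketch of that doubling pattern; the initial delay and the cap are illustrative assumptions, not kubelet's actual tuning:

package main

import (
	"fmt"
	"time"
)

// retryDelay doubles the wait after every failed attempt, up to a limit,
// mirroring the 500ms -> 1s -> 2s progression visible in the log entries.
func retryDelay(initial, limit time.Duration, failures int) time.Duration {
	d := initial
	for i := 0; i < failures; i++ {
		d *= 2
		if d > limit {
			return limit
		}
	}
	return d
}

func main() {
	for n := 0; n < 5; n++ {
		fmt.Printf("failure %d -> wait %v\n", n, retryDelay(500*time.Millisecond, 2*time.Minute, n))
	}
}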
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.121272 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8tq9\" (UniqueName: \"kubernetes.io/projected/9375da2b-3776-4c32-8afd-d1ed7b22b308-kube-api-access-g8tq9\") pod \"placement0984-account-delete-82zvj\" (UID: \"9375da2b-3776-4c32-8afd-d1ed7b22b308\") " pod="openstack/placement0984-account-delete-82zvj"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.121553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlsc6\" (UniqueName: \"kubernetes.io/projected/5310c975-ef7b-4161-ab2e-5ee94b709f9d-kube-api-access-dlsc6\") pod \"barbican675b-account-delete-bdj7v\" (UID: \"5310c975-ef7b-4161-ab2e-5ee94b709f9d\") " pod="openstack/barbican675b-account-delete-bdj7v"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.153272 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-459l2\" (UniqueName: \"kubernetes.io/projected/7936e330-2138-4624-b319-902f6a4941ec-kube-api-access-459l2\") pod \"cinder89a7-account-delete-gjvvg\" (UID: \"7936e330-2138-4624-b319-902f6a4941ec\") " pod="openstack/cinder89a7-account-delete-gjvvg"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.174811 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerName="ovsdbserver-nb" containerID="cri-o://85ca3e549dbcf99a2f9f8ce67ee485a73244e79666976bd6f0f2fe904f8d3d50" gracePeriod=300
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.184089 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder89a7-account-delete-gjvvg"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.210616 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-sj8hg"]
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.228001 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlsc6\" (UniqueName: \"kubernetes.io/projected/5310c975-ef7b-4161-ab2e-5ee94b709f9d-kube-api-access-dlsc6\") pod \"barbican675b-account-delete-bdj7v\" (UID: \"5310c975-ef7b-4161-ab2e-5ee94b709f9d\") " pod="openstack/barbican675b-account-delete-bdj7v"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.228450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5310c975-ef7b-4161-ab2e-5ee94b709f9d-operator-scripts\") pod \"barbican675b-account-delete-bdj7v\" (UID: \"5310c975-ef7b-4161-ab2e-5ee94b709f9d\") " pod="openstack/barbican675b-account-delete-bdj7v"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.228496 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts\") pod \"placement0984-account-delete-82zvj\" (UID: \"9375da2b-3776-4c32-8afd-d1ed7b22b308\") " pod="openstack/placement0984-account-delete-82zvj"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.228520 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8tq9\" (UniqueName: \"kubernetes.io/projected/9375da2b-3776-4c32-8afd-d1ed7b22b308-kube-api-access-g8tq9\") pod \"placement0984-account-delete-82zvj\" (UID: \"9375da2b-3776-4c32-8afd-d1ed7b22b308\") " pod="openstack/placement0984-account-delete-82zvj"
Nov 22 08:45:51 crc kubenswrapper[4743]: E1122 08:45:51.230028 4743 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Nov 22 08:45:51 crc kubenswrapper[4743]: E1122 08:45:51.230096 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data podName:e5fac46a-545d-4f30-a7ab-8f5e713e934d nodeName:}" failed. No retries permitted until 2025-11-22 08:45:51.730076253 +0000 UTC m=+1425.436437385 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data") pod "rabbitmq-server-0" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d") : configmap "rabbitmq-config-data" not found
Nov 22 08:45:51 crc kubenswrapper[4743]: E1122 08:45:51.230263 4743 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Nov 22 08:45:51 crc kubenswrapper[4743]: E1122 08:45:51.230295 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data podName:ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1 nodeName:}" failed. No retries permitted until 2025-11-22 08:45:52.230284839 +0000 UTC m=+1425.936645961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data") pod "rabbitmq-cell1-server-0" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1") : configmap "rabbitmq-cell1-config-data" not found
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.231719 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts\") pod \"placement0984-account-delete-82zvj\" (UID: \"9375da2b-3776-4c32-8afd-d1ed7b22b308\") " pod="openstack/placement0984-account-delete-82zvj"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.239060 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-sj8hg"]
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.239068 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5310c975-ef7b-4161-ab2e-5ee94b709f9d-operator-scripts\") pod \"barbican675b-account-delete-bdj7v\" (UID: \"5310c975-ef7b-4161-ab2e-5ee94b709f9d\") " pod="openstack/barbican675b-account-delete-bdj7v"
Nov 22 08:45:51 crc kubenswrapper[4743]: E1122 08:45:51.245263 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: EOF, stdout: , stderr: , exit code -1" containerID="85ca3e549dbcf99a2f9f8ce67ee485a73244e79666976bd6f0f2fe904f8d3d50" cmd=["/usr/bin/pidof","ovsdb-server"]
Nov 22 08:45:51 crc kubenswrapper[4743]: E1122 08:45:51.250800 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="85ca3e549dbcf99a2f9f8ce67ee485a73244e79666976bd6f0f2fe904f8d3d50" cmd=["/usr/bin/pidof","ovsdb-server"]
Nov 22 08:45:51 crc kubenswrapper[4743]: E1122 08:45:51.253003 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="85ca3e549dbcf99a2f9f8ce67ee485a73244e79666976bd6f0f2fe904f8d3d50" cmd=["/usr/bin/pidof","ovsdb-server"]
Nov 22 08:45:51 crc kubenswrapper[4743]: E1122 08:45:51.253048 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerName="ovsdbserver-nb"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.275970 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement0984-account-delete-82zvj"]
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.311317 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron8dc4-account-delete-rtl4b"]
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.317407 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron8dc4-account-delete-rtl4b"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.338622 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron8dc4-account-delete-rtl4b"]
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.358738 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance4a60-account-delete-j4cg4"]
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.360734 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4a60-account-delete-j4cg4"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.381807 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlsc6\" (UniqueName: \"kubernetes.io/projected/5310c975-ef7b-4161-ab2e-5ee94b709f9d-kube-api-access-dlsc6\") pod \"barbican675b-account-delete-bdj7v\" (UID: \"5310c975-ef7b-4161-ab2e-5ee94b709f9d\") " pod="openstack/barbican675b-account-delete-bdj7v"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.399177 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8tq9\" (UniqueName: \"kubernetes.io/projected/9375da2b-3776-4c32-8afd-d1ed7b22b308-kube-api-access-g8tq9\") pod \"placement0984-account-delete-82zvj\" (UID: \"9375da2b-3776-4c32-8afd-d1ed7b22b308\") " pod="openstack/placement0984-account-delete-82zvj"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.402633 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance4a60-account-delete-j4cg4"]
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.441149 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdrrv\" (UniqueName: \"kubernetes.io/projected/f6d1b00d-147b-4865-b659-59d06f360797-kube-api-access-qdrrv\") pod \"neutron8dc4-account-delete-rtl4b\" (UID: \"f6d1b00d-147b-4865-b659-59d06f360797\") " pod="openstack/neutron8dc4-account-delete-rtl4b"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.441243 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48bbac5-2782-4c1e-b74b-520f0457f9ac-operator-scripts\") pod \"glance4a60-account-delete-j4cg4\" (UID: \"f48bbac5-2782-4c1e-b74b-520f0457f9ac\") " pod="openstack/glance4a60-account-delete-j4cg4"
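The ExecSync failures above are the kubelet driving the ovsdbserver-nb readiness probe command (/usr/bin/pidof ovsdb-server) through the CRI while CRI-O is already tearing the container down, so the runtime refuses to register a new exec PID. A sketch of an equivalent exec probe built with the k8s.io/api/core/v1 types; the period and timeout are assumptions, not values read from the actual ovsdbserver-nb pod spec:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// Exec readiness probe equivalent to the command in the errors above.
	probe := corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			Exec: &corev1.ExecAction{
				Command: []string{"/usr/bin/pidof", "ovsdb-server"},
			},
		},
		PeriodSeconds:  10, // assumed, not taken from the pod spec
		TimeoutSeconds: 5,  // assumed
	}
	fmt.Printf("%+v\n", probe)
}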
\"kube-api-access-s9zdc\" (UniqueName: \"kubernetes.io/projected/f48bbac5-2782-4c1e-b74b-520f0457f9ac-kube-api-access-s9zdc\") pod \"glance4a60-account-delete-j4cg4\" (UID: \"f48bbac5-2782-4c1e-b74b-520f0457f9ac\") " pod="openstack/glance4a60-account-delete-j4cg4" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.441290 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d1b00d-147b-4865-b659-59d06f360797-operator-scripts\") pod \"neutron8dc4-account-delete-rtl4b\" (UID: \"f6d1b00d-147b-4865-b659-59d06f360797\") " pod="openstack/neutron8dc4-account-delete-rtl4b" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.458719 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell1c099-account-delete-lqvbc"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.459888 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1c099-account-delete-lqvbc" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.491221 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6t9hh"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.494834 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="ovn-northd" probeResult="failure" output=< Nov 22 08:45:51 crc kubenswrapper[4743]: 2025-11-22T08:45:51Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Nov 22 08:45:51 crc kubenswrapper[4743]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Nov 22 08:45:51 crc kubenswrapper[4743]: 2025-11-22T08:45:51Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Nov 22 08:45:51 crc kubenswrapper[4743]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Nov 22 08:45:51 crc kubenswrapper[4743]: > Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.508678 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-7qctt"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.509111 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-7qctt" podUID="870c700d-9095-4781-ab16-4cce25d24ed2" containerName="openstack-network-exporter" containerID="cri-o://544ee1789868b1e74b94d551ca242dd748b844448c139b17d4767a8ea19814b9" gracePeriod=30 Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.521307 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell1c099-account-delete-lqvbc"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.534147 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-mz9kc"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.545330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdrrv\" (UniqueName: \"kubernetes.io/projected/f6d1b00d-147b-4865-b659-59d06f360797-kube-api-access-qdrrv\") pod \"neutron8dc4-account-delete-rtl4b\" (UID: \"f6d1b00d-147b-4865-b659-59d06f360797\") " pod="openstack/neutron8dc4-account-delete-rtl4b" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.551222 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48bbac5-2782-4c1e-b74b-520f0457f9ac-operator-scripts\") pod 
\"glance4a60-account-delete-j4cg4\" (UID: \"f48bbac5-2782-4c1e-b74b-520f0457f9ac\") " pod="openstack/glance4a60-account-delete-j4cg4" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.551692 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9zdc\" (UniqueName: \"kubernetes.io/projected/f48bbac5-2782-4c1e-b74b-520f0457f9ac-kube-api-access-s9zdc\") pod \"glance4a60-account-delete-j4cg4\" (UID: \"f48bbac5-2782-4c1e-b74b-520f0457f9ac\") " pod="openstack/glance4a60-account-delete-j4cg4" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.551851 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d1b00d-147b-4865-b659-59d06f360797-operator-scripts\") pod \"neutron8dc4-account-delete-rtl4b\" (UID: \"f6d1b00d-147b-4865-b659-59d06f360797\") " pod="openstack/neutron8dc4-account-delete-rtl4b" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.553012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d1b00d-147b-4865-b659-59d06f360797-operator-scripts\") pod \"neutron8dc4-account-delete-rtl4b\" (UID: \"f6d1b00d-147b-4865-b659-59d06f360797\") " pod="openstack/neutron8dc4-account-delete-rtl4b" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.547726 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell0ad52-account-delete-9r4lw"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.559998 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48bbac5-2782-4c1e-b74b-520f0457f9ac-operator-scripts\") pod \"glance4a60-account-delete-j4cg4\" (UID: \"f48bbac5-2782-4c1e-b74b-520f0457f9ac\") " pod="openstack/glance4a60-account-delete-j4cg4" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.579002 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-n2d6p"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.579045 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-n2d6p"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.579143 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0ad52-account-delete-9r4lw" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.626360 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdrrv\" (UniqueName: \"kubernetes.io/projected/f6d1b00d-147b-4865-b659-59d06f360797-kube-api-access-qdrrv\") pod \"neutron8dc4-account-delete-rtl4b\" (UID: \"f6d1b00d-147b-4865-b659-59d06f360797\") " pod="openstack/neutron8dc4-account-delete-rtl4b" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.643540 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9zdc\" (UniqueName: \"kubernetes.io/projected/f48bbac5-2782-4c1e-b74b-520f0457f9ac-kube-api-access-s9zdc\") pod \"glance4a60-account-delete-j4cg4\" (UID: \"f48bbac5-2782-4c1e-b74b-520f0457f9ac\") " pod="openstack/glance4a60-account-delete-j4cg4" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.663914 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0ad52-account-delete-9r4lw"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.664027 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican675b-account-delete-bdj7v" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.664281 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement0984-account-delete-82zvj" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.665350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmgxv\" (UniqueName: \"kubernetes.io/projected/5fca29fd-c34f-4954-960f-b5ca0812d5b0-kube-api-access-gmgxv\") pod \"novacell0ad52-account-delete-9r4lw\" (UID: \"5fca29fd-c34f-4954-960f-b5ca0812d5b0\") " pod="openstack/novacell0ad52-account-delete-9r4lw" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.665507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fca29fd-c34f-4954-960f-b5ca0812d5b0-operator-scripts\") pod \"novacell0ad52-account-delete-9r4lw\" (UID: \"5fca29fd-c34f-4954-960f-b5ca0812d5b0\") " pod="openstack/novacell0ad52-account-delete-9r4lw" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.665547 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts\") pod \"novacell1c099-account-delete-lqvbc\" (UID: \"d96211ff-f7ba-4e26-ae39-43c8062e2277\") " pod="openstack/novacell1c099-account-delete-lqvbc" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.665594 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlr6w\" (UniqueName: \"kubernetes.io/projected/d96211ff-f7ba-4e26-ae39-43c8062e2277-kube-api-access-mlr6w\") pod \"novacell1c099-account-delete-lqvbc\" (UID: \"d96211ff-f7ba-4e26-ae39-43c8062e2277\") " pod="openstack/novacell1c099-account-delete-lqvbc" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.690672 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.691398 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="3a18c86e-9d86-49ee-918f-76de17000e18" containerName="openstack-network-exporter" containerID="cri-o://1673fc12fab7560971cb866548c9e23899260f2050054ca6370d37795dfdb742" gracePeriod=300 Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.750121 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-m9jrr"] Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.766896 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmgxv\" (UniqueName: \"kubernetes.io/projected/5fca29fd-c34f-4954-960f-b5ca0812d5b0-kube-api-access-gmgxv\") pod \"novacell0ad52-account-delete-9r4lw\" (UID: \"5fca29fd-c34f-4954-960f-b5ca0812d5b0\") " pod="openstack/novacell0ad52-account-delete-9r4lw" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.767249 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fca29fd-c34f-4954-960f-b5ca0812d5b0-operator-scripts\") pod \"novacell0ad52-account-delete-9r4lw\" (UID: \"5fca29fd-c34f-4954-960f-b5ca0812d5b0\") " pod="openstack/novacell0ad52-account-delete-9r4lw" Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.767289 4743 
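The reconciler_common.go entries above trace each volume through the kubelet's reconcile sequence: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded. A schematic Go sketch of the underlying desired-state-versus-actual-state loop; the sets and volume names are illustrative (borrowed from the entries above), not kubelet internals:

package main

import "fmt"

func main() {
	// Volumes the pod spec wants versus volumes already mounted.
	desired := []string{"operator-scripts", "kube-api-access-gmgxv"}
	actual := map[string]bool{"operator-scripts": true}

	// Anything desired but not yet actual gets a mount operation started.
	for _, vol := range desired {
		if !actual[vol] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q\n", vol)
		}
	}
}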
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.767289 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts\") pod \"novacell1c099-account-delete-lqvbc\" (UID: \"d96211ff-f7ba-4e26-ae39-43c8062e2277\") " pod="openstack/novacell1c099-account-delete-lqvbc"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.767319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlr6w\" (UniqueName: \"kubernetes.io/projected/d96211ff-f7ba-4e26-ae39-43c8062e2277-kube-api-access-mlr6w\") pod \"novacell1c099-account-delete-lqvbc\" (UID: \"d96211ff-f7ba-4e26-ae39-43c8062e2277\") " pod="openstack/novacell1c099-account-delete-lqvbc"
Nov 22 08:45:51 crc kubenswrapper[4743]: E1122 08:45:51.768332 4743 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Nov 22 08:45:51 crc kubenswrapper[4743]: E1122 08:45:51.768373 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data podName:e5fac46a-545d-4f30-a7ab-8f5e713e934d nodeName:}" failed. No retries permitted until 2025-11-22 08:45:52.768358251 +0000 UTC m=+1426.474719303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data") pod "rabbitmq-server-0" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d") : configmap "rabbitmq-config-data" not found
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.769874 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts\") pod \"novacell1c099-account-delete-lqvbc\" (UID: \"d96211ff-f7ba-4e26-ae39-43c8062e2277\") " pod="openstack/novacell1c099-account-delete-lqvbc"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.772437 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-m9jrr"]
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.795743 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fca29fd-c34f-4954-960f-b5ca0812d5b0-operator-scripts\") pod \"novacell0ad52-account-delete-9r4lw\" (UID: \"5fca29fd-c34f-4954-960f-b5ca0812d5b0\") " pod="openstack/novacell0ad52-account-delete-9r4lw"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.820428 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlr6w\" (UniqueName: \"kubernetes.io/projected/d96211ff-f7ba-4e26-ae39-43c8062e2277-kube-api-access-mlr6w\") pod \"novacell1c099-account-delete-lqvbc\" (UID: \"d96211ff-f7ba-4e26-ae39-43c8062e2277\") " pod="openstack/novacell1c099-account-delete-lqvbc"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.822261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmgxv\" (UniqueName: \"kubernetes.io/projected/5fca29fd-c34f-4954-960f-b5ca0812d5b0-kube-api-access-gmgxv\") pod \"novacell0ad52-account-delete-9r4lw\" (UID: \"5fca29fd-c34f-4954-960f-b5ca0812d5b0\") " pod="openstack/novacell0ad52-account-delete-9r4lw"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.840780 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi1d8b-account-delete-scjlt"]
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.842289 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi1d8b-account-delete-scjlt"
Nov 22 08:45:51 crc kubenswrapper[4743]: I1122 08:45:51.869196 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi1d8b-account-delete-scjlt"]
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.000435 4743 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-6t9hh" message=<
Nov 22 08:45:52 crc kubenswrapper[4743]: Exiting ovn-controller (1) [ OK ]
Nov 22 08:45:52 crc kubenswrapper[4743]: >
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.000552 4743 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-6t9hh" podUID="5db10427-8546-4dea-b849-36bb02c837bd" containerName="ovn-controller" containerID="cri-o://6c0db6fc539fa60e13ce74a4129ccda55a6455c6da77fd7c2a15ec935e19f792"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.000615 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-6t9hh" podUID="5db10427-8546-4dea-b849-36bb02c837bd" containerName="ovn-controller" containerID="cri-o://6c0db6fc539fa60e13ce74a4129ccda55a6455c6da77fd7c2a15ec935e19f792" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.070010 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk88b\" (UniqueName: \"kubernetes.io/projected/30ee548a-8838-4d52-867b-4dfdb6c4f641-kube-api-access-dk88b\") pod \"novaapi1d8b-account-delete-scjlt\" (UID: \"30ee548a-8838-4d52-867b-4dfdb6c4f641\") " pod="openstack/novaapi1d8b-account-delete-scjlt"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.070414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30ee548a-8838-4d52-867b-4dfdb6c4f641-operator-scripts\") pod \"novaapi1d8b-account-delete-scjlt\" (UID: \"30ee548a-8838-4d52-867b-4dfdb6c4f641\") " pod="openstack/novaapi1d8b-account-delete-scjlt"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.099462 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="3a18c86e-9d86-49ee-918f-76de17000e18" containerName="ovsdbserver-sb" containerID="cri-o://144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e" gracePeriod=300
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.179550 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-22g48"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.190731 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk88b\" (UniqueName: \"kubernetes.io/projected/30ee548a-8838-4d52-867b-4dfdb6c4f641-kube-api-access-dk88b\") pod \"novaapi1d8b-account-delete-scjlt\" (UID: \"30ee548a-8838-4d52-867b-4dfdb6c4f641\") " pod="openstack/novaapi1d8b-account-delete-scjlt"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.190808 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30ee548a-8838-4d52-867b-4dfdb6c4f641-operator-scripts\") pod \"novaapi1d8b-account-delete-scjlt\" (UID: \"30ee548a-8838-4d52-867b-4dfdb6c4f641\") " pod="openstack/novaapi1d8b-account-delete-scjlt"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.195837 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron8dc4-account-delete-rtl4b"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.196250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30ee548a-8838-4d52-867b-4dfdb6c4f641-operator-scripts\") pod \"novaapi1d8b-account-delete-scjlt\" (UID: \"30ee548a-8838-4d52-867b-4dfdb6c4f641\") " pod="openstack/novaapi1d8b-account-delete-scjlt"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.209770 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-22g48"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.225232 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-92fnd"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.229387 4743 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/cinder-api-0" secret="" err="secret \"cinder-cinder-dockercfg-2bw4c\" not found"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.239454 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-92fnd"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.252651 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk88b\" (UniqueName: \"kubernetes.io/projected/30ee548a-8838-4d52-867b-4dfdb6c4f641-kube-api-access-dk88b\") pod \"novaapi1d8b-account-delete-scjlt\" (UID: \"30ee548a-8838-4d52-867b-4dfdb6c4f641\") " pod="openstack/novaapi1d8b-account-delete-scjlt"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.268882 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-9fxgn"]
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.294809 4743 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.295098 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data podName:ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1 nodeName:}" failed. No retries permitted until 2025-11-22 08:45:54.295083864 +0000 UTC m=+1428.001444916 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data") pod "rabbitmq-cell1-server-0" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1") : configmap "rabbitmq-cell1-config-data" not found
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.310490 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-9fxgn"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.323813 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.324343 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-server" containerID="cri-o://bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.324881 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="swift-recon-cron" containerID="cri-o://c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.324940 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="rsync" containerID="cri-o://18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.324974 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-expirer" containerID="cri-o://a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.325005 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-updater" containerID="cri-o://993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.325037 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-auditor" containerID="cri-o://ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.325068 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-replicator" containerID="cri-o://f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.325101 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-server" containerID="cri-o://bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.325131 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-updater" containerID="cri-o://da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.326660 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-auditor" containerID="cri-o://5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.326705 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-replicator" containerID="cri-o://174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.326715 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-server" containerID="cri-o://f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.326726 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-reaper" containerID="cri-o://863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.326734 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-auditor" containerID="cri-o://08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.326744 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-replicator" containerID="cri-o://f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.338224 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d/ovsdbserver-nb/0.log"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.338286 4743 generic.go:334] "Generic (PLEG): container finished" podID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerID="01419810dd33722ab918d36dfaf000ac018b6b763b7e90647fea3f7eed2c7509" exitCode=2
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.338310 4743 generic.go:334] "Generic (PLEG): container finished" podID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerID="85ca3e549dbcf99a2f9f8ce67ee485a73244e79666976bd6f0f2fe904f8d3d50" exitCode=143
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.338412 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d","Type":"ContainerDied","Data":"01419810dd33722ab918d36dfaf000ac018b6b763b7e90647fea3f7eed2c7509"}
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.338439 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d","Type":"ContainerDied","Data":"85ca3e549dbcf99a2f9f8ce67ee485a73244e79666976bd6f0f2fe904f8d3d50"}
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.354466 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3a18c86e-9d86-49ee-918f-76de17000e18/ovsdbserver-sb/0.log"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.354500 4743 generic.go:334] "Generic (PLEG): container finished" podID="3a18c86e-9d86-49ee-918f-76de17000e18" containerID="1673fc12fab7560971cb866548c9e23899260f2050054ca6370d37795dfdb742" exitCode=2
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.354546 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3a18c86e-9d86-49ee-918f-76de17000e18","Type":"ContainerDied","Data":"1673fc12fab7560971cb866548c9e23899260f2050054ca6370d37795dfdb742"}
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.367476 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-hkskb"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.394218 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-hkskb"]
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.397077 4743 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.397123 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:45:52.897109185 +0000 UTC m=+1426.603470237 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-scripts" not found
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.397328 4743 secret.go:188] Couldn't get secret openstack/cinder-api-config-data: secret "cinder-api-config-data" not found
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.397363 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:45:52.897353882 +0000 UTC m=+1426.603714944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-api-config-data" not found
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.398033 4743 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.398058 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:45:52.898050082 +0000 UTC m=+1426.604411134 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-config-data" not found
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.405531 4743 generic.go:334] "Generic (PLEG): container finished" podID="b9817865-d957-42d3-8edb-6800e1075d23" containerID="1f420d1e2699e276d82c94d18dd411a4b04324712350d57b7ef8e6cdb952414a" exitCode=2
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.405625 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9817865-d957-42d3-8edb-6800e1075d23","Type":"ContainerDied","Data":"1f420d1e2699e276d82c94d18dd411a4b04324712350d57b7ef8e6cdb952414a"}
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.426374 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-8hnw7"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.431776 4743 generic.go:334] "Generic (PLEG): container finished" podID="5db10427-8546-4dea-b849-36bb02c837bd" containerID="6c0db6fc539fa60e13ce74a4129ccda55a6455c6da77fd7c2a15ec935e19f792" exitCode=0
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.431882 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh" event={"ID":"5db10427-8546-4dea-b849-36bb02c837bd","Type":"ContainerDied","Data":"6c0db6fc539fa60e13ce74a4129ccda55a6455c6da77fd7c2a15ec935e19f792"}
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.449841 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7qctt_870c700d-9095-4781-ab16-4cce25d24ed2/openstack-network-exporter/0.log"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.449890 4743 generic.go:334] "Generic (PLEG): container finished" podID="870c700d-9095-4781-ab16-4cce25d24ed2" containerID="544ee1789868b1e74b94d551ca242dd748b844448c139b17d4767a8ea19814b9" exitCode=2
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.449922 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7qctt" event={"ID":"870c700d-9095-4781-ab16-4cce25d24ed2","Type":"ContainerDied","Data":"544ee1789868b1e74b94d551ca242dd748b844448c139b17d4767a8ea19814b9"}
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.464774 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-8hnw7"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.487261 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-zqvbz"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.487787 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" podUID="aab079ae-b574-40f3-8df0-7deff1356e09" containerName="dnsmasq-dns" containerID="cri-o://b976e4375d0ed0f34734cc0cfaefc7f462784997b77abe216f7a1ac9164a5a03" gracePeriod=10
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.495867 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.496079 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="145d3340-8ded-4082-b9c8-7b1a21390097" containerName="cinder-scheduler" containerID="cri-o://253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.497205 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="145d3340-8ded-4082-b9c8-7b1a21390097" containerName="probe" containerID="cri-o://fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.508506 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.520253 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-84df6c6d8d-v9vxr"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.520619 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-84df6c6d8d-v9vxr" podUID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" containerName="placement-log" containerID="cri-o://7711ec056fa213f2eee796483c27379cd7a134fa6030ba1e23b52a3457a46cec" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.520970 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-84df6c6d8d-v9vxr" podUID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" containerName="placement-api" containerID="cri-o://12838b3e542aa21904acab03d6b27d30ec54f1471909fca6df88ff3e1aee935d" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.551569 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.584615 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5568cf9dfc-ghfzl"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.584876 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5568cf9dfc-ghfzl" podUID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" containerName="neutron-api" containerID="cri-o://f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.585320 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5568cf9dfc-ghfzl" podUID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" containerName="neutron-httpd" containerID="cri-o://59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.625856 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5ff985d64c-mnpj5"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.626116 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5ff985d64c-mnpj5" podUID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" containerName="proxy-httpd" containerID="cri-o://ff8003a3594d25ec03aad9438f8a8b6e3c4495c012f444863e724569495817e4" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.626517 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-5ff985d64c-mnpj5" podUID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" containerName="proxy-server" containerID="cri-o://69a331217c6e9870990cf0477268ec07b586afa72d1bd546c97e364e672bdc27" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.662720 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd" containerID="cri-o://0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" gracePeriod=29
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.664431 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.664671 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" containerName="glance-log" containerID="cri-o://d713e66a35891819a155186b552565a296254b8c93475b9aa0a54b55dd7cbd38" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.664843 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" containerName="glance-httpd" containerID="cri-o://6671bb99b39fc16a0f6c253ac0e494e49254030b8ca083c6f60cb786f074a063" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.671944 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" containerName="rabbitmq" containerID="cri-o://59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5" gracePeriod=604800
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.688179 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.688437 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c61760fb-827b-4199-bfdb-52698c7b4824" containerName="glance-log" containerID="cri-o://d85aa17d800ad1cbc94e2aaf79a94f094b2da7ff02061d9a5cc19841c8f58bb3" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.693993 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c61760fb-827b-4199-bfdb-52698c7b4824" containerName="glance-httpd" containerID="cri-o://f148d19bec9da2034a614aa3685da5500ee102ac2162e404c9df6a8dd6001346" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.695443 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance4a60-account-delete-j4cg4"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.736021 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.736282 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="da8955a2-6deb-440c-97e3-f2420aa5fae8" containerName="nova-scheduler-scheduler" containerID="cri-o://a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722" gracePeriod=30
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.737442 4743 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell1c099-account-delete-lqvbc" secret="" err="secret \"galera-openstack-cell1-dockercfg-pr4n6\" not found"
Need to start a new one" pod="openstack/novacell1c099-account-delete-lqvbc" Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.747762 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e is running failed: container process not found" containerID="144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e" cmd=["/usr/bin/pidof","ovsdb-server"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.753537 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="ovn-northd" probeResult="failure" output="command timed out" Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.753769 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.754050 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" containerName="nova-api-log" containerID="cri-o://928c4075312b7232f7da07b77b7db3aff8ceda2f473954fe371d1b407a38a03a" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.754193 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" containerName="nova-api-api" containerID="cri-o://1cfb29dd7e0a21897754c302d7a14c2ab839c36f149cccd25dabc107f11f9bed" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.763443 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.775999 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e is running failed: container process not found" containerID="144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e" cmd=["/usr/bin/pidof","ovsdb-server"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.780202 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.780708 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-log" containerID="cri-o://4b24ccee2f20c0c9bff9ed0577c5b13a5e4c322c8c14f5cae7487c6ed9272a36" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.781119 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-metadata" containerID="cri-o://31240114f37ac66a6ac0ee75966656d89b98f5b65714a820dfeae421393b0b13" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.783174 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0ad52-account-delete-9r4lw" Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.792241 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e is running failed: container process not found" containerID="144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e" cmd=["/usr/bin/pidof","ovsdb-server"] Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.792314 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="3a18c86e-9d86-49ee-918f-76de17000e18" containerName="ovsdbserver-sb" Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.820395 4743 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.820464 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data podName:e5fac46a-545d-4f30-a7ab-8f5e713e934d nodeName:}" failed. No retries permitted until 2025-11-22 08:45:54.820448619 +0000 UTC m=+1428.526809671 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data") pod "rabbitmq-server-0" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d") : configmap "rabbitmq-config-data" not found Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.820803 4743 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.820830 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts podName:d96211ff-f7ba-4e26-ae39-43c8062e2277 nodeName:}" failed. No retries permitted until 2025-11-22 08:45:53.320820779 +0000 UTC m=+1427.027181831 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts") pod "novacell1c099-account-delete-lqvbc" (UID: "d96211ff-f7ba-4e26-ae39-43c8062e2277") : configmap "openstack-cell1-scripts" not found Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.821877 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-84bbbc9bdb-72lc6"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.822170 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" podUID="8fe5d70f-5277-4803-ae45-de61d0eefe27" containerName="barbican-keystone-listener-log" containerID="cri-o://b87c8deb0c6f1f3c1134e38ff7289f1edfe2b60e90ae6b47a46057bcb212868c" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.822484 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" podUID="8fe5d70f-5277-4803-ae45-de61d0eefe27" containerName="barbican-keystone-listener" containerID="cri-o://596f95b1d0cc9abb230b4c2a4a8c4b0c1af12cc6eed82f9960b7ca6e13289379" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.842180 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi1d8b-account-delete-scjlt" Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.846024 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1c099-account-delete-lqvbc"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.866834 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-c099-account-create-r2vpp"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.873769 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-c099-account-create-r2vpp"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.880899 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-vzs2f"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.894759 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-vzs2f"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.895819 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7cd8fdf575-7kd5c"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.896039 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" podUID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" containerName="barbican-worker-log" containerID="cri-o://b2dbbf998042cd2c9fe978946c70615b254e4f75ec35ea7cfe22feace7d787f6" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.897156 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" podUID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" containerName="barbican-worker" containerID="cri-o://c4a492c46b22ecd2cb2ce30f4d5cbacdf5b41359fdcc9e9ef3d84f92e3284551" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.906321 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dcbbd6f66-kjrm8"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.906563 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" 
podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerName="barbican-api-log" containerID="cri-o://0febb6e2d7ff4813fd6df7b99de1a803ade35dd751b487778b4585a6c0ce4d64" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.906830 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerName="barbican-api" containerID="cri-o://455de173684d6834930eabe9a480ac739569a3760776782c2c81d8591d036411" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.912981 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.930903 4743 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.930947 4743 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.931004 4743 secret.go:188] Couldn't get secret openstack/cinder-api-config-data: secret "cinder-api-config-data" not found Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.930951 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:45:53.930936334 +0000 UTC m=+1427.637297376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-scripts" not found Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.931416 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:45:53.931393947 +0000 UTC m=+1427.637754999 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-api-config-data" not found Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.935666 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:45:53.935628629 +0000 UTC m=+1427.641989991 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-config-data" not found Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.935711 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.935949 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b7be7b8b-96eb-40fb-98b2-bc33e2154343" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d5f358c51f3837120bf2786591f156a51d70cdabcf793d05895bd486bf90bd29" gracePeriod=30 Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.946960 4743 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Nov 22 08:45:52 crc kubenswrapper[4743]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Nov 22 08:45:52 crc kubenswrapper[4743]: + source /usr/local/bin/container-scripts/functions Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNBridge=br-int Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNRemote=tcp:localhost:6642 Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNEncapType=geneve Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNAvailabilityZones= Nov 22 08:45:52 crc kubenswrapper[4743]: ++ EnableChassisAsGateway=true Nov 22 08:45:52 crc kubenswrapper[4743]: ++ PhysicalNetworks= Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNHostName= Nov 22 08:45:52 crc kubenswrapper[4743]: ++ DB_FILE=/etc/openvswitch/conf.db Nov 22 08:45:52 crc kubenswrapper[4743]: ++ ovs_dir=/var/lib/openvswitch Nov 22 08:45:52 crc kubenswrapper[4743]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Nov 22 08:45:52 crc kubenswrapper[4743]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Nov 22 08:45:52 crc kubenswrapper[4743]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 22 08:45:52 crc kubenswrapper[4743]: + sleep 0.5 Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Nov 22 08:45:52 crc kubenswrapper[4743]: + sleep 0.5 Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' 
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.946960 4743 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Nov 22 08:45:52 crc kubenswrapper[4743]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Nov 22 08:45:52 crc kubenswrapper[4743]: + source /usr/local/bin/container-scripts/functions
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNBridge=br-int
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNRemote=tcp:localhost:6642
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNEncapType=geneve
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNAvailabilityZones=
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ EnableChassisAsGateway=true
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ PhysicalNetworks=
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNHostName=
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ DB_FILE=/etc/openvswitch/conf.db
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ ovs_dir=/var/lib/openvswitch
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Nov 22 08:45:52 crc kubenswrapper[4743]: + sleep 0.5
Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Nov 22 08:45:52 crc kubenswrapper[4743]: + sleep 0.5
Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Nov 22 08:45:52 crc kubenswrapper[4743]: + cleanup_ovsdb_server_semaphore
Nov 22 08:45:52 crc kubenswrapper[4743]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Nov 22 08:45:52 crc kubenswrapper[4743]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Nov 22 08:45:52 crc kubenswrapper[4743]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-mz9kc" message=<
Nov 22 08:45:52 crc kubenswrapper[4743]: Exiting ovsdb-server (5) [ OK ]
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Nov 22 08:45:52 crc kubenswrapper[4743]: + source /usr/local/bin/container-scripts/functions
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNBridge=br-int
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNRemote=tcp:localhost:6642
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNEncapType=geneve
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNAvailabilityZones=
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ EnableChassisAsGateway=true
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ PhysicalNetworks=
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNHostName=
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ DB_FILE=/etc/openvswitch/conf.db
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ ovs_dir=/var/lib/openvswitch
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Nov 22 08:45:52 crc kubenswrapper[4743]: + sleep 0.5
Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Nov 22 08:45:52 crc kubenswrapper[4743]: + sleep 0.5
Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Nov 22 08:45:52 crc kubenswrapper[4743]: + cleanup_ovsdb_server_semaphore
Nov 22 08:45:52 crc kubenswrapper[4743]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Nov 22 08:45:52 crc kubenswrapper[4743]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Nov 22 08:45:52 crc kubenswrapper[4743]: >
Nov 22 08:45:52 crc kubenswrapper[4743]: E1122 08:45:52.947197 4743 kuberuntime_container.go:691] "PreStop hook failed" err=<
Nov 22 08:45:52 crc kubenswrapper[4743]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Nov 22 08:45:52 crc kubenswrapper[4743]: + source /usr/local/bin/container-scripts/functions
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNBridge=br-int
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNRemote=tcp:localhost:6642
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNEncapType=geneve
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNAvailabilityZones=
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ EnableChassisAsGateway=true
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ PhysicalNetworks=
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ OVNHostName=
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ DB_FILE=/etc/openvswitch/conf.db
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ ovs_dir=/var/lib/openvswitch
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Nov 22 08:45:52 crc kubenswrapper[4743]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Nov 22 08:45:52 crc kubenswrapper[4743]: + sleep 0.5
Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Nov 22 08:45:52 crc kubenswrapper[4743]: + sleep 0.5
Nov 22 08:45:52 crc kubenswrapper[4743]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Nov 22 08:45:52 crc kubenswrapper[4743]: + cleanup_ovsdb_server_semaphore
Nov 22 08:45:52 crc kubenswrapper[4743]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Nov 22 08:45:52 crc kubenswrapper[4743]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Nov 22 08:45:52 crc kubenswrapper[4743]: > pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server" containerID="cri-o://6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e"
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.947250 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server" containerID="cri-o://6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" gracePeriod=29
Nov 22 08:45:52 crc kubenswrapper[4743]: I1122 08:45:52.971215 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hccj5"]
Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.023644 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hccj5"]
Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.038669 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.038920 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="11c59cd3-7ee4-43f3-83ce-9d22824473d7" containerName="nova-cell1-conductor-conductor" containerID="cri-o://fd43f52e71d508747b99d25448ea2492e1fc68d783ff2250c3357c3281ede81e" gracePeriod=30
Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.067551 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.093034 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="500679c5-1691-4831-b5ec-3c6cce19c503" containerName="nova-cell0-conductor-conductor" containerID="cri-o://aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" gracePeriod=30
Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.106662 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d/ovsdbserver-nb/0.log"
Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.106727 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
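A note on the PreStop trace above: the same failure is logged twice (once by handlers.go when the exec fails, once by kuberuntime_container.go as "PreStop hook failed"), and exit code 137 is 128+9, i.e. the hook's exec session was SIGKILLed rather than exiting on its own, plausibly because ovsdb-server, the container's main process, had already exited underneath it (the message block shows "Exiting ovsdb-server (5) [ OK ]"). The follow-up kill is then issued with gracePeriod=29, the original 30s less the time the hook consumed. The xtrace makes the script's intent visible; a minimal reconstruction of that wait-then-stop logic, assembled from the trace rather than the actual stop-ovsdb-server.sh source, looks like:

    # Sketch of the semaphore wait seen in the xtrace: poll for the
    # "safe to stop" marker file, then remove it and stop ovsdb-server only.
    SAFE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
    while [ ! -f "$SAFE" ]; do
        sleep 0.5
    done
    rm -f "$SAFE"    # cleanup_ovsdb_server_semaphore in the trace
    /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd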
Need to start a new one" pod="openstack/ovn-controller-6t9hh" Nov 22 08:45:53 crc kubenswrapper[4743]: E1122 08:45:53.140145 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:45:53 crc kubenswrapper[4743]: E1122 08:45:53.151274 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:45:53 crc kubenswrapper[4743]: E1122 08:45:53.171200 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:45:53 crc kubenswrapper[4743]: E1122 08:45:53.171311 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.186665 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e5fac46a-545d-4f30-a7ab-8f5e713e934d" containerName="rabbitmq" containerID="cri-o://6e1b913f0b8534fa70afd00b409ba87dcd773b31786f2f1c5518bc5e04e427a8" gracePeriod=604800 Nov 22 08:45:53 crc kubenswrapper[4743]: E1122 08:45:53.252160 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:45:53 crc kubenswrapper[4743]: E1122 08:45:53.288405 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302050 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5db10427-8546-4dea-b849-36bb02c837bd-scripts\") pod \"5db10427-8546-4dea-b849-36bb02c837bd\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302138 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-hd754\" (UniqueName: \"kubernetes.io/projected/5db10427-8546-4dea-b849-36bb02c837bd-kube-api-access-hd754\") pod \"5db10427-8546-4dea-b849-36bb02c837bd\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302166 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-config\") pod \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302203 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdb-rundir\") pod \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302240 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdbserver-nb-tls-certs\") pod \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302274 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-combined-ca-bundle\") pod \"5db10427-8546-4dea-b849-36bb02c837bd\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302309 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302393 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-ovn-controller-tls-certs\") pod \"5db10427-8546-4dea-b849-36bb02c837bd\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302436 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnlvb\" (UniqueName: \"kubernetes.io/projected/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-kube-api-access-wnlvb\") pod \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302482 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-log-ovn\") pod \"5db10427-8546-4dea-b849-36bb02c837bd\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302548 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-combined-ca-bundle\") pod \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302608 4743 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-metrics-certs-tls-certs\") pod \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302639 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run\") pod \"5db10427-8546-4dea-b849-36bb02c837bd\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302684 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run-ovn\") pod \"5db10427-8546-4dea-b849-36bb02c837bd\" (UID: \"5db10427-8546-4dea-b849-36bb02c837bd\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.302742 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-scripts\") pod \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\" (UID: \"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.306830 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-config" (OuterVolumeSpecName: "config") pod "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" (UID: "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.309738 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run" (OuterVolumeSpecName: "var-run") pod "5db10427-8546-4dea-b849-36bb02c837bd" (UID: "5db10427-8546-4dea-b849-36bb02c837bd"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.311701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5db10427-8546-4dea-b849-36bb02c837bd" (UID: "5db10427-8546-4dea-b849-36bb02c837bd"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: E1122 08:45:53.311816 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:45:53 crc kubenswrapper[4743]: E1122 08:45:53.311858 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.313984 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5db10427-8546-4dea-b849-36bb02c837bd" (UID: "5db10427-8546-4dea-b849-36bb02c837bd"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.323424 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5db10427-8546-4dea-b849-36bb02c837bd-scripts" (OuterVolumeSpecName: "scripts") pod "5db10427-8546-4dea-b849-36bb02c837bd" (UID: "5db10427-8546-4dea-b849-36bb02c837bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.339206 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="009dc869-9ae6-40f0-a055-1303494f16f1" path="/var/lib/kubelet/pods/009dc869-9ae6-40f0-a055-1303494f16f1/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.350425 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="142d1e8a-9aac-4c34-9301-1e069919fe82" path="/var/lib/kubelet/pods/142d1e8a-9aac-4c34-9301-1e069919fe82/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.353131 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-scripts" (OuterVolumeSpecName: "scripts") pod "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" (UID: "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.355384 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfecaa0-8299-4d91-a2eb-11ddb19e029d" path="/var/lib/kubelet/pods/2dfecaa0-8299-4d91-a2eb-11ddb19e029d/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.356087 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="529310a2-d134-4138-a99f-9146e71f32eb" path="/var/lib/kubelet/pods/529310a2-d134-4138-a99f-9146e71f32eb/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.356666 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58260e6d-177b-49c5-beac-c516036341a4" path="/var/lib/kubelet/pods/58260e6d-177b-49c5-beac-c516036341a4/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.357237 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8" path="/var/lib/kubelet/pods/6f7da0e4-94ee-454d-9c3a-98ed4c49c5c8/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.359131 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" (UID: "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.359352 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e84b78-308d-41c1-b9a7-5d0c4cb80d44" path="/var/lib/kubelet/pods/90e84b78-308d-41c1-b9a7-5d0c4cb80d44/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.360016 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f6e846-532f-419c-bd4a-7d2e7eb41a2c" path="/var/lib/kubelet/pods/95f6e846-532f-419c-bd4a-7d2e7eb41a2c/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.362706 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87658ca-ad68-4136-82dd-14201100b4ea" path="/var/lib/kubelet/pods/a87658ca-ad68-4136-82dd-14201100b4ea/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.364639 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5a99c13-319a-4df1-8061-8bb20463cd73" path="/var/lib/kubelet/pods/c5a99c13-319a-4df1-8061-8bb20463cd73/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.365448 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f93e69-5601-45dc-a1f5-0e086d3dce5d" path="/var/lib/kubelet/pods/e5f93e69-5601-45dc-a1f5-0e086d3dce5d/volumes" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.366853 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-kube-api-access-wnlvb" (OuterVolumeSpecName: "kube-api-access-wnlvb") pod "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" (UID: "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d"). InnerVolumeSpecName "kube-api-access-wnlvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.366985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db10427-8546-4dea-b849-36bb02c837bd-kube-api-access-hd754" (OuterVolumeSpecName: "kube-api-access-hd754") pod "5db10427-8546-4dea-b849-36bb02c837bd" (UID: "5db10427-8546-4dea-b849-36bb02c837bd"). InnerVolumeSpecName "kube-api-access-hd754". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.369671 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" (UID: "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.407165 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.407200 4743 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.407214 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.407225 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5db10427-8546-4dea-b849-36bb02c837bd-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.407237 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.407249 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd754\" (UniqueName: \"kubernetes.io/projected/5db10427-8546-4dea-b849-36bb02c837bd-kube-api-access-hd754\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.407262 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.407287 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.407300 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnlvb\" (UniqueName: \"kubernetes.io/projected/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-kube-api-access-wnlvb\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.407313 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5db10427-8546-4dea-b849-36bb02c837bd-var-log-ovn\") on node \"crc\" 
DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: E1122 08:45:53.408774 4743 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Nov 22 08:45:53 crc kubenswrapper[4743]: E1122 08:45:53.408852 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts podName:d96211ff-f7ba-4e26-ae39-43c8062e2277 nodeName:}" failed. No retries permitted until 2025-11-22 08:45:54.40883241 +0000 UTC m=+1428.115193462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts") pod "novacell1c099-account-delete-lqvbc" (UID: "d96211ff-f7ba-4e26-ae39-43c8062e2277") : configmap "openstack-cell1-scripts" not found Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.491185 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5db10427-8546-4dea-b849-36bb02c837bd" (UID: "5db10427-8546-4dea-b849-36bb02c837bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.503787 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" (UID: "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.508759 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.508792 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.521976 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="29734ea4-591c-478e-8030-55fcbac72d3a" containerName="galera" containerID="cri-o://c89c36b86576b82473835bf0c40ac138380844e7593a7241e2bf4b37e98aadf1" gracePeriod=30 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.526899 4743 generic.go:334] "Generic (PLEG): container finished" podID="db905ec2-675e-48ea-a051-ed3d78c35797" containerID="928c4075312b7232f7da07b77b7db3aff8ceda2f473954fe371d1b407a38a03a" exitCode=143 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.531975 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.537385 4743 generic.go:334] "Generic (PLEG): container finished" podID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" containerID="b2dbbf998042cd2c9fe978946c70615b254e4f75ec35ea7cfe22feace7d787f6" exitCode=143 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.562071 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" containerID="d713e66a35891819a155186b552565a296254b8c93475b9aa0a54b55dd7cbd38" exitCode=143 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.584751 4743 generic.go:334] "Generic (PLEG): container finished" podID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" containerID="59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.594484 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" (UID: "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.599056 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3a18c86e-9d86-49ee-918f-76de17000e18/ovsdbserver-sb/0.log" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.599098 4743 generic.go:334] "Generic (PLEG): container finished" podID="3a18c86e-9d86-49ee-918f-76de17000e18" containerID="144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e" exitCode=143 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.605714 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" (UID: "4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.612123 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.612167 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.612182 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.627216 4743 generic.go:334] "Generic (PLEG): container finished" podID="c61760fb-827b-4199-bfdb-52698c7b4824" containerID="d85aa17d800ad1cbc94e2aaf79a94f094b2da7ff02061d9a5cc19841c8f58bb3" exitCode=143 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.653816 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "5db10427-8546-4dea-b849-36bb02c837bd" (UID: "5db10427-8546-4dea-b849-36bb02c837bd"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.654570 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z54x2"] Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.654670 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-z54x2"] Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.654693 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db905ec2-675e-48ea-a051-ed3d78c35797","Type":"ContainerDied","Data":"928c4075312b7232f7da07b77b7db3aff8ceda2f473954fe371d1b407a38a03a"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.654726 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" event={"ID":"89d8e638-b97a-4273-9391-5e0c7dd1bfb1","Type":"ContainerDied","Data":"b2dbbf998042cd2c9fe978946c70615b254e4f75ec35ea7cfe22feace7d787f6"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.654740 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dca6d95c-89d6-4b49-bf28-2606b9b5c05e","Type":"ContainerDied","Data":"d713e66a35891819a155186b552565a296254b8c93475b9aa0a54b55dd7cbd38"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.654754 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568cf9dfc-ghfzl" event={"ID":"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48","Type":"ContainerDied","Data":"59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.654765 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3a18c86e-9d86-49ee-918f-76de17000e18","Type":"ContainerDied","Data":"144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.654802 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c61760fb-827b-4199-bfdb-52698c7b4824","Type":"ContainerDied","Data":"d85aa17d800ad1cbc94e2aaf79a94f094b2da7ff02061d9a5cc19841c8f58bb3"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.712209 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3a18c86e-9d86-49ee-918f-76de17000e18/ovsdbserver-sb/0.log" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.712537 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.715438 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.718923 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db10427-8546-4dea-b849-36bb02c837bd-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747003 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7qctt_870c700d-9095-4781-ab16-4cce25d24ed2/openstack-network-exporter/0.log" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747104 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-7qctt" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747298 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747414 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747479 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747542 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747624 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747681 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747740 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747792 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747843 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747899 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747950 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.748013 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.748069 4743 generic.go:334] "Generic (PLEG): container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.748157 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.747456 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.748470 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.748557 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.748646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.748726 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.748846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.748918 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.748986 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.749054 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.749139 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.749211 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.749375 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.749466 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.749528 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.762128 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d/ovsdbserver-nb/0.log" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.762395 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.763896 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d","Type":"ContainerDied","Data":"f64f05e22064a45671cb5ffbaa86c35a4105a1bd1e162839e31041dae91c1cd8"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.764017 4743 scope.go:117] "RemoveContainer" containerID="01419810dd33722ab918d36dfaf000ac018b6b763b7e90647fea3f7eed2c7509" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.784224 4743 generic.go:334] "Generic (PLEG): container finished" podID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" containerID="7711ec056fa213f2eee796483c27379cd7a134fa6030ba1e23b52a3457a46cec" exitCode=143 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.784297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84df6c6d8d-v9vxr" event={"ID":"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca","Type":"ContainerDied","Data":"7711ec056fa213f2eee796483c27379cd7a134fa6030ba1e23b52a3457a46cec"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.795952 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder89a7-account-delete-gjvvg"] Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.803686 4743 generic.go:334] "Generic (PLEG): container finished" podID="aab079ae-b574-40f3-8df0-7deff1356e09" containerID="b976e4375d0ed0f34734cc0cfaefc7f462784997b77abe216f7a1ac9164a5a03" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.803769 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" event={"ID":"aab079ae-b574-40f3-8df0-7deff1356e09","Type":"ContainerDied","Data":"b976e4375d0ed0f34734cc0cfaefc7f462784997b77abe216f7a1ac9164a5a03"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.807028 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.810321 4743 generic.go:334] "Generic (PLEG): container finished" podID="5987ad61-2878-4efc-98ca-ea29b123f26e" containerID="46baaf42142233869f49a3ed3725aeb263cb2291e27ce1211801aed7212ad955" exitCode=137 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.810414 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.814904 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819011 4743 generic.go:334] "Generic (PLEG): container finished" podID="8fe5d70f-5277-4803-ae45-de61d0eefe27" containerID="b87c8deb0c6f1f3c1134e38ff7289f1edfe2b60e90ae6b47a46057bcb212868c" exitCode=143 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819115 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" event={"ID":"8fe5d70f-5277-4803-ae45-de61d0eefe27","Type":"ContainerDied","Data":"b87c8deb0c6f1f3c1134e38ff7289f1edfe2b60e90ae6b47a46057bcb212868c"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819445 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870c700d-9095-4781-ab16-4cce25d24ed2-config\") pod \"870c700d-9095-4781-ab16-4cce25d24ed2\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819486 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"3a18c86e-9d86-49ee-918f-76de17000e18\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819516 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-combined-ca-bundle\") pod \"870c700d-9095-4781-ab16-4cce25d24ed2\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819635 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdbserver-sb-tls-certs\") pod \"3a18c86e-9d86-49ee-918f-76de17000e18\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819674 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch8hg\" (UniqueName: \"kubernetes.io/projected/3a18c86e-9d86-49ee-918f-76de17000e18-kube-api-access-ch8hg\") pod \"3a18c86e-9d86-49ee-918f-76de17000e18\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819714 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-config\") pod \"3a18c86e-9d86-49ee-918f-76de17000e18\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819733 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-metrics-certs-tls-certs\") pod \"870c700d-9095-4781-ab16-4cce25d24ed2\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819770 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdb-rundir\") pod 
\"3a18c86e-9d86-49ee-918f-76de17000e18\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819827 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-combined-ca-bundle\") pod \"3a18c86e-9d86-49ee-918f-76de17000e18\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819856 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovs-rundir\") pod \"870c700d-9095-4781-ab16-4cce25d24ed2\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819903 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-metrics-certs-tls-certs\") pod \"3a18c86e-9d86-49ee-918f-76de17000e18\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819933 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff594\" (UniqueName: \"kubernetes.io/projected/5987ad61-2878-4efc-98ca-ea29b123f26e-kube-api-access-ff594\") pod \"5987ad61-2878-4efc-98ca-ea29b123f26e\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819951 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config-secret\") pod \"5987ad61-2878-4efc-98ca-ea29b123f26e\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819973 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q7cl\" (UniqueName: \"kubernetes.io/projected/870c700d-9095-4781-ab16-4cce25d24ed2-kube-api-access-6q7cl\") pod \"870c700d-9095-4781-ab16-4cce25d24ed2\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.819990 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovn-rundir\") pod \"870c700d-9095-4781-ab16-4cce25d24ed2\" (UID: \"870c700d-9095-4781-ab16-4cce25d24ed2\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.820058 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config\") pod \"5987ad61-2878-4efc-98ca-ea29b123f26e\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.820084 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-combined-ca-bundle\") pod \"5987ad61-2878-4efc-98ca-ea29b123f26e\" (UID: \"5987ad61-2878-4efc-98ca-ea29b123f26e\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.820126 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-scripts\") pod \"3a18c86e-9d86-49ee-918f-76de17000e18\" (UID: \"3a18c86e-9d86-49ee-918f-76de17000e18\") " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.821386 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-scripts" (OuterVolumeSpecName: "scripts") pod "3a18c86e-9d86-49ee-918f-76de17000e18" (UID: "3a18c86e-9d86-49ee-918f-76de17000e18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.822188 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/870c700d-9095-4781-ab16-4cce25d24ed2-config" (OuterVolumeSpecName: "config") pod "870c700d-9095-4781-ab16-4cce25d24ed2" (UID: "870c700d-9095-4781-ab16-4cce25d24ed2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.823196 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "870c700d-9095-4781-ab16-4cce25d24ed2" (UID: "870c700d-9095-4781-ab16-4cce25d24ed2"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.823375 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-config" (OuterVolumeSpecName: "config") pod "3a18c86e-9d86-49ee-918f-76de17000e18" (UID: "3a18c86e-9d86-49ee-918f-76de17000e18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.824068 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "3a18c86e-9d86-49ee-918f-76de17000e18" (UID: "3a18c86e-9d86-49ee-918f-76de17000e18"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.824107 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "870c700d-9095-4781-ab16-4cce25d24ed2" (UID: "870c700d-9095-4781-ab16-4cce25d24ed2"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.826981 4743 generic.go:334] "Generic (PLEG): container finished" podID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.827059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mz9kc" event={"ID":"03685c6a-5ae9-45cf-b66d-5210d4811bda","Type":"ContainerDied","Data":"6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.836363 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a18c86e-9d86-49ee-918f-76de17000e18-kube-api-access-ch8hg" (OuterVolumeSpecName: "kube-api-access-ch8hg") pod "3a18c86e-9d86-49ee-918f-76de17000e18" (UID: "3a18c86e-9d86-49ee-918f-76de17000e18"). InnerVolumeSpecName "kube-api-access-ch8hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.843304 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "3a18c86e-9d86-49ee-918f-76de17000e18" (UID: "3a18c86e-9d86-49ee-918f-76de17000e18"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.843466 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/870c700d-9095-4781-ab16-4cce25d24ed2-kube-api-access-6q7cl" (OuterVolumeSpecName: "kube-api-access-6q7cl") pod "870c700d-9095-4781-ab16-4cce25d24ed2" (UID: "870c700d-9095-4781-ab16-4cce25d24ed2"). InnerVolumeSpecName "kube-api-access-6q7cl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.843690 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5987ad61-2878-4efc-98ca-ea29b123f26e-kube-api-access-ff594" (OuterVolumeSpecName: "kube-api-access-ff594") pod "5987ad61-2878-4efc-98ca-ea29b123f26e" (UID: "5987ad61-2878-4efc-98ca-ea29b123f26e"). InnerVolumeSpecName "kube-api-access-ff594". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.857032 4743 generic.go:334] "Generic (PLEG): container finished" podID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" containerID="69a331217c6e9870990cf0477268ec07b586afa72d1bd546c97e364e672bdc27" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.857063 4743 generic.go:334] "Generic (PLEG): container finished" podID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" containerID="ff8003a3594d25ec03aad9438f8a8b6e3c4495c012f444863e724569495817e4" exitCode=0 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.857176 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ff985d64c-mnpj5" event={"ID":"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba","Type":"ContainerDied","Data":"69a331217c6e9870990cf0477268ec07b586afa72d1bd546c97e364e672bdc27"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.857203 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ff985d64c-mnpj5" event={"ID":"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba","Type":"ContainerDied","Data":"ff8003a3594d25ec03aad9438f8a8b6e3c4495c012f444863e724569495817e4"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.861757 4743 generic.go:334] "Generic (PLEG): container finished" podID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerID="4b24ccee2f20c0c9bff9ed0577c5b13a5e4c322c8c14f5cae7487c6ed9272a36" exitCode=143 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.861831 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b24dd85-d686-4fb0-be74-7aca0b03255c","Type":"ContainerDied","Data":"4b24ccee2f20c0c9bff9ed0577c5b13a5e4c322c8c14f5cae7487c6ed9272a36"} Nov 22 08:45:53 crc kubenswrapper[4743]: W1122 08:45:53.862528 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7936e330_2138_4624_b319_902f6a4941ec.slice/crio-8e1f187c60412a1ebdfddabd0b735fcf4d5b9aa0f3a570813ea31e863abd76e3 WatchSource:0}: Error finding container 8e1f187c60412a1ebdfddabd0b735fcf4d5b9aa0f3a570813ea31e863abd76e3: Status 404 returned error can't find the container with id 8e1f187c60412a1ebdfddabd0b735fcf4d5b9aa0f3a570813ea31e863abd76e3 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.868863 4743 generic.go:334] "Generic (PLEG): container finished" podID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerID="0febb6e2d7ff4813fd6df7b99de1a803ade35dd751b487778b4585a6c0ce4d64" exitCode=143 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.868962 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" event={"ID":"dc034ce8-656e-4c88-92f1-18f384ae1a18","Type":"ContainerDied","Data":"0febb6e2d7ff4813fd6df7b99de1a803ade35dd751b487778b4585a6c0ce4d64"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.871217 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerName="cinder-api-log" containerID="cri-o://bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7" gracePeriod=30 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.871507 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-6t9hh" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.871711 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "870c700d-9095-4781-ab16-4cce25d24ed2" (UID: "870c700d-9095-4781-ab16-4cce25d24ed2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.871780 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-6t9hh" event={"ID":"5db10427-8546-4dea-b849-36bb02c837bd","Type":"ContainerDied","Data":"a5a07d940c5e01f9e847510d625c10e0ef683fe123f8c83ce7269a9f8c1f6185"} Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.872334 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerName="cinder-api" containerID="cri-o://85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895" gracePeriod=30 Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.923010 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.923032 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870c700d-9095-4781-ab16-4cce25d24ed2-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.923052 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.923063 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.923074 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch8hg\" (UniqueName: \"kubernetes.io/projected/3a18c86e-9d86-49ee-918f-76de17000e18-kube-api-access-ch8hg\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.923082 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a18c86e-9d86-49ee-918f-76de17000e18-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.923090 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.923098 4743 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovs-rundir\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.923106 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff594\" (UniqueName: \"kubernetes.io/projected/5987ad61-2878-4efc-98ca-ea29b123f26e-kube-api-access-ff594\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc 
kubenswrapper[4743]: I1122 08:45:53.923114 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q7cl\" (UniqueName: \"kubernetes.io/projected/870c700d-9095-4781-ab16-4cce25d24ed2-kube-api-access-6q7cl\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.923122 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/870c700d-9095-4781-ab16-4cce25d24ed2-ovn-rundir\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.937009 4743 scope.go:117] "RemoveContainer" containerID="85ca3e549dbcf99a2f9f8ce67ee485a73244e79666976bd6f0f2fe904f8d3d50" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.962892 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5987ad61-2878-4efc-98ca-ea29b123f26e" (UID: "5987ad61-2878-4efc-98ca-ea29b123f26e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.970941 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a18c86e-9d86-49ee-918f-76de17000e18" (UID: "3a18c86e-9d86-49ee-918f-76de17000e18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.978413 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-6t9hh"] Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.984939 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-6t9hh"] Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.994225 4743 scope.go:117] "RemoveContainer" containerID="46baaf42142233869f49a3ed3725aeb263cb2291e27ce1211801aed7212ad955" Nov 22 08:45:53 crc kubenswrapper[4743]: I1122 08:45:53.999278 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "5987ad61-2878-4efc-98ca-ea29b123f26e" (UID: "5987ad61-2878-4efc-98ca-ea29b123f26e"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.016066 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3a18c86e-9d86-49ee-918f-76de17000e18" (UID: "3a18c86e-9d86-49ee-918f-76de17000e18"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.018133 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.026072 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.026100 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.026112 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.026121 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.026131 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.026211 4743 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.026253 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:45:56.026239428 +0000 UTC m=+1429.732600480 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-config-data" not found Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.028009 4743 secret.go:188] Couldn't get secret openstack/cinder-api-config-data: secret "cinder-api-config-data" not found Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.028076 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:45:56.028057201 +0000 UTC m=+1429.734418243 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-api-config-data" not found Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.028511 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.034874 4743 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.034969 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:45:56.03495011 +0000 UTC m=+1429.741311162 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-scripts" not found Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.041460 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.046068 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.047786 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.047816 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="500679c5-1691-4831-b5ec-3c6cce19c503" containerName="nova-cell0-conductor-conductor" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.052297 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "870c700d-9095-4781-ab16-4cce25d24ed2" (UID: "870c700d-9095-4781-ab16-4cce25d24ed2"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.074537 4743 scope.go:117] "RemoveContainer" containerID="6c0db6fc539fa60e13ce74a4129ccda55a6455c6da77fd7c2a15ec935e19f792" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.101124 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "3a18c86e-9d86-49ee-918f-76de17000e18" (UID: "3a18c86e-9d86-49ee-918f-76de17000e18"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.111049 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement0984-account-delete-82zvj"] Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.132611 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-svc\") pod \"aab079ae-b574-40f3-8df0-7deff1356e09\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.132666 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnkgq\" (UniqueName: \"kubernetes.io/projected/aab079ae-b574-40f3-8df0-7deff1356e09-kube-api-access-tnkgq\") pod \"aab079ae-b574-40f3-8df0-7deff1356e09\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.132709 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-sb\") pod \"aab079ae-b574-40f3-8df0-7deff1356e09\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.132746 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-swift-storage-0\") pod \"aab079ae-b574-40f3-8df0-7deff1356e09\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.132782 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-config\") pod \"aab079ae-b574-40f3-8df0-7deff1356e09\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.132903 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-nb\") pod \"aab079ae-b574-40f3-8df0-7deff1356e09\" (UID: \"aab079ae-b574-40f3-8df0-7deff1356e09\") " Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.133270 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a18c86e-9d86-49ee-918f-76de17000e18-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.133286 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/870c700d-9095-4781-ab16-4cce25d24ed2-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" 
Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.136255 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican675b-account-delete-bdj7v"]
Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.150767 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab079ae-b574-40f3-8df0-7deff1356e09-kube-api-access-tnkgq" (OuterVolumeSpecName: "kube-api-access-tnkgq") pod "aab079ae-b574-40f3-8df0-7deff1356e09" (UID: "aab079ae-b574-40f3-8df0-7deff1356e09"). InnerVolumeSpecName "kube-api-access-tnkgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.185191 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "5987ad61-2878-4efc-98ca-ea29b123f26e" (UID: "5987ad61-2878-4efc-98ca-ea29b123f26e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.235041 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5987ad61-2878-4efc-98ca-ea29b123f26e-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.235077 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnkgq\" (UniqueName: \"kubernetes.io/projected/aab079ae-b574-40f3-8df0-7deff1356e09-kube-api-access-tnkgq\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.242593 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron8dc4-account-delete-rtl4b"]
Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.271273 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1c099-account-delete-lqvbc"]
Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.336769 4743 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.337011 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data podName:ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1 nodeName:}" failed. No retries permitted until 2025-11-22 08:45:58.336994117 +0000 UTC m=+1432.043355169 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data") pod "rabbitmq-cell1-server-0" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1") : configmap "rabbitmq-cell1-config-data" not found
Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.438215 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell0ad52-account-delete-9r4lw"]
Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.439532 4743 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found
Nov 22 08:45:54 crc kubenswrapper[4743]: E1122 08:45:54.440210 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts podName:d96211ff-f7ba-4e26-ae39-43c8062e2277 nodeName:}" failed. No retries permitted until 2025-11-22 08:45:56.440180101 +0000 UTC m=+1430.146541153 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts") pod "novacell1c099-account-delete-lqvbc" (UID: "d96211ff-f7ba-4e26-ae39-43c8062e2277") : configmap "openstack-cell1-scripts" not found
Nov 22 08:45:54 crc kubenswrapper[4743]: W1122 08:45:54.457050 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf48bbac5_2782_4c1e_b74b_520f0457f9ac.slice/crio-c88d4928e4941bbba968e503dcd1361302276773fdecc26ed008af20ac9e5165 WatchSource:0}: Error finding container c88d4928e4941bbba968e503dcd1361302276773fdecc26ed008af20ac9e5165: Status 404 returned error can't find the container with id c88d4928e4941bbba968e503dcd1361302276773fdecc26ed008af20ac9e5165
Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.459758 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance4a60-account-delete-j4cg4"]
Nov 22 08:45:54 crc kubenswrapper[4743]: W1122 08:45:54.498096 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fca29fd_c34f_4954_960f_b5ca0812d5b0.slice/crio-6d6eeaaa805b28bfc333c74829288c5cc15dca9e40890f87a5ab4f98de096a46 WatchSource:0}: Error finding container 6d6eeaaa805b28bfc333c74829288c5cc15dca9e40890f87a5ab4f98de096a46: Status 404 returned error can't find the container with id 6d6eeaaa805b28bfc333c74829288c5cc15dca9e40890f87a5ab4f98de096a46
Nov 22 08:45:54 crc kubenswrapper[4743]: I1122 08:45:54.504034 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aab079ae-b574-40f3-8df0-7deff1356e09" (UID: "aab079ae-b574-40f3-8df0-7deff1356e09"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.550854 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aab079ae-b574-40f3-8df0-7deff1356e09" (UID: "aab079ae-b574-40f3-8df0-7deff1356e09"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.551180 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-config" (OuterVolumeSpecName: "config") pod "aab079ae-b574-40f3-8df0-7deff1356e09" (UID: "aab079ae-b574-40f3-8df0-7deff1356e09"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.557770 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.557795 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.557806 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-config\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.567797 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aab079ae-b574-40f3-8df0-7deff1356e09" (UID: "aab079ae-b574-40f3-8df0-7deff1356e09"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.611440 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi1d8b-account-delete-scjlt"]
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.630056 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aab079ae-b574-40f3-8df0-7deff1356e09" (UID: "aab079ae-b574-40f3-8df0-7deff1356e09"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.661177 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.661201 4743 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aab079ae-b574-40f3-8df0-7deff1356e09-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: E1122 08:45:54.864635 4743 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Nov 22 08:45:55 crc kubenswrapper[4743]: E1122 08:45:54.864703 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data podName:e5fac46a-545d-4f30-a7ab-8f5e713e934d nodeName:}" failed. No retries permitted until 2025-11-22 08:45:58.864686109 +0000 UTC m=+1432.571047161 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data") pod "rabbitmq-server-0" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d") : configmap "rabbitmq-config-data" not found
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.907694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1c099-account-delete-lqvbc" event={"ID":"d96211ff-f7ba-4e26-ae39-43c8062e2277","Type":"ContainerStarted","Data":"a2423c2f3a40b761371f356af39a17b1fe4f3bcaae35153ffbd526cf8a43e870"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.925321 4743 generic.go:334] "Generic (PLEG): container finished" podID="7936e330-2138-4624-b319-902f6a4941ec" containerID="e3311847d040f49ee2f688d4b62c7bd813884fd1d9ee40438d2d48f3bbbd5240" exitCode=0
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.925410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder89a7-account-delete-gjvvg" event={"ID":"7936e330-2138-4624-b319-902f6a4941ec","Type":"ContainerDied","Data":"e3311847d040f49ee2f688d4b62c7bd813884fd1d9ee40438d2d48f3bbbd5240"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.925460 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder89a7-account-delete-gjvvg" event={"ID":"7936e330-2138-4624-b319-902f6a4941ec","Type":"ContainerStarted","Data":"8e1f187c60412a1ebdfddabd0b735fcf4d5b9aa0f3a570813ea31e863abd76e3"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.939912 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement0984-account-delete-82zvj" event={"ID":"9375da2b-3776-4c32-8afd-d1ed7b22b308","Type":"ContainerStarted","Data":"5831c8101ffb619513c175100e0c23fc1f2a816be37dfd715e1f552d2a6971be"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.955195 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_3a18c86e-9d86-49ee-918f-76de17000e18/ovsdbserver-sb/0.log"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.955291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"3a18c86e-9d86-49ee-918f-76de17000e18","Type":"ContainerDied","Data":"f40b416d068b4981562f93627cae411148e6542e542f0db18ab24e7e66969d03"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.955333 4743 scope.go:117] "RemoveContainer" containerID="1673fc12fab7560971cb866548c9e23899260f2050054ca6370d37795dfdb742"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.955348 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.965737 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0ad52-account-delete-9r4lw" event={"ID":"5fca29fd-c34f-4954-960f-b5ca0812d5b0","Type":"ContainerStarted","Data":"6d6eeaaa805b28bfc333c74829288c5cc15dca9e40890f87a5ab4f98de096a46"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.975957 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron8dc4-account-delete-rtl4b" event={"ID":"f6d1b00d-147b-4865-b659-59d06f360797","Type":"ContainerStarted","Data":"e0b17a045985a2ee9326dd2c57115606fefec5c4be6421e1f11446a84efc107c"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.978307 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican675b-account-delete-bdj7v" event={"ID":"5310c975-ef7b-4161-ab2e-5ee94b709f9d","Type":"ContainerStarted","Data":"94ab297637887ec62b68254745a2399c55743ad3c0e45375143ff38f5eae56ed"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.982203 4743 generic.go:334] "Generic (PLEG): container finished" podID="8fe5d70f-5277-4803-ae45-de61d0eefe27" containerID="596f95b1d0cc9abb230b4c2a4a8c4b0c1af12cc6eed82f9960b7ca6e13289379" exitCode=0
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:54.982304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" event={"ID":"8fe5d70f-5277-4803-ae45-de61d0eefe27","Type":"ContainerDied","Data":"596f95b1d0cc9abb230b4c2a4a8c4b0c1af12cc6eed82f9960b7ca6e13289379"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.004982 4743 generic.go:334] "Generic (PLEG): container finished" podID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerID="bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7" exitCode=143
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.005060 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29bf9036-d8fc-43f7-9153-f133a723c6df","Type":"ContainerDied","Data":"bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.009871 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.009827 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" event={"ID":"aab079ae-b574-40f3-8df0-7deff1356e09","Type":"ContainerDied","Data":"cc5f1d9b8e5ad4aaa231f5aea0ea23e9d8319e0b69a165218b9b2b15bffe5323"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.019460 4743 generic.go:334] "Generic (PLEG): container finished" podID="b7be7b8b-96eb-40fb-98b2-bc33e2154343" containerID="d5f358c51f3837120bf2786591f156a51d70cdabcf793d05895bd486bf90bd29" exitCode=0
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.019552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7be7b8b-96eb-40fb-98b2-bc33e2154343","Type":"ContainerDied","Data":"d5f358c51f3837120bf2786591f156a51d70cdabcf793d05895bd486bf90bd29"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.019608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7be7b8b-96eb-40fb-98b2-bc33e2154343","Type":"ContainerDied","Data":"a516b410fb650b2b8b6e948a05b7feaddc7628bdd3f6fb42e7660e03ec892467"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.019624 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a516b410fb650b2b8b6e948a05b7feaddc7628bdd3f6fb42e7660e03ec892467"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.021553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4a60-account-delete-j4cg4" event={"ID":"f48bbac5-2782-4c1e-b74b-520f0457f9ac","Type":"ContainerStarted","Data":"c88d4928e4941bbba968e503dcd1361302276773fdecc26ed008af20ac9e5165"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.024558 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi1d8b-account-delete-scjlt" event={"ID":"30ee548a-8838-4d52-867b-4dfdb6c4f641","Type":"ContainerStarted","Data":"55d8bb28b2a06d9eeab5de7f880cb8bbdefa5f0995dd6971a51f9f0b5040e274"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.030713 4743 generic.go:334] "Generic (PLEG): container finished" podID="29734ea4-591c-478e-8030-55fcbac72d3a" containerID="c89c36b86576b82473835bf0c40ac138380844e7593a7241e2bf4b37e98aadf1" exitCode=0
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.030790 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"29734ea4-591c-478e-8030-55fcbac72d3a","Type":"ContainerDied","Data":"c89c36b86576b82473835bf0c40ac138380844e7593a7241e2bf4b37e98aadf1"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.033856 4743 generic.go:334] "Generic (PLEG): container finished" podID="145d3340-8ded-4082-b9c8-7b1a21390097" containerID="fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af" exitCode=0
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.033916 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"145d3340-8ded-4082-b9c8-7b1a21390097","Type":"ContainerDied","Data":"fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.035858 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5ff985d64c-mnpj5" event={"ID":"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba","Type":"ContainerDied","Data":"61d37ec764385fefd39781c27a55f0988a7bc61d3a51286576ac5cb2f4e35267"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.035881 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61d37ec764385fefd39781c27a55f0988a7bc61d3a51286576ac5cb2f4e35267"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.044000 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-7qctt_870c700d-9095-4781-ab16-4cce25d24ed2/openstack-network-exporter/0.log"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.044059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-7qctt" event={"ID":"870c700d-9095-4781-ab16-4cce25d24ed2","Type":"ContainerDied","Data":"0bab0ae56585ab496c78b427d6c0cc668cf56d71b1d51bc48c30ab8fbc9736d7"}
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.044141 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7qctt"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.100206 4743 scope.go:117] "RemoveContainer" containerID="144ff3f3018a5c4ab62f7b7b6d9306bb85c460947fb43313619911edc249e05e"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.184303 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="041f321c-a19a-46ba-83e0-5934dd806565" path="/var/lib/kubelet/pods/041f321c-a19a-46ba-83e0-5934dd806565/volumes"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.187171 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" path="/var/lib/kubelet/pods/4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d/volumes"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.188269 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5987ad61-2878-4efc-98ca-ea29b123f26e" path="/var/lib/kubelet/pods/5987ad61-2878-4efc-98ca-ea29b123f26e/volumes"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.189081 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db10427-8546-4dea-b849-36bb02c837bd" path="/var/lib/kubelet/pods/5db10427-8546-4dea-b849-36bb02c837bd/volumes"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.204411 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5ff985d64c-mnpj5"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.224361 4743 scope.go:117] "RemoveContainer" containerID="b976e4375d0ed0f34734cc0cfaefc7f462784997b77abe216f7a1ac9164a5a03"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.376803 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-run-httpd\") pod \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.376905 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n52xh\" (UniqueName: \"kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-kube-api-access-n52xh\") pod \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.376977 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-internal-tls-certs\") pod \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.377046 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-log-httpd\") pod \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.377075 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-config-data\") pod \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.377186 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-combined-ca-bundle\") pod \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.377239 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-etc-swift\") pod \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.377301 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-public-tls-certs\") pod \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\" (UID: \"178ccbe4-360f-4a0d-b97c-edf5b8b8dcba\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.379597 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" (UID: "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.379775 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" (UID: "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.382691 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-kube-api-access-n52xh" (OuterVolumeSpecName: "kube-api-access-n52xh") pod "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" (UID: "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba"). InnerVolumeSpecName "kube-api-access-n52xh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.383467 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" (UID: "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.411865 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 08:45:55 crc kubenswrapper[4743]: E1122 08:45:55.424912 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722 is running failed: container process not found" containerID="a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 22 08:45:55 crc kubenswrapper[4743]: E1122 08:45:55.432224 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722 is running failed: container process not found" containerID="a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 22 08:45:55 crc kubenswrapper[4743]: E1122 08:45:55.434157 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722 is running failed: container process not found" containerID="a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 22 08:45:55 crc kubenswrapper[4743]: E1122 08:45:55.434379 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="da8955a2-6deb-440c-97e3-f2420aa5fae8" containerName="nova-scheduler-scheduler"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.458205 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" (UID: "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.460426 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-config-data" (OuterVolumeSpecName: "config-data") pod "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" (UID: "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.478544 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" (UID: "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.480443 4743 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-etc-swift\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.480470 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.480484 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-run-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.480497 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n52xh\" (UniqueName: \"kubernetes.io/projected/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-kube-api-access-n52xh\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.480508 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.480517 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-log-httpd\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.480527 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.509057 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" (UID: "178ccbe4-360f-4a0d-b97c-edf5b8b8dcba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.582511 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-combined-ca-bundle\") pod \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.582686 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-vencrypt-tls-certs\") pod \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.582761 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-nova-novncproxy-tls-certs\") pod \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.582842 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-config-data\") pod \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.583108 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk94d\" (UniqueName: \"kubernetes.io/projected/b7be7b8b-96eb-40fb-98b2-bc33e2154343-kube-api-access-mk94d\") pod \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\" (UID: \"b7be7b8b-96eb-40fb-98b2-bc33e2154343\") "
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.583625 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.589907 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7be7b8b-96eb-40fb-98b2-bc33e2154343-kube-api-access-mk94d" (OuterVolumeSpecName: "kube-api-access-mk94d") pod "b7be7b8b-96eb-40fb-98b2-bc33e2154343" (UID: "b7be7b8b-96eb-40fb-98b2-bc33e2154343"). InnerVolumeSpecName "kube-api-access-mk94d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.595410 4743 scope.go:117] "RemoveContainer" containerID="c84bf01830ea8518e6ca660ac284815461a856f5cc89ce3d25184993d73472c4"
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.610277 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-config-data" (OuterVolumeSpecName: "config-data") pod "b7be7b8b-96eb-40fb-98b2-bc33e2154343" (UID: "b7be7b8b-96eb-40fb-98b2-bc33e2154343"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.616797 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7be7b8b-96eb-40fb-98b2-bc33e2154343" (UID: "b7be7b8b-96eb-40fb-98b2-bc33e2154343").
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.638500 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "b7be7b8b-96eb-40fb-98b2-bc33e2154343" (UID: "b7be7b8b-96eb-40fb-98b2-bc33e2154343"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.674842 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "b7be7b8b-96eb-40fb-98b2-bc33e2154343" (UID: "b7be7b8b-96eb-40fb-98b2-bc33e2154343"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.685214 4743 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.685469 4743 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.685541 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.685624 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk94d\" (UniqueName: \"kubernetes.io/projected/b7be7b8b-96eb-40fb-98b2-bc33e2154343-kube-api-access-mk94d\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.685707 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7be7b8b-96eb-40fb-98b2-bc33e2154343-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.744076 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.744346 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="ceilometer-central-agent" containerID="cri-o://d286017c65a937c25c9d60c902e5c3e3d08c52047202b26fc6832bb9d0b90d7b" gracePeriod=30 Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.744729 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="sg-core" containerID="cri-o://0fc045e12216ec39ccb7f9f0e77c526341966c30dccbf1e54a1b50e33279ef52" gracePeriod=30 Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.744781 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="ceilometer-notification-agent" 
containerID="cri-o://37fbdedccc446a670cae08eb9632f568f7a7eccdeada9e4e6146ebf59a36b2e3" gracePeriod=30 Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.744940 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="proxy-httpd" containerID="cri-o://3a8d852d3f689474ae890d019ad247d38067084ae22ca4aca5bee80eb525bd90" gracePeriod=30 Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.775426 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.775646 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d3a93a60-b315-4de2-96d7-d23c9cedbc9c" containerName="kube-state-metrics" containerID="cri-o://4f25a424241601e91504248ab884e7a0f9860edb39f6eab7afdb79fa3b729315" gracePeriod=30 Nov 22 08:45:55 crc kubenswrapper[4743]: E1122 08:45:55.810347 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd43f52e71d508747b99d25448ea2492e1fc68d783ff2250c3357c3281ede81e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 22 08:45:55 crc kubenswrapper[4743]: E1122 08:45:55.820502 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd43f52e71d508747b99d25448ea2492e1fc68d783ff2250c3357c3281ede81e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 22 08:45:55 crc kubenswrapper[4743]: E1122 08:45:55.823073 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="fd43f52e71d508747b99d25448ea2492e1fc68d783ff2250c3357c3281ede81e" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 22 08:45:55 crc kubenswrapper[4743]: E1122 08:45:55.823128 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="11c59cd3-7ee4-43f3-83ce-9d22824473d7" containerName="nova-cell1-conductor-conductor" Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.846547 4743 scope.go:117] "RemoveContainer" containerID="544ee1789868b1e74b94d551ca242dd748b844448c139b17d4767a8ea19814b9" Nov 22 08:45:55 crc kubenswrapper[4743]: I1122 08:45:55.850781 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:55.897105 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:55.897335 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="861e40f8-c596-40a1-b192-2fa51f567b55" containerName="memcached" containerID="cri-o://05a05410e859d0aa129b195d9be97306e21232ea6ee301f84157b08562c6ad1b" gracePeriod=30 Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:55.946734 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="44e22b0e556cf479c4ab148fe02b8b602f8d6a658164bfd210e15bbe9a5c9282" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:55.968555 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="44e22b0e556cf479c4ab148fe02b8b602f8d6a658164bfd210e15bbe9a5c9282" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:55.968813 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sz5kf"] Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:55.971728 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="44e22b0e556cf479c4ab148fe02b8b602f8d6a658164bfd210e15bbe9a5c9282" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:55.971790 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="ovn-northd" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.000867 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-kolla-config\") pod \"29734ea4-591c-478e-8030-55fcbac72d3a\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.000958 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-generated\") pod \"29734ea4-591c-478e-8030-55fcbac72d3a\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.001038 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnwx4\" (UniqueName: \"kubernetes.io/projected/29734ea4-591c-478e-8030-55fcbac72d3a-kube-api-access-mnwx4\") pod \"29734ea4-591c-478e-8030-55fcbac72d3a\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.001072 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-operator-scripts\") pod \"29734ea4-591c-478e-8030-55fcbac72d3a\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.001089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-galera-tls-certs\") pod \"29734ea4-591c-478e-8030-55fcbac72d3a\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.001106 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-default\") pod \"29734ea4-591c-478e-8030-55fcbac72d3a\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.001188 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-combined-ca-bundle\") pod \"29734ea4-591c-478e-8030-55fcbac72d3a\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.001214 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"29734ea4-591c-478e-8030-55fcbac72d3a\" (UID: \"29734ea4-591c-478e-8030-55fcbac72d3a\") " Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.004476 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29734ea4-591c-478e-8030-55fcbac72d3a" (UID: "29734ea4-591c-478e-8030-55fcbac72d3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.005320 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "29734ea4-591c-478e-8030-55fcbac72d3a" (UID: "29734ea4-591c-478e-8030-55fcbac72d3a"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.005686 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "29734ea4-591c-478e-8030-55fcbac72d3a" (UID: "29734ea4-591c-478e-8030-55fcbac72d3a"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.005729 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sz5kf"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.007794 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "29734ea4-591c-478e-8030-55fcbac72d3a" (UID: "29734ea4-591c-478e-8030-55fcbac72d3a"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.010552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29734ea4-591c-478e-8030-55fcbac72d3a-kube-api-access-mnwx4" (OuterVolumeSpecName: "kube-api-access-mnwx4") pod "29734ea4-591c-478e-8030-55fcbac72d3a" (UID: "29734ea4-591c-478e-8030-55fcbac72d3a"). InnerVolumeSpecName "kube-api-access-mnwx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.032867 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "mysql-db") pod "29734ea4-591c-478e-8030-55fcbac72d3a" (UID: "29734ea4-591c-478e-8030-55fcbac72d3a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.040661 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jqcdm"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.050300 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29734ea4-591c-478e-8030-55fcbac72d3a" (UID: "29734ea4-591c-478e-8030-55fcbac72d3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.051253 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jqcdm"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.061181 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.074787 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-66c4f7f76d-b9q4p"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.074992 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-66c4f7f76d-b9q4p" podUID="f5b21104-eefe-4583-9af8-731d561b78c2" containerName="keystone-api" containerID="cri-o://8e1277095f530b9d213cf681f4500af6bf174fddfe554f6556001dae2a813e03" gracePeriod=30 Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.097782 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-s6m4s"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.102251 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "29734ea4-591c-478e-8030-55fcbac72d3a" (UID: "29734ea4-591c-478e-8030-55fcbac72d3a"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.103594 4743 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.103616 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.103625 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29734ea4-591c-478e-8030-55fcbac72d3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.103646 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.103763 4743 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.103773 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/29734ea4-591c-478e-8030-55fcbac72d3a-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.103782 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnwx4\" (UniqueName: \"kubernetes.io/projected/29734ea4-591c-478e-8030-55fcbac72d3a-kube-api-access-mnwx4\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.103790 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29734ea4-591c-478e-8030-55fcbac72d3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:56.109868 4743 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:56.109955 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:46:00.109931155 +0000 UTC m=+1433.816292207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-scripts" not found Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:56.109968 4743 secret.go:188] Couldn't get secret openstack/cinder-api-config-data: secret "cinder-api-config-data" not found Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:56.110024 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:46:00.110007457 +0000 UTC m=+1433.816368509 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-api-config-data" not found Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:56.110064 4743 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:56.110085 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data podName:29bf9036-d8fc-43f7-9153-f133a723c6df nodeName:}" failed. No retries permitted until 2025-11-22 08:46:00.110076439 +0000 UTC m=+1433.816437491 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data") pod "cinder-api-0" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df") : secret "cinder-config-data" not found Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.112202 4743 generic.go:334] "Generic (PLEG): container finished" podID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" containerID="12838b3e542aa21904acab03d6b27d30ec54f1471909fca6df88ff3e1aee935d" exitCode=0 Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.112250 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84df6c6d8d-v9vxr" event={"ID":"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca","Type":"ContainerDied","Data":"12838b3e542aa21904acab03d6b27d30ec54f1471909fca6df88ff3e1aee935d"} Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.117327 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-s6m4s"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.129360 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.129418 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f92e-account-create-f4xn7"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.138008 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f92e-account-create-f4xn7"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.141364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" event={"ID":"8fe5d70f-5277-4803-ae45-de61d0eefe27","Type":"ContainerDied","Data":"0ba3551be192c9ca67dafe3b3334189b068599cd139082c54c7923a0610758ca"} Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.141403 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ba3551be192c9ca67dafe3b3334189b068599cd139082c54c7923a0610758ca" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.144675 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-svwz4"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.159105 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-svwz4"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.172206 4743 generic.go:334] "Generic (PLEG): container finished" podID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerID="0fc045e12216ec39ccb7f9f0e77c526341966c30dccbf1e54a1b50e33279ef52" exitCode=2 Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.172264 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58b7a46d-98c7-4e9e-94df-80d359fd68c7","Type":"ContainerDied","Data":"0fc045e12216ec39ccb7f9f0e77c526341966c30dccbf1e54a1b50e33279ef52"} Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.180926 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican675b-account-delete-bdj7v"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.187710 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-675b-account-create-t7bnk"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.193893 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-675b-account-create-t7bnk"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.197920 4743 generic.go:334] "Generic (PLEG): container finished" podID="da8955a2-6deb-440c-97e3-f2420aa5fae8" containerID="a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722" exitCode=0 Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.198019 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da8955a2-6deb-440c-97e3-f2420aa5fae8","Type":"ContainerDied","Data":"a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722"} Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.198048 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"da8955a2-6deb-440c-97e3-f2420aa5fae8","Type":"ContainerDied","Data":"66d5cb532dcc036751a45bee87df23976c1ef63228dbdb8d9b4db6f138df1e7a"} Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.198062 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d5cb532dcc036751a45bee87df23976c1ef63228dbdb8d9b4db6f138df1e7a" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.208238 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.225909 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"29734ea4-591c-478e-8030-55fcbac72d3a","Type":"ContainerDied","Data":"d608b4c2033083afc4449c6a3e9fb408094fd3a4c09eb692f1c19645b680c841"} Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.226004 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.250543 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fs8kc"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.251115 4743 generic.go:334] "Generic (PLEG): container finished" podID="d3a93a60-b315-4de2-96d7-d23c9cedbc9c" containerID="4f25a424241601e91504248ab884e7a0f9860edb39f6eab7afdb79fa3b729315" exitCode=2 Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.251264 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3a93a60-b315-4de2-96d7-d23c9cedbc9c","Type":"ContainerDied","Data":"4f25a424241601e91504248ab884e7a0f9860edb39f6eab7afdb79fa3b729315"} Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.251316 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5ff985d64c-mnpj5" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.255933 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.258270 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fs8kc"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.273639 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron8dc4-account-delete-rtl4b"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.283543 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8dc4-account-create-f65gm"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.288454 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8dc4-account-create-f65gm"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.388315 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-c4zmd"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.402522 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-c4zmd"] Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:56.472273 4743 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Nov 22 08:45:56 crc kubenswrapper[4743]: E1122 08:45:56.472342 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts podName:d96211ff-f7ba-4e26-ae39-43c8062e2277 nodeName:}" failed. No retries permitted until 2025-11-22 08:46:00.472327812 +0000 UTC m=+1434.178688864 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts") pod "novacell1c099-account-delete-lqvbc" (UID: "d96211ff-f7ba-4e26-ae39-43c8062e2277") : configmap "openstack-cell1-scripts" not found Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.540771 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance4a60-account-delete-j4cg4"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.561857 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-4a60-account-create-qgm25"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.586479 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-4a60-account-create-qgm25"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.639727 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-mwrrj"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.654155 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-mwrrj"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.662071 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ad52-account-create-gpn4k"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.672786 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ad52-account-create-gpn4k"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.681219 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0ad52-account-delete-9r4lw"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.691583 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-449vs"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.708691 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-449vs"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.713656 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1d8b-account-create-x94zr"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.723269 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="d3d63130-217d-400e-afc5-6b6bb3d56658" containerName="galera" containerID="cri-o://9b8889acf70f714f96a12fef606aa84ad0497560bdf80aa3dddac96e77684d76" gracePeriod=30 Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.723408 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi1d8b-account-delete-scjlt"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.740175 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1d8b-account-create-x94zr"] Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.958829 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": dial tcp 10.217.0.159:9311: connect: connection refused" Nov 22 08:45:56 crc kubenswrapper[4743]: I1122 08:45:56.958829 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": dial tcp 10.217.0.159:9311: connect: connection 
refused" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.150715 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": read tcp 10.217.0.2:45294->10.217.0.163:8776: read: connection reset by peer" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.180166 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131c329f-8c7c-4d30-a0c3-37ecaac9db82" path="/var/lib/kubelet/pods/131c329f-8c7c-4d30-a0c3-37ecaac9db82/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.180809 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1520cff6-cefe-47d7-bce3-1c80dd5eb3dc" path="/var/lib/kubelet/pods/1520cff6-cefe-47d7-bce3-1c80dd5eb3dc/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.181288 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b40c85c-3938-405e-902c-c4ea5a19fe20" path="/var/lib/kubelet/pods/1b40c85c-3938-405e-902c-c4ea5a19fe20/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.181787 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf0a98f-65ac-4997-a98c-fb20ef181219" path="/var/lib/kubelet/pods/1bf0a98f-65ac-4997-a98c-fb20ef181219/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.182707 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c41a20c-5a07-4187-bb4f-3f900256ea49" path="/var/lib/kubelet/pods/1c41a20c-5a07-4187-bb4f-3f900256ea49/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.183168 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e8c36ca-0f50-4702-ad97-b1956797b4ab" path="/var/lib/kubelet/pods/2e8c36ca-0f50-4702-ad97-b1956797b4ab/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.183653 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a314de-cc63-4bd3-9abc-aaa38391e873" path="/var/lib/kubelet/pods/31a314de-cc63-4bd3-9abc-aaa38391e873/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.196318 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3633999d-a3b2-483d-9ca9-601350b07e59" path="/var/lib/kubelet/pods/3633999d-a3b2-483d-9ca9-601350b07e59/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.196918 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f4c19ea-fd20-422d-a4c0-efce91c256fc" path="/var/lib/kubelet/pods/5f4c19ea-fd20-422d-a4c0-efce91c256fc/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.197394 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84edae0e-41a9-42b0-a1bc-1a303dc92946" path="/var/lib/kubelet/pods/84edae0e-41a9-42b0-a1bc-1a303dc92946/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.198319 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86aa9353-ef38-45ec-8e1f-12f3ec108756" path="/var/lib/kubelet/pods/86aa9353-ef38-45ec-8e1f-12f3ec108756/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.198810 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b429a37f-69bf-4a7d-93c4-a3fa043b5f9b" path="/var/lib/kubelet/pods/b429a37f-69bf-4a7d-93c4-a3fa043b5f9b/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.199281 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b624f29e-b759-4767-ab76-de4d94d4e2af" path="/var/lib/kubelet/pods/b624f29e-b759-4767-ab76-de4d94d4e2af/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.199805 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5bb8cbe-6922-4961-9327-f3711da41234" path="/var/lib/kubelet/pods/e5bb8cbe-6922-4961-9327-f3711da41234/volumes" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.266429 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84df6c6d8d-v9vxr" event={"ID":"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca","Type":"ContainerDied","Data":"308da01ed407044c48117ec420e360d5dda27692aad8d33916b3bb6c489ba6ad"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.266476 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="308da01ed407044c48117ec420e360d5dda27692aad8d33916b3bb6c489ba6ad" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.271831 4743 generic.go:334] "Generic (PLEG): container finished" podID="c61760fb-827b-4199-bfdb-52698c7b4824" containerID="f148d19bec9da2034a614aa3685da5500ee102ac2162e404c9df6a8dd6001346" exitCode=0 Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.271889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c61760fb-827b-4199-bfdb-52698c7b4824","Type":"ContainerDied","Data":"f148d19bec9da2034a614aa3685da5500ee102ac2162e404c9df6a8dd6001346"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.284002 4743 generic.go:334] "Generic (PLEG): container finished" podID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerID="3a8d852d3f689474ae890d019ad247d38067084ae22ca4aca5bee80eb525bd90" exitCode=0 Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.284389 4743 generic.go:334] "Generic (PLEG): container finished" podID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerID="d286017c65a937c25c9d60c902e5c3e3d08c52047202b26fc6832bb9d0b90d7b" exitCode=0 Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.284246 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58b7a46d-98c7-4e9e-94df-80d359fd68c7","Type":"ContainerDied","Data":"3a8d852d3f689474ae890d019ad247d38067084ae22ca4aca5bee80eb525bd90"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.284644 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58b7a46d-98c7-4e9e-94df-80d359fd68c7","Type":"ContainerDied","Data":"d286017c65a937c25c9d60c902e5c3e3d08c52047202b26fc6832bb9d0b90d7b"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.292942 4743 generic.go:334] "Generic (PLEG): container finished" podID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" containerID="6671bb99b39fc16a0f6c253ac0e494e49254030b8ca083c6f60cb786f074a063" exitCode=0 Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.293791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dca6d95c-89d6-4b49-bf28-2606b9b5c05e","Type":"ContainerDied","Data":"6671bb99b39fc16a0f6c253ac0e494e49254030b8ca083c6f60cb786f074a063"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.302289 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3a93a60-b315-4de2-96d7-d23c9cedbc9c","Type":"ContainerDied","Data":"9599f4fa0992155906bf350a593ed3aa398353ce4ff4f43f444ac8fe006585ac"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.302344 4743 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="9599f4fa0992155906bf350a593ed3aa398353ce4ff4f43f444ac8fe006585ac" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.325396 4743 generic.go:334] "Generic (PLEG): container finished" podID="11c59cd3-7ee4-43f3-83ce-9d22824473d7" containerID="fd43f52e71d508747b99d25448ea2492e1fc68d783ff2250c3357c3281ede81e" exitCode=0 Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.325496 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"11c59cd3-7ee4-43f3-83ce-9d22824473d7","Type":"ContainerDied","Data":"fd43f52e71d508747b99d25448ea2492e1fc68d783ff2250c3357c3281ede81e"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.334352 4743 scope.go:117] "RemoveContainer" containerID="c89c36b86576b82473835bf0c40ac138380844e7593a7241e2bf4b37e98aadf1" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.351857 4743 generic.go:334] "Generic (PLEG): container finished" podID="861e40f8-c596-40a1-b192-2fa51f567b55" containerID="05a05410e859d0aa129b195d9be97306e21232ea6ee301f84157b08562c6ad1b" exitCode=0 Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.351955 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"861e40f8-c596-40a1-b192-2fa51f567b55","Type":"ContainerDied","Data":"05a05410e859d0aa129b195d9be97306e21232ea6ee301f84157b08562c6ad1b"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.355963 4743 generic.go:334] "Generic (PLEG): container finished" podID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerID="31240114f37ac66a6ac0ee75966656d89b98f5b65714a820dfeae421393b0b13" exitCode=0 Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.356061 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b24dd85-d686-4fb0-be74-7aca0b03255c","Type":"ContainerDied","Data":"31240114f37ac66a6ac0ee75966656d89b98f5b65714a820dfeae421393b0b13"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.356112 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2b24dd85-d686-4fb0-be74-7aca0b03255c","Type":"ContainerDied","Data":"8f61063094b3f711e1b7cccbf81e7095d03423e00dea436112bcaae08b1a86c9"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.356123 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f61063094b3f711e1b7cccbf81e7095d03423e00dea436112bcaae08b1a86c9" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.364644 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder89a7-account-delete-gjvvg" event={"ID":"7936e330-2138-4624-b319-902f6a4941ec","Type":"ContainerDied","Data":"8e1f187c60412a1ebdfddabd0b735fcf4d5b9aa0f3a570813ea31e863abd76e3"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.364685 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e1f187c60412a1ebdfddabd0b735fcf4d5b9aa0f3a570813ea31e863abd76e3" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.367874 4743 generic.go:334] "Generic (PLEG): container finished" podID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" containerID="c4a492c46b22ecd2cb2ce30f4d5cbacdf5b41359fdcc9e9ef3d84f92e3284551" exitCode=0 Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.367918 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" 
event={"ID":"89d8e638-b97a-4273-9391-5e0c7dd1bfb1","Type":"ContainerDied","Data":"c4a492c46b22ecd2cb2ce30f4d5cbacdf5b41359fdcc9e9ef3d84f92e3284551"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.370821 4743 generic.go:334] "Generic (PLEG): container finished" podID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerID="455de173684d6834930eabe9a480ac739569a3760776782c2c81d8591d036411" exitCode=0 Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.370889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" event={"ID":"dc034ce8-656e-4c88-92f1-18f384ae1a18","Type":"ContainerDied","Data":"455de173684d6834930eabe9a480ac739569a3760776782c2c81d8591d036411"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.374265 4743 generic.go:334] "Generic (PLEG): container finished" podID="db905ec2-675e-48ea-a051-ed3d78c35797" containerID="1cfb29dd7e0a21897754c302d7a14c2ab839c36f149cccd25dabc107f11f9bed" exitCode=0 Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.374297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db905ec2-675e-48ea-a051-ed3d78c35797","Type":"ContainerDied","Data":"1cfb29dd7e0a21897754c302d7a14c2ab839c36f149cccd25dabc107f11f9bed"} Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.820676 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.835725 4743 scope.go:117] "RemoveContainer" containerID="07a73aeee6e1a47a17627947bea433502593cbb95367d72965b6e41790f9de15" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.845456 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.864505 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-84df6c6d8d-v9vxr" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.864593 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-5ff985d64c-mnpj5"] Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.883631 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-5ff985d64c-mnpj5"] Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.891662 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.895653 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.920630 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.926326 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.939437 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2nfs\" (UniqueName: \"kubernetes.io/projected/8fe5d70f-5277-4803-ae45-de61d0eefe27-kube-api-access-p2nfs\") pod \"8fe5d70f-5277-4803-ae45-de61d0eefe27\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.939500 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7p5b\" (UniqueName: \"kubernetes.io/projected/da8955a2-6deb-440c-97e3-f2420aa5fae8-kube-api-access-r7p5b\") pod \"da8955a2-6deb-440c-97e3-f2420aa5fae8\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.939524 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data-custom\") pod \"8fe5d70f-5277-4803-ae45-de61d0eefe27\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.939564 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-config-data\") pod \"da8955a2-6deb-440c-97e3-f2420aa5fae8\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.939601 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-combined-ca-bundle\") pod \"da8955a2-6deb-440c-97e3-f2420aa5fae8\" (UID: \"da8955a2-6deb-440c-97e3-f2420aa5fae8\") " Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.939633 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data\") pod \"8fe5d70f-5277-4803-ae45-de61d0eefe27\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.939661 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-combined-ca-bundle\") pod \"8fe5d70f-5277-4803-ae45-de61d0eefe27\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " Nov 22 08:45:57 
crc kubenswrapper[4743]: I1122 08:45:57.939725 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fe5d70f-5277-4803-ae45-de61d0eefe27-logs\") pod \"8fe5d70f-5277-4803-ae45-de61d0eefe27\" (UID: \"8fe5d70f-5277-4803-ae45-de61d0eefe27\") " Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.940664 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fe5d70f-5277-4803-ae45-de61d0eefe27-logs" (OuterVolumeSpecName: "logs") pod "8fe5d70f-5277-4803-ae45-de61d0eefe27" (UID: "8fe5d70f-5277-4803-ae45-de61d0eefe27"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.954459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fe5d70f-5277-4803-ae45-de61d0eefe27-kube-api-access-p2nfs" (OuterVolumeSpecName: "kube-api-access-p2nfs") pod "8fe5d70f-5277-4803-ae45-de61d0eefe27" (UID: "8fe5d70f-5277-4803-ae45-de61d0eefe27"). InnerVolumeSpecName "kube-api-access-p2nfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.959472 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8fe5d70f-5277-4803-ae45-de61d0eefe27" (UID: "8fe5d70f-5277-4803-ae45-de61d0eefe27"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.962449 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da8955a2-6deb-440c-97e3-f2420aa5fae8-kube-api-access-r7p5b" (OuterVolumeSpecName: "kube-api-access-r7p5b") pod "da8955a2-6deb-440c-97e3-f2420aa5fae8" (UID: "da8955a2-6deb-440c-97e3-f2420aa5fae8"). InnerVolumeSpecName "kube-api-access-r7p5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.990312 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-config-data" (OuterVolumeSpecName: "config-data") pod "da8955a2-6deb-440c-97e3-f2420aa5fae8" (UID: "da8955a2-6deb-440c-97e3-f2420aa5fae8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:57 crc kubenswrapper[4743]: I1122 08:45:57.995442 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da8955a2-6deb-440c-97e3-f2420aa5fae8" (UID: "da8955a2-6deb-440c-97e3-f2420aa5fae8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.005512 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fe5d70f-5277-4803-ae45-de61d0eefe27" (UID: "8fe5d70f-5277-4803-ae45-de61d0eefe27"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.021955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data" (OuterVolumeSpecName: "config-data") pod "8fe5d70f-5277-4803-ae45-de61d0eefe27" (UID: "8fe5d70f-5277-4803-ae45-de61d0eefe27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.047454 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.049240 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-logs\") pod \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.049274 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-config-data\") pod \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.049412 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-combined-ca-bundle\") pod \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.049460 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-internal-tls-certs\") pod \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.049510 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-public-tls-certs\") pod \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.049538 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42gjr\" (UniqueName: \"kubernetes.io/projected/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-kube-api-access-42gjr\") pod \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.049556 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-scripts\") pod \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\" (UID: \"abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.049744 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-logs" (OuterVolumeSpecName: "logs") pod "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" (UID: "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.049984 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.049996 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fe5d70f-5277-4803-ae45-de61d0eefe27-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.050004 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2nfs\" (UniqueName: \"kubernetes.io/projected/8fe5d70f-5277-4803-ae45-de61d0eefe27-kube-api-access-p2nfs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.050017 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7p5b\" (UniqueName: \"kubernetes.io/projected/da8955a2-6deb-440c-97e3-f2420aa5fae8-kube-api-access-r7p5b\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.050026 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.050034 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.050043 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.050051 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8955a2-6deb-440c-97e3-f2420aa5fae8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.050058 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fe5d70f-5277-4803-ae45-de61d0eefe27-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.054508 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-scripts" (OuterVolumeSpecName: "scripts") pod "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" (UID: "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.078476 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.092373 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-kube-api-access-42gjr" (OuterVolumeSpecName: "kube-api-access-42gjr") pod "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" (UID: "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca"). InnerVolumeSpecName "kube-api-access-42gjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.110538 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder89a7-account-delete-gjvvg" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.111537 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.114850 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.116799 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" (UID: "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.117981 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-config-data" (OuterVolumeSpecName: "config-data") pod "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" (UID: "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.125086 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.130478 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.130310 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.131481 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.131506 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.131831 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.134618 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.136742 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.136798 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.139686 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.156770 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-config\") pod \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.156891 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-certs\") pod \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.157058 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rpj7\" (UniqueName: \"kubernetes.io/projected/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-api-access-5rpj7\") pod \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.157110 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-combined-ca-bundle\") pod \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\" (UID: \"d3a93a60-b315-4de2-96d7-d23c9cedbc9c\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.158697 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc 
kubenswrapper[4743]: I1122 08:45:58.159291 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.159384 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42gjr\" (UniqueName: \"kubernetes.io/projected/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-kube-api-access-42gjr\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.159459 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.177315 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.214472 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.214851 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-api-access-5rpj7" (OuterVolumeSpecName: "kube-api-access-5rpj7") pod "d3a93a60-b315-4de2-96d7-d23c9cedbc9c" (UID: "d3a93a60-b315-4de2-96d7-d23c9cedbc9c"). InnerVolumeSpecName "kube-api-access-5rpj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.236024 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.261805 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-public-tls-certs\") pod \"db905ec2-675e-48ea-a051-ed3d78c35797\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.261855 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-nova-metadata-tls-certs\") pod \"2b24dd85-d686-4fb0-be74-7aca0b03255c\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.261894 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-config-data\") pod \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.261923 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-httpd-run\") pod \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.261956 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-954qj\" (UniqueName: \"kubernetes.io/projected/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-kube-api-access-954qj\") pod 
\"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.261974 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-scripts\") pod \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.261994 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-internal-tls-certs\") pod \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262018 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-combined-ca-bundle\") pod \"2b24dd85-d686-4fb0-be74-7aca0b03255c\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262039 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-combined-ca-bundle\") pod \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262065 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-kolla-config\") pod \"861e40f8-c596-40a1-b192-2fa51f567b55\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262090 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-combined-ca-bundle\") pod \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262115 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7936e330-2138-4624-b319-902f6a4941ec-operator-scripts\") pod \"7936e330-2138-4624-b319-902f6a4941ec\" (UID: \"7936e330-2138-4624-b319-902f6a4941ec\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262150 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-459l2\" (UniqueName: \"kubernetes.io/projected/7936e330-2138-4624-b319-902f6a4941ec-kube-api-access-459l2\") pod \"7936e330-2138-4624-b319-902f6a4941ec\" (UID: \"7936e330-2138-4624-b319-902f6a4941ec\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262167 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9vdx\" (UniqueName: \"kubernetes.io/projected/11c59cd3-7ee4-43f3-83ce-9d22824473d7-kube-api-access-m9vdx\") pod \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\" (UID: \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262215 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db905ec2-675e-48ea-a051-ed3d78c35797-logs\") pod 
\"db905ec2-675e-48ea-a051-ed3d78c35797\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262243 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng59n\" (UniqueName: \"kubernetes.io/projected/db905ec2-675e-48ea-a051-ed3d78c35797-kube-api-access-ng59n\") pod \"db905ec2-675e-48ea-a051-ed3d78c35797\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262260 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-internal-tls-certs\") pod \"db905ec2-675e-48ea-a051-ed3d78c35797\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-logs\") pod \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262332 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-memcached-tls-certs\") pod \"861e40f8-c596-40a1-b192-2fa51f567b55\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262357 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\" (UID: \"dca6d95c-89d6-4b49-bf28-2606b9b5c05e\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262376 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g68lk\" (UniqueName: \"kubernetes.io/projected/2b24dd85-d686-4fb0-be74-7aca0b03255c-kube-api-access-g68lk\") pod \"2b24dd85-d686-4fb0-be74-7aca0b03255c\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262400 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b24dd85-d686-4fb0-be74-7aca0b03255c-logs\") pod \"2b24dd85-d686-4fb0-be74-7aca0b03255c\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262426 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-combined-ca-bundle\") pod \"db905ec2-675e-48ea-a051-ed3d78c35797\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262457 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-config-data\") pod \"2b24dd85-d686-4fb0-be74-7aca0b03255c\" (UID: \"2b24dd85-d686-4fb0-be74-7aca0b03255c\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-config-data\") pod \"11c59cd3-7ee4-43f3-83ce-9d22824473d7\" (UID: 
\"11c59cd3-7ee4-43f3-83ce-9d22824473d7\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262531 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-combined-ca-bundle\") pod \"861e40f8-c596-40a1-b192-2fa51f567b55\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262554 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-config-data\") pod \"db905ec2-675e-48ea-a051-ed3d78c35797\" (UID: \"db905ec2-675e-48ea-a051-ed3d78c35797\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262609 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nwnx\" (UniqueName: \"kubernetes.io/projected/861e40f8-c596-40a1-b192-2fa51f567b55-kube-api-access-7nwnx\") pod \"861e40f8-c596-40a1-b192-2fa51f567b55\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.262628 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-config-data\") pod \"861e40f8-c596-40a1-b192-2fa51f567b55\" (UID: \"861e40f8-c596-40a1-b192-2fa51f567b55\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.271393 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db905ec2-675e-48ea-a051-ed3d78c35797-logs" (OuterVolumeSpecName: "logs") pod "db905ec2-675e-48ea-a051-ed3d78c35797" (UID: "db905ec2-675e-48ea-a051-ed3d78c35797"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.272317 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3a93a60-b315-4de2-96d7-d23c9cedbc9c" (UID: "d3a93a60-b315-4de2-96d7-d23c9cedbc9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.272617 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rpj7\" (UniqueName: \"kubernetes.io/projected/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-api-access-5rpj7\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.281281 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dca6d95c-89d6-4b49-bf28-2606b9b5c05e" (UID: "dca6d95c-89d6-4b49-bf28-2606b9b5c05e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.295109 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "861e40f8-c596-40a1-b192-2fa51f567b55" (UID: "861e40f8-c596-40a1-b192-2fa51f567b55"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.295613 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "d3a93a60-b315-4de2-96d7-d23c9cedbc9c" (UID: "d3a93a60-b315-4de2-96d7-d23c9cedbc9c"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.296043 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "dca6d95c-89d6-4b49-bf28-2606b9b5c05e" (UID: "dca6d95c-89d6-4b49-bf28-2606b9b5c05e"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.301703 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b24dd85-d686-4fb0-be74-7aca0b03255c-logs" (OuterVolumeSpecName: "logs") pod "2b24dd85-d686-4fb0-be74-7aca0b03255c" (UID: "2b24dd85-d686-4fb0-be74-7aca0b03255c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.308292 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-kube-api-access-954qj" (OuterVolumeSpecName: "kube-api-access-954qj") pod "dca6d95c-89d6-4b49-bf28-2606b9b5c05e" (UID: "dca6d95c-89d6-4b49-bf28-2606b9b5c05e"). InnerVolumeSpecName "kube-api-access-954qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.309950 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11c59cd3-7ee4-43f3-83ce-9d22824473d7-kube-api-access-m9vdx" (OuterVolumeSpecName: "kube-api-access-m9vdx") pod "11c59cd3-7ee4-43f3-83ce-9d22824473d7" (UID: "11c59cd3-7ee4-43f3-83ce-9d22824473d7"). InnerVolumeSpecName "kube-api-access-m9vdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.310515 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-logs" (OuterVolumeSpecName: "logs") pod "dca6d95c-89d6-4b49-bf28-2606b9b5c05e" (UID: "dca6d95c-89d6-4b49-bf28-2606b9b5c05e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.311985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-scripts" (OuterVolumeSpecName: "scripts") pod "dca6d95c-89d6-4b49-bf28-2606b9b5c05e" (UID: "dca6d95c-89d6-4b49-bf28-2606b9b5c05e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.312216 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.312305 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7936e330-2138-4624-b319-902f6a4941ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7936e330-2138-4624-b319-902f6a4941ec" (UID: "7936e330-2138-4624-b319-902f6a4941ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.312839 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-config-data" (OuterVolumeSpecName: "config-data") pod "861e40f8-c596-40a1-b192-2fa51f567b55" (UID: "861e40f8-c596-40a1-b192-2fa51f567b55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.334530 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861e40f8-c596-40a1-b192-2fa51f567b55-kube-api-access-7nwnx" (OuterVolumeSpecName: "kube-api-access-7nwnx") pod "861e40f8-c596-40a1-b192-2fa51f567b55" (UID: "861e40f8-c596-40a1-b192-2fa51f567b55"). InnerVolumeSpecName "kube-api-access-7nwnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.358545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db905ec2-675e-48ea-a051-ed3d78c35797-kube-api-access-ng59n" (OuterVolumeSpecName: "kube-api-access-ng59n") pod "db905ec2-675e-48ea-a051-ed3d78c35797" (UID: "db905ec2-675e-48ea-a051-ed3d78c35797"). InnerVolumeSpecName "kube-api-access-ng59n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.358985 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b24dd85-d686-4fb0-be74-7aca0b03255c-kube-api-access-g68lk" (OuterVolumeSpecName: "kube-api-access-g68lk") pod "2b24dd85-d686-4fb0-be74-7aca0b03255c" (UID: "2b24dd85-d686-4fb0-be74-7aca0b03255c"). InnerVolumeSpecName "kube-api-access-g68lk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.359403 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7936e330-2138-4624-b319-902f6a4941ec-kube-api-access-459l2" (OuterVolumeSpecName: "kube-api-access-459l2") pod "7936e330-2138-4624-b319-902f6a4941ec" (UID: "7936e330-2138-4624-b319-902f6a4941ec"). InnerVolumeSpecName "kube-api-access-459l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.373028 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b24dd85-d686-4fb0-be74-7aca0b03255c" (UID: "2b24dd85-d686-4fb0-be74-7aca0b03255c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.373684 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tv7j\" (UniqueName: \"kubernetes.io/projected/dc034ce8-656e-4c88-92f1-18f384ae1a18-kube-api-access-5tv7j\") pod \"dc034ce8-656e-4c88-92f1-18f384ae1a18\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.373824 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc034ce8-656e-4c88-92f1-18f384ae1a18-logs\") pod \"dc034ce8-656e-4c88-92f1-18f384ae1a18\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.373895 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-scripts\") pod \"c61760fb-827b-4199-bfdb-52698c7b4824\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.373922 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-combined-ca-bundle\") pod \"c61760fb-827b-4199-bfdb-52698c7b4824\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.373975 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data-custom\") pod \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"c61760fb-827b-4199-bfdb-52698c7b4824\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374032 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data\") pod \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374114 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-combined-ca-bundle\") pod \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374181 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data-custom\") pod \"dc034ce8-656e-4c88-92f1-18f384ae1a18\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374219 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-public-tls-certs\") pod \"c61760fb-827b-4199-bfdb-52698c7b4824\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " Nov 22 
08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374244 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bflkx\" (UniqueName: \"kubernetes.io/projected/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-kube-api-access-bflkx\") pod \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374275 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-config-data\") pod \"c61760fb-827b-4199-bfdb-52698c7b4824\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374357 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-httpd-run\") pod \"c61760fb-827b-4199-bfdb-52698c7b4824\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374411 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-logs\") pod \"c61760fb-827b-4199-bfdb-52698c7b4824\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374441 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gm9q\" (UniqueName: \"kubernetes.io/projected/c61760fb-827b-4199-bfdb-52698c7b4824-kube-api-access-5gm9q\") pod \"c61760fb-827b-4199-bfdb-52698c7b4824\" (UID: \"c61760fb-827b-4199-bfdb-52698c7b4824\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374505 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-combined-ca-bundle\") pod \"dc034ce8-656e-4c88-92f1-18f384ae1a18\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374550 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-logs\") pod \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\" (UID: \"89d8e638-b97a-4273-9391-5e0c7dd1bfb1\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374659 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-internal-tls-certs\") pod \"dc034ce8-656e-4c88-92f1-18f384ae1a18\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data\") pod \"dc034ce8-656e-4c88-92f1-18f384ae1a18\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.374739 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-public-tls-certs\") pod \"dc034ce8-656e-4c88-92f1-18f384ae1a18\" (UID: \"dc034ce8-656e-4c88-92f1-18f384ae1a18\") " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 
08:45:58.375396 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375421 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-954qj\" (UniqueName: \"kubernetes.io/projected/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-kube-api-access-954qj\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375436 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375451 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375463 4743 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375477 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7936e330-2138-4624-b319-902f6a4941ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375489 4743 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375503 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-459l2\" (UniqueName: \"kubernetes.io/projected/7936e330-2138-4624-b319-902f6a4941ec-kube-api-access-459l2\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375517 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9vdx\" (UniqueName: \"kubernetes.io/projected/11c59cd3-7ee4-43f3-83ce-9d22824473d7-kube-api-access-m9vdx\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375528 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db905ec2-675e-48ea-a051-ed3d78c35797-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375540 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng59n\" (UniqueName: \"kubernetes.io/projected/db905ec2-675e-48ea-a051-ed3d78c35797-kube-api-access-ng59n\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375551 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375563 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g68lk\" (UniqueName: \"kubernetes.io/projected/2b24dd85-d686-4fb0-be74-7aca0b03255c-kube-api-access-g68lk\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375602 4743 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375616 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b24dd85-d686-4fb0-be74-7aca0b03255c-logs\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375629 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nwnx\" (UniqueName: \"kubernetes.io/projected/861e40f8-c596-40a1-b192-2fa51f567b55-kube-api-access-7nwnx\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375641 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/861e40f8-c596-40a1-b192-2fa51f567b55-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.375654 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.378208 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc034ce8-656e-4c88-92f1-18f384ae1a18-logs" (OuterVolumeSpecName: "logs") pod "dc034ce8-656e-4c88-92f1-18f384ae1a18" (UID: "dc034ce8-656e-4c88-92f1-18f384ae1a18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.378463 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-logs" (OuterVolumeSpecName: "logs") pod "c61760fb-827b-4199-bfdb-52698c7b4824" (UID: "c61760fb-827b-4199-bfdb-52698c7b4824"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.382701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-scripts" (OuterVolumeSpecName: "scripts") pod "c61760fb-827b-4199-bfdb-52698c7b4824" (UID: "c61760fb-827b-4199-bfdb-52698c7b4824"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.392797 4743 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.392873 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data podName:ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1 nodeName:}" failed. No retries permitted until 2025-11-22 08:46:06.392854445 +0000 UTC m=+1440.099215497 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data") pod "rabbitmq-cell1-server-0" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1") : configmap "rabbitmq-cell1-config-data" not found Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.393320 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c61760fb-827b-4199-bfdb-52698c7b4824" (UID: "c61760fb-827b-4199-bfdb-52698c7b4824"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.393624 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-logs" (OuterVolumeSpecName: "logs") pod "89d8e638-b97a-4273-9391-5e0c7dd1bfb1" (UID: "89d8e638-b97a-4273-9391-5e0c7dd1bfb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.401498 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc034ce8-656e-4c88-92f1-18f384ae1a18-kube-api-access-5tv7j" (OuterVolumeSpecName: "kube-api-access-5tv7j") pod "dc034ce8-656e-4c88-92f1-18f384ae1a18" (UID: "dc034ce8-656e-4c88-92f1-18f384ae1a18"). InnerVolumeSpecName "kube-api-access-5tv7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.412882 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "c61760fb-827b-4199-bfdb-52698c7b4824" (UID: "c61760fb-827b-4199-bfdb-52698c7b4824"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.416863 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dc034ce8-656e-4c88-92f1-18f384ae1a18" (UID: "dc034ce8-656e-4c88-92f1-18f384ae1a18"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.416888 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-kube-api-access-bflkx" (OuterVolumeSpecName: "kube-api-access-bflkx") pod "89d8e638-b97a-4273-9391-5e0c7dd1bfb1" (UID: "89d8e638-b97a-4273-9391-5e0c7dd1bfb1"). InnerVolumeSpecName "kube-api-access-bflkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.418901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c61760fb-827b-4199-bfdb-52698c7b4824-kube-api-access-5gm9q" (OuterVolumeSpecName: "kube-api-access-5gm9q") pod "c61760fb-827b-4199-bfdb-52698c7b4824" (UID: "c61760fb-827b-4199-bfdb-52698c7b4824"). InnerVolumeSpecName "kube-api-access-5gm9q". 
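
The nestedpendingoperations entry above is the volume manager's per-operation exponential backoff: after the rabbitmq-cell1-config-data ConfigMap lookup fails, retries are blocked for 8s (durationBeforeRetry), and the delay doubles on each consecutive failure. A stdlib-only sketch of that doubling schedule (the 500ms initial delay and 2m cap are assumptions for illustration, not values read from this kubelet):

package main

import (
	"fmt"
	"time"
)

// backoff doubles the wait after each consecutive failure, starting at
// initial and never exceeding limit. An 8s durationBeforeRetry matches
// the fifth consecutive failure under a 500ms initial delay
// (0.5s, 1s, 2s, 4s, 8s) -- assumed values, for illustration only.
func backoff(failures int, initial, limit time.Duration) time.Duration {
	d := initial
	for i := 1; i < failures; i++ {
		d *= 2
		if d >= limit {
			return limit
		}
	}
	return d
}

func main() {
	for n := 1; n <= 6; n++ {
		fmt.Printf("failure %d: retry after %v\n", n, backoff(n, 500*time.Millisecond, 2*time.Minute))
	}
}
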
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.420033 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "89d8e638-b97a-4273-9391-5e0c7dd1bfb1" (UID: "89d8e638-b97a-4273-9391-5e0c7dd1bfb1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.424506 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dca6d95c-89d6-4b49-bf28-2606b9b5c05e" (UID: "dca6d95c-89d6-4b49-bf28-2606b9b5c05e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.424827 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"11c59cd3-7ee4-43f3-83ce-9d22824473d7","Type":"ContainerDied","Data":"92caa2e13cbb5fceb5c72acb9106cffb671d3aaa272061ee6717337df6bb7392"} Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.424961 4743 scope.go:117] "RemoveContainer" containerID="fd43f52e71d508747b99d25448ea2492e1fc68d783ff2250c3357c3281ede81e" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.425079 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.436682 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"db905ec2-675e-48ea-a051-ed3d78c35797","Type":"ContainerDied","Data":"223b383c0ac8b350c1df0ab0af665d4f99d8904987d1963f07e7f334e3a37d46"} Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.437601 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.456756 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" (UID: "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.460966 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.461878 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"861e40f8-c596-40a1-b192-2fa51f567b55","Type":"ContainerDied","Data":"21a3b5351f89ec759c17eadcb40c8a78728afbe6149c5ce1492d256037e3e42a"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.466724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4a60-account-delete-j4cg4" event={"ID":"f48bbac5-2782-4c1e-b74b-520f0457f9ac","Type":"ContainerStarted","Data":"76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.466990 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance4a60-account-delete-j4cg4" podUID="f48bbac5-2782-4c1e-b74b-520f0457f9ac" containerName="mariadb-account-delete" containerID="cri-o://76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d" gracePeriod=30
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.481160 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-internal-tls-certs\") pod \"29bf9036-d8fc-43f7-9153-f133a723c6df\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") "
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.481333 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29bf9036-d8fc-43f7-9153-f133a723c6df-logs\") pod \"29bf9036-d8fc-43f7-9153-f133a723c6df\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") "
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.481652 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29bf9036-d8fc-43f7-9153-f133a723c6df-etc-machine-id\") pod \"29bf9036-d8fc-43f7-9153-f133a723c6df\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") "
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.481733 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts\") pod \"29bf9036-d8fc-43f7-9153-f133a723c6df\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") "
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.482568 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-public-tls-certs\") pod \"29bf9036-d8fc-43f7-9153-f133a723c6df\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") "
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.483202 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data\") pod \"29bf9036-d8fc-43f7-9153-f133a723c6df\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") "
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.483231 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgcsc\" (UniqueName: \"kubernetes.io/projected/29bf9036-d8fc-43f7-9153-f133a723c6df-kube-api-access-fgcsc\") pod \"29bf9036-d8fc-43f7-9153-f133a723c6df\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") "
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.483255 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom\") pod \"29bf9036-d8fc-43f7-9153-f133a723c6df\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") "
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.483296 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-combined-ca-bundle\") pod \"29bf9036-d8fc-43f7-9153-f133a723c6df\" (UID: \"29bf9036-d8fc-43f7-9153-f133a723c6df\") "
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.484628 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance4a60-account-delete-j4cg4" podStartSLOduration=7.484491087 podStartE2EDuration="7.484491087s" podCreationTimestamp="2025-11-22 08:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:45:58.484428285 +0000 UTC m=+1432.190789347" watchObservedRunningTime="2025-11-22 08:45:58.484491087 +0000 UTC m=+1432.190852139"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.485962 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29bf9036-d8fc-43f7-9153-f133a723c6df-logs" (OuterVolumeSpecName: "logs") pod "29bf9036-d8fc-43f7-9153-f133a723c6df" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.486377 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement0984-account-delete-82zvj" event={"ID":"9375da2b-3776-4c32-8afd-d1ed7b22b308","Type":"ContainerStarted","Data":"6ef5c4847a495226bbd00d72b1c06fd2666d37d7018b465aac2d955130276623"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.489750 4743 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/placement0984-account-delete-82zvj" secret="" err="secret \"galera-openstack-dockercfg-8fkdw\" not found"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.489941 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c61760fb-827b-4199-bfdb-52698c7b4824" (UID: "c61760fb-827b-4199-bfdb-52698c7b4824"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.490157 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29bf9036-d8fc-43f7-9153-f133a723c6df-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "29bf9036-d8fc-43f7-9153-f133a723c6df" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.494726 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi1d8b-account-delete-scjlt" event={"ID":"30ee548a-8838-4d52-867b-4dfdb6c4f641","Type":"ContainerStarted","Data":"d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.494853 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi1d8b-account-delete-scjlt" podUID="30ee548a-8838-4d52-867b-4dfdb6c4f641" containerName="mariadb-account-delete" containerID="cri-o://d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818" gracePeriod=30
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.494937 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.494974 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bflkx\" (UniqueName: \"kubernetes.io/projected/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-kube-api-access-bflkx\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.494992 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-httpd-run\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495004 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c61760fb-827b-4199-bfdb-52698c7b4824-logs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495017 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gm9q\" (UniqueName: \"kubernetes.io/projected/c61760fb-827b-4199-bfdb-52698c7b4824-kube-api-access-5gm9q\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495028 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495039 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-logs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495052 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29bf9036-d8fc-43f7-9153-f133a723c6df-logs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495064 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tv7j\" (UniqueName: \"kubernetes.io/projected/dc034ce8-656e-4c88-92f1-18f384ae1a18-kube-api-access-5tv7j\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495074 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc034ce8-656e-4c88-92f1-18f384ae1a18-logs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495085 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495094 4743 scope.go:117] "RemoveContainer" containerID="1cfb29dd7e0a21897754c302d7a14c2ab839c36f149cccd25dabc107f11f9bed"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495096 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495185 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.495199 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.509664 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-config-data" (OuterVolumeSpecName: "config-data") pod "2b24dd85-d686-4fb0-be74-7aca0b03255c" (UID: "2b24dd85-d686-4fb0-be74-7aca0b03255c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.516676 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts" (OuterVolumeSpecName: "scripts") pod "29bf9036-d8fc-43f7-9153-f133a723c6df" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.522499 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dcbbd6f66-kjrm8" event={"ID":"dc034ce8-656e-4c88-92f1-18f384ae1a18","Type":"ContainerDied","Data":"5c99c956fa088361d1835204709a0f50ffb52b9cc07253eced2b8fa90aa30577"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.522615 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6dcbbd6f66-kjrm8"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.528263 4743 generic.go:334] "Generic (PLEG): container finished" podID="d96211ff-f7ba-4e26-ae39-43c8062e2277" containerID="005a94ed2d08108098c7e56929f699b03c95b20224af467c218ff109ef554523" exitCode=1
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.528624 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1c099-account-delete-lqvbc" event={"ID":"d96211ff-f7ba-4e26-ae39-43c8062e2277","Type":"ContainerDied","Data":"005a94ed2d08108098c7e56929f699b03c95b20224af467c218ff109ef554523"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.531246 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement0984-account-delete-82zvj" podStartSLOduration=8.531226804 podStartE2EDuration="8.531226804s" podCreationTimestamp="2025-11-22 08:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:45:58.509310713 +0000 UTC m=+1432.215671765" watchObservedRunningTime="2025-11-22 08:45:58.531226804 +0000 UTC m=+1432.237587856"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.538992 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29bf9036-d8fc-43f7-9153-f133a723c6df-kube-api-access-fgcsc" (OuterVolumeSpecName: "kube-api-access-fgcsc") pod "29bf9036-d8fc-43f7-9153-f133a723c6df" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df"). InnerVolumeSpecName "kube-api-access-fgcsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.540272 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "29bf9036-d8fc-43f7-9153-f133a723c6df" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.541265 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"dca6d95c-89d6-4b49-bf28-2606b9b5c05e","Type":"ContainerDied","Data":"fe08eff973531e6f3659274cc446f032181fff809dc4a40cc87a8af36f126183"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.541446 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.546062 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi1d8b-account-delete-scjlt" podStartSLOduration=7.546041781 podStartE2EDuration="7.546041781s" podCreationTimestamp="2025-11-22 08:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:45:58.527280141 +0000 UTC m=+1432.233641193" watchObservedRunningTime="2025-11-22 08:45:58.546041781 +0000 UTC m=+1432.252402833"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.553946 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.555011 4743 generic.go:334] "Generic (PLEG): container finished" podID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerID="85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895" exitCode=0
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.555139 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29bf9036-d8fc-43f7-9153-f133a723c6df","Type":"ContainerDied","Data":"85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.555226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29bf9036-d8fc-43f7-9153-f133a723c6df","Type":"ContainerDied","Data":"fb1a2f68abea61d3fee74635e859d7ba84e53bbf1e5a5f005f35b7ba574c63c1"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.555405 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.556663 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-config-data" (OuterVolumeSpecName: "config-data") pod "db905ec2-675e-48ea-a051-ed3d78c35797" (UID: "db905ec2-675e-48ea-a051-ed3d78c35797"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.579196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0ad52-account-delete-9r4lw" event={"ID":"5fca29fd-c34f-4954-960f-b5ca0812d5b0","Type":"ContainerStarted","Data":"8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.579367 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell0ad52-account-delete-9r4lw" podUID="5fca29fd-c34f-4954-960f-b5ca0812d5b0" containerName="mariadb-account-delete" containerID="cri-o://8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5" gracePeriod=30
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.579722 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89d8e638-b97a-4273-9391-5e0c7dd1bfb1" (UID: "89d8e638-b97a-4273-9391-5e0c7dd1bfb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.581172 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db905ec2-675e-48ea-a051-ed3d78c35797" (UID: "db905ec2-675e-48ea-a051-ed3d78c35797"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.583028 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11c59cd3-7ee4-43f3-83ce-9d22824473d7" (UID: "11c59cd3-7ee4-43f3-83ce-9d22824473d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.583817 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron8dc4-account-delete-rtl4b" event={"ID":"f6d1b00d-147b-4865-b659-59d06f360797","Type":"ContainerStarted","Data":"bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.583900 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron8dc4-account-delete-rtl4b" podUID="f6d1b00d-147b-4865-b659-59d06f360797" containerName="mariadb-account-delete" containerID="cri-o://bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9" gracePeriod=30
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.587936 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c61760fb-827b-4199-bfdb-52698c7b4824","Type":"ContainerDied","Data":"46b71e1b48f9c7665baab713cd7dc6fa5b9dc43ef77f56a297a7f8b0a42d5cf6"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.588044 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.589752 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican675b-account-delete-bdj7v" event={"ID":"5310c975-ef7b-4161-ab2e-5ee94b709f9d","Type":"ContainerStarted","Data":"9d5ed428f2c04f357aa903343aea3a22402d7ece9dcdf137788ad34286fe9ec5"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.590547 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican675b-account-delete-bdj7v" podUID="5310c975-ef7b-4161-ab2e-5ee94b709f9d" containerName="mariadb-account-delete" containerID="cri-o://9d5ed428f2c04f357aa903343aea3a22402d7ece9dcdf137788ad34286fe9ec5" gracePeriod=30
Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.598181 4743 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598226 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.598251 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts podName:9375da2b-3776-4c32-8afd-d1ed7b22b308 nodeName:}" failed. No retries permitted until 2025-11-22 08:45:59.098231176 +0000 UTC m=+1432.804592228 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts") pod "placement0984-account-delete-82zvj" (UID: "9375da2b-3776-4c32-8afd-d1ed7b22b308") : configmap "openstack-scripts" not found
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598279 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29bf9036-d8fc-43f7-9153-f133a723c6df-etc-machine-id\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598296 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598310 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598322 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598333 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598346 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598358 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgcsc\" (UniqueName: \"kubernetes.io/projected/29bf9036-d8fc-43f7-9153-f133a723c6df-kube-api-access-fgcsc\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598372 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data-custom\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598384 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.598397 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.599059 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder89a7-account-delete-gjvvg"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.600717 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.600886 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-84df6c6d8d-v9vxr"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.601288 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.601691 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-84bbbc9bdb-72lc6"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.601718 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7cd8fdf575-7kd5c" event={"ID":"89d8e638-b97a-4273-9391-5e0c7dd1bfb1","Type":"ContainerDied","Data":"fde53c906c1f4f9a19f61b1d47e8097f66d4506cfd818fa986ab4390d6a26513"}
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.601814 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7cd8fdf575-7kd5c"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.602502 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.603744 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell0ad52-account-delete-9r4lw" podStartSLOduration=7.603724334 podStartE2EDuration="7.603724334s" podCreationTimestamp="2025-11-22 08:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:45:58.597304129 +0000 UTC m=+1432.303665201" watchObservedRunningTime="2025-11-22 08:45:58.603724334 +0000 UTC m=+1432.310085386"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.656306 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican675b-account-delete-bdj7v" podStartSLOduration=8.6562885 podStartE2EDuration="8.6562885s" podCreationTimestamp="2025-11-22 08:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:45:58.617380388 +0000 UTC m=+1432.323741440" watchObservedRunningTime="2025-11-22 08:45:58.6562885 +0000 UTC m=+1432.362649552"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.659238 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron8dc4-account-delete-rtl4b" podStartSLOduration=7.659226234 podStartE2EDuration="7.659226234s" podCreationTimestamp="2025-11-22 08:45:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 08:45:58.640213106 +0000 UTC m=+1432.346574168" watchObservedRunningTime="2025-11-22 08:45:58.659226234 +0000 UTC m=+1432.365587286"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.663881 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-config-data" (OuterVolumeSpecName: "config-data") pod "11c59cd3-7ee4-43f3-83ce-9d22824473d7" (UID: "11c59cd3-7ee4-43f3-83ce-9d22824473d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.665726 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dca6d95c-89d6-4b49-bf28-2606b9b5c05e" (UID: "dca6d95c-89d6-4b49-bf28-2606b9b5c05e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.679433 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.686835 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "861e40f8-c596-40a1-b192-2fa51f567b55" (UID: "861e40f8-c596-40a1-b192-2fa51f567b55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.698808 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29bf9036-d8fc-43f7-9153-f133a723c6df" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.699843 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11c59cd3-7ee4-43f3-83ce-9d22824473d7-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.699857 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.699866 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.699874 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.699883 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.713429 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "db905ec2-675e-48ea-a051-ed3d78c35797" (UID: "db905ec2-675e-48ea-a051-ed3d78c35797"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.746628 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-config-data" (OuterVolumeSpecName: "config-data") pod "dca6d95c-89d6-4b49-bf28-2606b9b5c05e" (UID: "dca6d95c-89d6-4b49-bf28-2606b9b5c05e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.746692 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc034ce8-656e-4c88-92f1-18f384ae1a18" (UID: "dc034ce8-656e-4c88-92f1-18f384ae1a18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.748226 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c61760fb-827b-4199-bfdb-52698c7b4824" (UID: "c61760fb-827b-4199-bfdb-52698c7b4824"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.793995 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "861e40f8-c596-40a1-b192-2fa51f567b55" (UID: "861e40f8-c596-40a1-b192-2fa51f567b55"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.794797 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data" (OuterVolumeSpecName: "config-data") pod "dc034ce8-656e-4c88-92f1-18f384ae1a18" (UID: "dc034ce8-656e-4c88-92f1-18f384ae1a18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.805610 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.806059 4743 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/861e40f8-c596-40a1-b192-2fa51f567b55-memcached-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.806560 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.807271 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.807596 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.808308 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dca6d95c-89d6-4b49-bf28-2606b9b5c05e-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.836594 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2b24dd85-d686-4fb0-be74-7aca0b03255c" (UID: "2b24dd85-d686-4fb0-be74-7aca0b03255c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.849728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data" (OuterVolumeSpecName: "config-data") pod "89d8e638-b97a-4273-9391-5e0c7dd1bfb1" (UID: "89d8e638-b97a-4273-9391-5e0c7dd1bfb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.867032 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "db905ec2-675e-48ea-a051-ed3d78c35797" (UID: "db905ec2-675e-48ea-a051-ed3d78c35797"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.874693 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data" (OuterVolumeSpecName: "config-data") pod "29bf9036-d8fc-43f7-9153-f133a723c6df" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.897110 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "29bf9036-d8fc-43f7-9153-f133a723c6df" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.898554 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "29bf9036-d8fc-43f7-9153-f133a723c6df" (UID: "29bf9036-d8fc-43f7-9153-f133a723c6df"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.902217 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dc034ce8-656e-4c88-92f1-18f384ae1a18" (UID: "dc034ce8-656e-4c88-92f1-18f384ae1a18"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.902679 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-config-data" (OuterVolumeSpecName: "config-data") pod "c61760fb-827b-4199-bfdb-52698c7b4824" (UID: "c61760fb-827b-4199-bfdb-52698c7b4824"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.914854 4743 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.914880 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c61760fb-827b-4199-bfdb-52698c7b4824-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: E1122 08:45:58.914914 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data podName:e5fac46a-545d-4f30-a7ab-8f5e713e934d nodeName:}" failed. No retries permitted until 2025-11-22 08:46:06.914897765 +0000 UTC m=+1440.621258817 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data") pod "rabbitmq-server-0" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d") : configmap "rabbitmq-config-data" not found
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.914947 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db905ec2-675e-48ea-a051-ed3d78c35797-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.914960 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.914970 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.914979 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.914988 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b24dd85-d686-4fb0-be74-7aca0b03255c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.915000 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89d8e638-b97a-4273-9391-5e0c7dd1bfb1-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.915008 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29bf9036-d8fc-43f7-9153-f133a723c6df-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.922233 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "dc034ce8-656e-4c88-92f1-18f384ae1a18" (UID: "dc034ce8-656e-4c88-92f1-18f384ae1a18"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.922523 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "d3a93a60-b315-4de2-96d7-d23c9cedbc9c" (UID: "d3a93a60-b315-4de2-96d7-d23c9cedbc9c"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:58 crc kubenswrapper[4743]: I1122 08:45:58.931749 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" (UID: "abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.017119 4743 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3a93a60-b315-4de2-96d7-d23c9cedbc9c-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.017152 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc034ce8-656e-4c88-92f1-18f384ae1a18-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.017262 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca-public-tls-certs\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:59 crc kubenswrapper[4743]: E1122 08:45:59.025844 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Nov 22 08:45:59 crc kubenswrapper[4743]: E1122 08:45:59.029103 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Nov 22 08:45:59 crc kubenswrapper[4743]: E1122 08:45:59.032650 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Nov 22 08:45:59 crc kubenswrapper[4743]: E1122 08:45:59.032737 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="500679c5-1691-4831-b5ec-3c6cce19c503" containerName="nova-cell0-conductor-conductor"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.057731 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.198:3000/\": dial tcp 10.217.0.198:3000: connect: connection refused"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.072978 4743 scope.go:117] "RemoveContainer" containerID="928c4075312b7232f7da07b77b7db3aff8ceda2f473954fe371d1b407a38a03a"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.074177 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1c099-account-delete-lqvbc"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.113632 4743 scope.go:117] "RemoveContainer" containerID="05a05410e859d0aa129b195d9be97306e21232ea6ee301f84157b08562c6ad1b"
Nov 22 08:45:59 crc kubenswrapper[4743]: E1122 08:45:59.119708 4743 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 22 08:45:59 crc kubenswrapper[4743]: E1122 08:45:59.119781 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts podName:9375da2b-3776-4c32-8afd-d1ed7b22b308 nodeName:}" failed. No retries permitted until 2025-11-22 08:46:00.119764229 +0000 UTC m=+1433.826125291 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts") pod "placement0984-account-delete-82zvj" (UID: "9375da2b-3776-4c32-8afd-d1ed7b22b308") : configmap "openstack-scripts" not found
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.266529 4743 scope.go:117] "RemoveContainer" containerID="455de173684d6834930eabe9a480ac739569a3760776782c2c81d8591d036411"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.270527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlr6w\" (UniqueName: \"kubernetes.io/projected/d96211ff-f7ba-4e26-ae39-43c8062e2277-kube-api-access-mlr6w\") pod \"d96211ff-f7ba-4e26-ae39-43c8062e2277\" (UID: \"d96211ff-f7ba-4e26-ae39-43c8062e2277\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.271588 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts\") pod \"d96211ff-f7ba-4e26-ae39-43c8062e2277\" (UID: \"d96211ff-f7ba-4e26-ae39-43c8062e2277\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.277959 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d96211ff-f7ba-4e26-ae39-43c8062e2277" (UID: "d96211ff-f7ba-4e26-ae39-43c8062e2277"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.338781 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d96211ff-f7ba-4e26-ae39-43c8062e2277-kube-api-access-mlr6w" (OuterVolumeSpecName: "kube-api-access-mlr6w") pod "d96211ff-f7ba-4e26-ae39-43c8062e2277" (UID: "d96211ff-f7ba-4e26-ae39-43c8062e2277"). InnerVolumeSpecName "kube-api-access-mlr6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.358597 4743 scope.go:117] "RemoveContainer" containerID="0febb6e2d7ff4813fd6df7b99de1a803ade35dd751b487778b4585a6c0ce4d64"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.378205 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d96211ff-f7ba-4e26-ae39-43c8062e2277-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.378232 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlr6w\" (UniqueName: \"kubernetes.io/projected/d96211ff-f7ba-4e26-ae39-43c8062e2277-kube-api-access-mlr6w\") on node \"crc\" DevicePath \"\""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.381907 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" path="/var/lib/kubelet/pods/178ccbe4-360f-4a0d-b97c-edf5b8b8dcba/volumes"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.418464 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29734ea4-591c-478e-8030-55fcbac72d3a" path="/var/lib/kubelet/pods/29734ea4-591c-478e-8030-55fcbac72d3a/volumes"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.419241 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7be7b8b-96eb-40fb-98b2-bc33e2154343" path="/var/lib/kubelet/pods/b7be7b8b-96eb-40fb-98b2-bc33e2154343/volumes"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.421503 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.421534 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.421550 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-84bbbc9bdb-72lc6"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.421564 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-84bbbc9bdb-72lc6"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.421596 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7cd8fdf575-7kd5c"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.421610 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7cd8fdf575-7kd5c"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.421622 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.421635 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.426214 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.500183 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.516778 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.523275 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.542549 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.556206 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.569694 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.580896 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.587344 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.593279 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.601108 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-84df6c6d8d-v9vxr"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.610365 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-84df6c6d8d-v9vxr"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.615512 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.622482 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.628803 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dcbbd6f66-kjrm8"]
Nov 22 08:45:59 crc kubenswrapper[4743]: E1122 08:45:59.631517 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29bf9036_d8fc_43f7_9153_f133a723c6df.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a93a60_b315_4de2_96d7_d23c9cedbc9c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5b21104_eefe_4583_9af8_731d561b78c2.slice/crio-conmon-8e1277095f530b9d213cf681f4500af6bf174fddfe554f6556001dae2a813e03.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29bf9036_d8fc_43f7_9153_f133a723c6df.slice/crio-fb1a2f68abea61d3fee74635e859d7ba84e53bbf1e5a5f005f35b7ba574c63c1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3a93a60_b315_4de2_96d7_d23c9cedbc9c.slice/crio-9599f4fa0992155906bf350a593ed3aa398353ce4ff4f43f444ac8fe006585ac\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabdfc89d_bd20_4fae_b6f6_ee4d1729e8ca.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d63130_217d_400e_afc5_6b6bb3d56658.slice/crio-9b8889acf70f714f96a12fef606aa84ad0497560bdf80aa3dddac96e77684d76.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc61760fb_827b_4199_bfdb_52698c7b4824.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d63130_217d_400e_afc5_6b6bb3d56658.slice/crio-conmon-9b8889acf70f714f96a12fef606aa84ad0497560bdf80aa3dddac96e77684d76.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5fac46a_545d_4f30_a7ab_8f5e713e934d.slice/crio-6e1b913f0b8534fa70afd00b409ba87dcd773b31786f2f1c5518bc5e04e427a8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabdfc89d_bd20_4fae_b6f6_ee4d1729e8ca.slice/crio-308da01ed407044c48117ec420e360d5dda27692aad8d33916b3bb6c489ba6ad\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5fac46a_545d_4f30_a7ab_8f5e713e934d.slice/crio-conmon-6e1b913f0b8534fa70afd00b409ba87dcd773b31786f2f1c5518bc5e04e427a8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc61760fb_827b_4199_bfdb_52698c7b4824.slice/crio-46b71e1b48f9c7665baab713cd7dc6fa5b9dc43ef77f56a297a7f8b0a42d5cf6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5b21104_eefe_4583_9af8_731d561b78c2.slice/crio-8e1277095f530b9d213cf681f4500af6bf174fddfe554f6556001dae2a813e03.scope\": RecentStats: unable to find data in memory cache]"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.636412 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6dcbbd6f66-kjrm8"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.649872 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.658666 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.662759 4743 generic.go:334] "Generic (PLEG): container finished" podID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" containerID="59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5" exitCode=0
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.663101 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1","Type":"ContainerDied","Data":"59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5"}
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.663126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1","Type":"ContainerDied","Data":"b507b4d34cd4f86c446e4edafb6b74db493c1dbcc29dd36d4787d8b073d954b7"}
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.668043 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell1c099-account-delete-lqvbc" event={"ID":"d96211ff-f7ba-4e26-ae39-43c8062e2277","Type":"ContainerDied","Data":"a2423c2f3a40b761371f356af39a17b1fe4f3bcaae35153ffbd526cf8a43e870"}
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.668148 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell1c099-account-delete-lqvbc"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.669877 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.672746 4743 scope.go:117] "RemoveContainer" containerID="6671bb99b39fc16a0f6c253ac0e494e49254030b8ca083c6f60cb786f074a063"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.678921 4743 generic.go:334] "Generic (PLEG): container finished" podID="f5b21104-eefe-4583-9af8-731d561b78c2" containerID="8e1277095f530b9d213cf681f4500af6bf174fddfe554f6556001dae2a813e03" exitCode=0
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.678987 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66c4f7f76d-b9q4p" event={"ID":"f5b21104-eefe-4583-9af8-731d561b78c2","Type":"ContainerDied","Data":"8e1277095f530b9d213cf681f4500af6bf174fddfe554f6556001dae2a813e03"}
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.703798 4743 generic.go:334] "Generic (PLEG): container finished" podID="e5fac46a-545d-4f30-a7ab-8f5e713e934d" containerID="6e1b913f0b8534fa70afd00b409ba87dcd773b31786f2f1c5518bc5e04e427a8" exitCode=0
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.703842 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5fac46a-545d-4f30-a7ab-8f5e713e934d","Type":"ContainerDied","Data":"6e1b913f0b8534fa70afd00b409ba87dcd773b31786f2f1c5518bc5e04e427a8"}
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.724309 4743 generic.go:334] "Generic (PLEG): container finished" podID="d3d63130-217d-400e-afc5-6b6bb3d56658" containerID="9b8889acf70f714f96a12fef606aa84ad0497560bdf80aa3dddac96e77684d76" exitCode=0
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.724690 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d3d63130-217d-400e-afc5-6b6bb3d56658","Type":"ContainerDied","Data":"9b8889acf70f714f96a12fef606aa84ad0497560bdf80aa3dddac96e77684d76"}
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.729928 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell1c099-account-delete-lqvbc"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.732211 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b9817865-d957-42d3-8edb-6800e1075d23/ovn-northd/0.log"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.732758 4743 generic.go:334] "Generic (PLEG): container finished" podID="b9817865-d957-42d3-8edb-6800e1075d23" containerID="44e22b0e556cf479c4ab148fe02b8b602f8d6a658164bfd210e15bbe9a5c9282" exitCode=139
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.732996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9817865-d957-42d3-8edb-6800e1075d23","Type":"ContainerDied","Data":"44e22b0e556cf479c4ab148fe02b8b602f8d6a658164bfd210e15bbe9a5c9282"}
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.733859 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell1c099-account-delete-lqvbc"]
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.734945 4743 scope.go:117] "RemoveContainer" containerID="d713e66a35891819a155186b552565a296254b8c93475b9aa0a54b55dd7cbd38"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.747233 4743 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/placement0984-account-delete-82zvj" secret="" err="secret \"galera-openstack-dockercfg-8fkdw\" not found"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.777321 4743 scope.go:117] "RemoveContainer" containerID="85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895"
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784210 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-erlang-cookie-secret\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784288 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-plugins\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784319 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-server-conf\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784374 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784422 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh9pp\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-kube-api-access-bh9pp\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784469 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784529 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-tls\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784570 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-plugins-conf\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784810 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-erlang-cookie\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784876 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-pod-info\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.784939 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-confd\") pod \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\" (UID: \"ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1\") "
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.787675 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.789080 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.789554 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.791698 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.794996 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "persistence") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.799457 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-pod-info" (OuterVolumeSpecName: "pod-info") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "pod-info".
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.800890 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-kube-api-access-bh9pp" (OuterVolumeSpecName: "kube-api-access-bh9pp") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "kube-api-access-bh9pp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.805116 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.844831 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data" (OuterVolumeSpecName: "config-data") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.858452 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-server-conf" (OuterVolumeSpecName: "server-conf") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.888035 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.888067 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.888078 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-server-conf\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.888102 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.888116 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh9pp\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-kube-api-access-bh9pp\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.888130 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.888141 4743 reconciler_common.go:293] "Volume detached for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.888153 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.888163 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.888175 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-pod-info\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.916959 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.930954 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" (UID: "ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.954977 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.963085 4743 scope.go:117] "RemoveContainer" containerID="bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.980894 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.990592 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.990636 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.996693 4743 scope.go:117] "RemoveContainer" containerID="85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895" Nov 22 08:45:59 crc kubenswrapper[4743]: E1122 08:45:59.997902 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895\": container with ID starting with 85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895 not found: ID does not exist" containerID="85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.997934 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895"} err="failed to get container status \"85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895\": rpc error: code = NotFound desc = could not find container \"85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895\": container with ID starting with 85ae9ba938f55103f57f752b57e2e509d3bcf46163465dd05d2c38b506aa8895 not found: ID does not exist" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.997962 4743 scope.go:117] "RemoveContainer" containerID="bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7" Nov 22 08:45:59 crc kubenswrapper[4743]: E1122 08:45:59.998614 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7\": container with ID starting with bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7 not found: ID does not exist" containerID="bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.998668 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7"} err="failed to get container status \"bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7\": rpc error: code = NotFound desc = could not find container \"bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7\": container with ID starting with bc9750a201c6bcf7446cbcf687f1621cf01ea70ac3b39909c7442337f9104dd7 not found: ID does not exist" Nov 22 08:45:59 crc kubenswrapper[4743]: I1122 08:45:59.998699 4743 scope.go:117] "RemoveContainer" containerID="f148d19bec9da2034a614aa3685da5500ee102ac2162e404c9df6a8dd6001346" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.004098 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.039463 4743 scope.go:117] "RemoveContainer" containerID="d85aa17d800ad1cbc94e2aaf79a94f094b2da7ff02061d9a5cc19841c8f58bb3" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.090055 4743 scope.go:117] "RemoveContainer" containerID="c4a492c46b22ecd2cb2ce30f4d5cbacdf5b41359fdcc9e9ef3d84f92e3284551" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091209 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-combined-ca-bundle\") pod \"d3d63130-217d-400e-afc5-6b6bb3d56658\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091268 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-erlang-cookie\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091321 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-credential-keys\") pod \"f5b21104-eefe-4583-9af8-731d561b78c2\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091351 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-generated\") pod \"d3d63130-217d-400e-afc5-6b6bb3d56658\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091377 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-internal-tls-certs\") pod \"f5b21104-eefe-4583-9af8-731d561b78c2\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091407 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5fac46a-545d-4f30-a7ab-8f5e713e934d-erlang-cookie-secret\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091435 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5fac46a-545d-4f30-a7ab-8f5e713e934d-pod-info\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091470 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091496 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-public-tls-certs\") pod 
\"f5b21104-eefe-4583-9af8-731d561b78c2\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091524 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-combined-ca-bundle\") pod \"f5b21104-eefe-4583-9af8-731d561b78c2\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091593 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-config-data\") pod \"f5b21104-eefe-4583-9af8-731d561b78c2\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091646 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-default\") pod \"d3d63130-217d-400e-afc5-6b6bb3d56658\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091672 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-kolla-config\") pod \"d3d63130-217d-400e-afc5-6b6bb3d56658\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091697 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-galera-tls-certs\") pod \"d3d63130-217d-400e-afc5-6b6bb3d56658\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091802 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-plugins-conf\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091829 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-plugins\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091878 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"d3d63130-217d-400e-afc5-6b6bb3d56658\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091913 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6zt9\" (UniqueName: \"kubernetes.io/projected/f5b21104-eefe-4583-9af8-731d561b78c2-kube-api-access-l6zt9\") pod \"f5b21104-eefe-4583-9af8-731d561b78c2\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091935 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: 
\"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091955 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-scripts\") pod \"f5b21104-eefe-4583-9af8-731d561b78c2\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.091979 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnrgs\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-kube-api-access-hnrgs\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.092020 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-operator-scripts\") pod \"d3d63130-217d-400e-afc5-6b6bb3d56658\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.092044 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxhb4\" (UniqueName: \"kubernetes.io/projected/d3d63130-217d-400e-afc5-6b6bb3d56658-kube-api-access-xxhb4\") pod \"d3d63130-217d-400e-afc5-6b6bb3d56658\" (UID: \"d3d63130-217d-400e-afc5-6b6bb3d56658\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.092073 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-fernet-keys\") pod \"f5b21104-eefe-4583-9af8-731d561b78c2\" (UID: \"f5b21104-eefe-4583-9af8-731d561b78c2\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.092100 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-tls\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.092132 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-server-conf\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.092154 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-confd\") pod \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\" (UID: \"e5fac46a-545d-4f30-a7ab-8f5e713e934d\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.092067 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.093432 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "d3d63130-217d-400e-afc5-6b6bb3d56658" (UID: "d3d63130-217d-400e-afc5-6b6bb3d56658"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.099260 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.099598 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.102926 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "d3d63130-217d-400e-afc5-6b6bb3d56658" (UID: "d3d63130-217d-400e-afc5-6b6bb3d56658"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.113129 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "d3d63130-217d-400e-afc5-6b6bb3d56658" (UID: "d3d63130-217d-400e-afc5-6b6bb3d56658"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.113705 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3d63130-217d-400e-afc5-6b6bb3d56658" (UID: "d3d63130-217d-400e-afc5-6b6bb3d56658"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.118762 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5fac46a-545d-4f30-a7ab-8f5e713e934d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.123777 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-kube-api-access-hnrgs" (OuterVolumeSpecName: "kube-api-access-hnrgs") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "kube-api-access-hnrgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.124190 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.132567 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f5b21104-eefe-4583-9af8-731d561b78c2" (UID: "f5b21104-eefe-4583-9af8-731d561b78c2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.136268 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f5b21104-eefe-4583-9af8-731d561b78c2" (UID: "f5b21104-eefe-4583-9af8-731d561b78c2"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.142770 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-scripts" (OuterVolumeSpecName: "scripts") pod "f5b21104-eefe-4583-9af8-731d561b78c2" (UID: "f5b21104-eefe-4583-9af8-731d561b78c2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.147745 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.153894 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d63130-217d-400e-afc5-6b6bb3d56658-kube-api-access-xxhb4" (OuterVolumeSpecName: "kube-api-access-xxhb4") pod "d3d63130-217d-400e-afc5-6b6bb3d56658" (UID: "d3d63130-217d-400e-afc5-6b6bb3d56658"). InnerVolumeSpecName "kube-api-access-xxhb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.156701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b21104-eefe-4583-9af8-731d561b78c2-kube-api-access-l6zt9" (OuterVolumeSpecName: "kube-api-access-l6zt9") pod "f5b21104-eefe-4583-9af8-731d561b78c2" (UID: "f5b21104-eefe-4583-9af8-731d561b78c2"). InnerVolumeSpecName "kube-api-access-l6zt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.158676 4743 scope.go:117] "RemoveContainer" containerID="b2dbbf998042cd2c9fe978946c70615b254e4f75ec35ea7cfe22feace7d787f6" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.158678 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e5fac46a-545d-4f30-a7ab-8f5e713e934d-pod-info" (OuterVolumeSpecName: "pod-info") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.165382 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-config-data" (OuterVolumeSpecName: "config-data") pod "f5b21104-eefe-4583-9af8-731d561b78c2" (UID: "f5b21104-eefe-4583-9af8-731d561b78c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.170230 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data" (OuterVolumeSpecName: "config-data") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.177945 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3d63130-217d-400e-afc5-6b6bb3d56658" (UID: "d3d63130-217d-400e-afc5-6b6bb3d56658"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.178003 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "d3d63130-217d-400e-afc5-6b6bb3d56658" (UID: "d3d63130-217d-400e-afc5-6b6bb3d56658"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.186490 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5b21104-eefe-4583-9af8-731d561b78c2" (UID: "f5b21104-eefe-4583-9af8-731d561b78c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.188905 4743 scope.go:117] "RemoveContainer" containerID="59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193760 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e5fac46a-545d-4f30-a7ab-8f5e713e934d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193783 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e5fac46a-545d-4f30-a7ab-8f5e713e934d-pod-info\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193792 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193801 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193809 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193818 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-default\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193826 4743 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-kolla-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193833 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193842 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193858 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193874 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193882 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193891 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6zt9\" 
(UniqueName: \"kubernetes.io/projected/f5b21104-eefe-4583-9af8-731d561b78c2-kube-api-access-l6zt9\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193902 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnrgs\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-kube-api-access-hnrgs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193910 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d63130-217d-400e-afc5-6b6bb3d56658-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193919 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxhb4\" (UniqueName: \"kubernetes.io/projected/d3d63130-217d-400e-afc5-6b6bb3d56658-kube-api-access-xxhb4\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193926 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193934 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193941 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193949 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193956 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.193964 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d3d63130-217d-400e-afc5-6b6bb3d56658-config-data-generated\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: E1122 08:46:00.193762 4743 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 22 08:46:00 crc kubenswrapper[4743]: E1122 08:46:00.194498 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts podName:9375da2b-3776-4c32-8afd-d1ed7b22b308 nodeName:}" failed. No retries permitted until 2025-11-22 08:46:02.194479241 +0000 UTC m=+1435.900840293 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts") pod "placement0984-account-delete-82zvj" (UID: "9375da2b-3776-4c32-8afd-d1ed7b22b308") : configmap "openstack-scripts" not found Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.201418 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-server-conf" (OuterVolumeSpecName: "server-conf") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.208795 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.215087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f5b21104-eefe-4583-9af8-731d561b78c2" (UID: "f5b21104-eefe-4583-9af8-731d561b78c2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.221309 4743 scope.go:117] "RemoveContainer" containerID="9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.223248 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.238833 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "d3d63130-217d-400e-afc5-6b6bb3d56658" (UID: "d3d63130-217d-400e-afc5-6b6bb3d56658"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.248473 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f5b21104-eefe-4583-9af8-731d561b78c2" (UID: "f5b21104-eefe-4583-9af8-731d561b78c2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.274800 4743 scope.go:117] "RemoveContainer" containerID="59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.275019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e5fac46a-545d-4f30-a7ab-8f5e713e934d" (UID: "e5fac46a-545d-4f30-a7ab-8f5e713e934d"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: E1122 08:46:00.275454 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5\": container with ID starting with 59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5 not found: ID does not exist" containerID="59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.275488 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5"} err="failed to get container status \"59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5\": rpc error: code = NotFound desc = could not find container \"59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5\": container with ID starting with 59d916858b087416734785e00eedae29ca8cf12c25cb89cdd38a538f993e76c5 not found: ID does not exist" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.275514 4743 scope.go:117] "RemoveContainer" containerID="9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27" Nov 22 08:46:00 crc kubenswrapper[4743]: E1122 08:46:00.275799 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27\": container with ID starting with 9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27 not found: ID does not exist" containerID="9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.275826 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27"} err="failed to get container status \"9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27\": rpc error: code = NotFound desc = could not find container \"9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27\": container with ID starting with 9a702619958ecbc9da3c83f06a50a7a7cef93b5f9690bd840a5ecc80273d3e27 not found: ID does not exist" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.275847 4743 scope.go:117] "RemoveContainer" containerID="005a94ed2d08108098c7e56929f699b03c95b20224af467c218ff109ef554523" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.296189 4743 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d63130-217d-400e-afc5-6b6bb3d56658-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.296223 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.296235 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.296248 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e5fac46a-545d-4f30-a7ab-8f5e713e934d-server-conf\") on node \"crc\" DevicePath \"\"" Nov 
22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.296261 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e5fac46a-545d-4f30-a7ab-8f5e713e934d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.296272 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.296284 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5b21104-eefe-4583-9af8-731d561b78c2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.369464 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b9817865-d957-42d3-8edb-6800e1075d23/ovn-northd/0.log" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.369522 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.499603 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-config\") pod \"b9817865-d957-42d3-8edb-6800e1075d23\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.499727 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp55t\" (UniqueName: \"kubernetes.io/projected/b9817865-d957-42d3-8edb-6800e1075d23-kube-api-access-fp55t\") pod \"b9817865-d957-42d3-8edb-6800e1075d23\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.499774 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-metrics-certs-tls-certs\") pod \"b9817865-d957-42d3-8edb-6800e1075d23\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.499798 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-combined-ca-bundle\") pod \"b9817865-d957-42d3-8edb-6800e1075d23\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.499837 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9817865-d957-42d3-8edb-6800e1075d23-ovn-rundir\") pod \"b9817865-d957-42d3-8edb-6800e1075d23\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.499865 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-ovn-northd-tls-certs\") pod \"b9817865-d957-42d3-8edb-6800e1075d23\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.499916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-scripts\") pod \"b9817865-d957-42d3-8edb-6800e1075d23\" (UID: \"b9817865-d957-42d3-8edb-6800e1075d23\") " Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.500155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-config" (OuterVolumeSpecName: "config") pod "b9817865-d957-42d3-8edb-6800e1075d23" (UID: "b9817865-d957-42d3-8edb-6800e1075d23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.504605 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9817865-d957-42d3-8edb-6800e1075d23-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "b9817865-d957-42d3-8edb-6800e1075d23" (UID: "b9817865-d957-42d3-8edb-6800e1075d23"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.504904 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9817865-d957-42d3-8edb-6800e1075d23-kube-api-access-fp55t" (OuterVolumeSpecName: "kube-api-access-fp55t") pod "b9817865-d957-42d3-8edb-6800e1075d23" (UID: "b9817865-d957-42d3-8edb-6800e1075d23"). InnerVolumeSpecName "kube-api-access-fp55t". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.525203 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.525497 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-scripts" (OuterVolumeSpecName: "scripts") pod "b9817865-d957-42d3-8edb-6800e1075d23" (UID: "b9817865-d957-42d3-8edb-6800e1075d23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.551391 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9817865-d957-42d3-8edb-6800e1075d23" (UID: "b9817865-d957-42d3-8edb-6800e1075d23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.578235 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "b9817865-d957-42d3-8edb-6800e1075d23" (UID: "b9817865-d957-42d3-8edb-6800e1075d23"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.581390 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b9817865-d957-42d3-8edb-6800e1075d23" (UID: "b9817865-d957-42d3-8edb-6800e1075d23"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.627079 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp55t\" (UniqueName: \"kubernetes.io/projected/b9817865-d957-42d3-8edb-6800e1075d23-kube-api-access-fp55t\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.627108 4743 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.627118 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.627126 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b9817865-d957-42d3-8edb-6800e1075d23-ovn-rundir\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.627135 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9817865-d957-42d3-8edb-6800e1075d23-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.627143 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9817865-d957-42d3-8edb-6800e1075d23-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.823638 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder89a7-account-delete-gjvvg"] Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.842647 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-89a7-account-create-g24xd"] Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.848167 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-96rh7"] Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.851211 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-66c4f7f76d-b9q4p" event={"ID":"f5b21104-eefe-4583-9af8-731d561b78c2","Type":"ContainerDied","Data":"645a9beeeb4567422b02850f9ebfc65784da718ccac8780aca3db28db4c0fb2b"} Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.851269 4743 scope.go:117] "RemoveContainer" containerID="8e1277095f530b9d213cf681f4500af6bf174fddfe554f6556001dae2a813e03" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.851411 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-66c4f7f76d-b9q4p" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.880112 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder89a7-account-delete-gjvvg"] Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.890864 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_b9817865-d957-42d3-8edb-6800e1075d23/ovn-northd/0.log" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.890946 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"b9817865-d957-42d3-8edb-6800e1075d23","Type":"ContainerDied","Data":"d99accdee1e475696e1a788c3d6b69812093615938aecdf117c478b35231380f"} Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.891058 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.911892 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-96rh7"] Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.913536 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d3d63130-217d-400e-afc5-6b6bb3d56658","Type":"ContainerDied","Data":"c6eafdf2e1a185ee3be73f8fb0387c62cc373faf5a53ee47f0ce1c525ad711e3"} Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.913678 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.923309 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.927525 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-89a7-account-create-g24xd"] Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.938879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e5fac46a-545d-4f30-a7ab-8f5e713e934d","Type":"ContainerDied","Data":"76ce132c90151d9f020a53331ee30677627ac885b042cd6fe138821b149b063b"} Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.939181 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 08:46:00 crc kubenswrapper[4743]: I1122 08:46:00.971763 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-66c4f7f76d-b9q4p"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:00.999759 4743 scope.go:117] "RemoveContainer" containerID="1f420d1e2699e276d82c94d18dd411a4b04324712350d57b7ef8e6cdb952414a" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.000900 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-66c4f7f76d-b9q4p"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.046957 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-s4q44"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.062652 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-s4q44"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.063938 4743 scope.go:117] "RemoveContainer" containerID="44e22b0e556cf479c4ab148fe02b8b602f8d6a658164bfd210e15bbe9a5c9282" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.099852 4743 scope.go:117] "RemoveContainer" containerID="9b8889acf70f714f96a12fef606aa84ad0497560bdf80aa3dddac96e77684d76" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.108449 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-0984-account-create-dwgfw"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.130415 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement0984-account-delete-82zvj"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.130782 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement0984-account-delete-82zvj" podUID="9375da2b-3776-4c32-8afd-d1ed7b22b308" containerName="mariadb-account-delete" containerID="cri-o://6ef5c4847a495226bbd00d72b1c06fd2666d37d7018b465aac2d955130276623" gracePeriod=30 Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.139912 4743 scope.go:117] "RemoveContainer" containerID="a13a6453f504348b1fc37cbf718993799543c1d0ff16da7e91ea05103e4dfc4b" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.143462 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-0984-account-create-dwgfw"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.149591 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.168409 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="054d2889-0839-4b71-9515-904051c64bc7" path="/var/lib/kubelet/pods/054d2889-0839-4b71-9515-904051c64bc7/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.170022 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11c59cd3-7ee4-43f3-83ce-9d22824473d7" path="/var/lib/kubelet/pods/11c59cd3-7ee4-43f3-83ce-9d22824473d7/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.170521 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2019c268-c5f1-4eff-aa27-6f26c3f37dfa" path="/var/lib/kubelet/pods/2019c268-c5f1-4eff-aa27-6f26c3f37dfa/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.171641 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29bf9036-d8fc-43f7-9153-f133a723c6df" path="/var/lib/kubelet/pods/29bf9036-d8fc-43f7-9153-f133a723c6df/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.172408 4743 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" path="/var/lib/kubelet/pods/2b24dd85-d686-4fb0-be74-7aca0b03255c/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.173011 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b3ae5bd-983a-4ef5-95a6-52f6db24ac82" path="/var/lib/kubelet/pods/4b3ae5bd-983a-4ef5-95a6-52f6db24ac82/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.173900 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7936e330-2138-4624-b319-902f6a4941ec" path="/var/lib/kubelet/pods/7936e330-2138-4624-b319-902f6a4941ec/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.174433 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="861e40f8-c596-40a1-b192-2fa51f567b55" path="/var/lib/kubelet/pods/861e40f8-c596-40a1-b192-2fa51f567b55/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.174935 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" path="/var/lib/kubelet/pods/89d8e638-b97a-4273-9391-5e0c7dd1bfb1/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.175933 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fe5d70f-5277-4803-ae45-de61d0eefe27" path="/var/lib/kubelet/pods/8fe5d70f-5277-4803-ae45-de61d0eefe27/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.176501 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" path="/var/lib/kubelet/pods/abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.177099 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c61760fb-827b-4199-bfdb-52698c7b4824" path="/var/lib/kubelet/pods/c61760fb-827b-4199-bfdb-52698c7b4824/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.178624 4743 scope.go:117] "RemoveContainer" containerID="6e1b913f0b8534fa70afd00b409ba87dcd773b31786f2f1c5518bc5e04e427a8" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.178756 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c685a13e-8100-43c1-a0c4-417a12135281" path="/var/lib/kubelet/pods/c685a13e-8100-43c1-a0c4-417a12135281/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.179727 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3a93a60-b315-4de2-96d7-d23c9cedbc9c" path="/var/lib/kubelet/pods/d3a93a60-b315-4de2-96d7-d23c9cedbc9c/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.180334 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d96211ff-f7ba-4e26-ae39-43c8062e2277" path="/var/lib/kubelet/pods/d96211ff-f7ba-4e26-ae39-43c8062e2277/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.181453 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da8955a2-6deb-440c-97e3-f2420aa5fae8" path="/var/lib/kubelet/pods/da8955a2-6deb-440c-97e3-f2420aa5fae8/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.183955 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" path="/var/lib/kubelet/pods/db905ec2-675e-48ea-a051-ed3d78c35797/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.185296 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" 
path="/var/lib/kubelet/pods/dc034ce8-656e-4c88-92f1-18f384ae1a18/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.186432 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" path="/var/lib/kubelet/pods/dca6d95c-89d6-4b49-bf28-2606b9b5c05e/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.187906 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5b21104-eefe-4583-9af8-731d561b78c2" path="/var/lib/kubelet/pods/f5b21104-eefe-4583-9af8-731d561b78c2/volumes" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.188387 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.188410 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.188423 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.188432 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.188441 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.188451 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.193823 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.220914 4743 scope.go:117] "RemoveContainer" containerID="f84c516977fabf4d12664420973d6cd6aad1dffc3d9d6296c1edb5fc3318472d" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.251320 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.251363 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.251400 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.252093 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"986e2145b00a5a447ecb09e84f860b781baabf3cc2562b60d26d99a571cd2cc8"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.252161 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" 
containerID="cri-o://986e2145b00a5a447ecb09e84f860b781baabf3cc2562b60d26d99a571cd2cc8" gracePeriod=600 Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.363400 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.556271 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-config-data\") pod \"500679c5-1691-4831-b5ec-3c6cce19c503\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.556389 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-combined-ca-bundle\") pod \"500679c5-1691-4831-b5ec-3c6cce19c503\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.556560 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcjdh\" (UniqueName: \"kubernetes.io/projected/500679c5-1691-4831-b5ec-3c6cce19c503-kube-api-access-mcjdh\") pod \"500679c5-1691-4831-b5ec-3c6cce19c503\" (UID: \"500679c5-1691-4831-b5ec-3c6cce19c503\") " Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.561392 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/500679c5-1691-4831-b5ec-3c6cce19c503-kube-api-access-mcjdh" (OuterVolumeSpecName: "kube-api-access-mcjdh") pod "500679c5-1691-4831-b5ec-3c6cce19c503" (UID: "500679c5-1691-4831-b5ec-3c6cce19c503"). InnerVolumeSpecName "kube-api-access-mcjdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.585860 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-config-data" (OuterVolumeSpecName: "config-data") pod "500679c5-1691-4831-b5ec-3c6cce19c503" (UID: "500679c5-1691-4831-b5ec-3c6cce19c503"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.592788 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "500679c5-1691-4831-b5ec-3c6cce19c503" (UID: "500679c5-1691-4831-b5ec-3c6cce19c503"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.659480 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.659521 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/500679c5-1691-4831-b5ec-3c6cce19c503-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.659538 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcjdh\" (UniqueName: \"kubernetes.io/projected/500679c5-1691-4831-b5ec-3c6cce19c503-kube-api-access-mcjdh\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.965760 4743 generic.go:334] "Generic (PLEG): container finished" podID="500679c5-1691-4831-b5ec-3c6cce19c503" containerID="aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" exitCode=0 Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.965809 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.965823 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"500679c5-1691-4831-b5ec-3c6cce19c503","Type":"ContainerDied","Data":"aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8"} Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.966293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"500679c5-1691-4831-b5ec-3c6cce19c503","Type":"ContainerDied","Data":"a20de782e28e0e8379c6797d783a4046209389085a5974006ff7b1c59c9dc05c"} Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.966316 4743 scope.go:117] "RemoveContainer" containerID="aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.980417 4743 generic.go:334] "Generic (PLEG): container finished" podID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerID="37fbdedccc446a670cae08eb9632f568f7a7eccdeada9e4e6146ebf59a36b2e3" exitCode=0 Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.980467 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58b7a46d-98c7-4e9e-94df-80d359fd68c7","Type":"ContainerDied","Data":"37fbdedccc446a670cae08eb9632f568f7a7eccdeada9e4e6146ebf59a36b2e3"} Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.980529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"58b7a46d-98c7-4e9e-94df-80d359fd68c7","Type":"ContainerDied","Data":"89b4182dfa6e79f4534d0c5ef9b9ec4a209bc42eddfd081856572c80729ad686"} Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.980548 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b4182dfa6e79f4534d0c5ef9b9ec4a209bc42eddfd081856572c80729ad686" Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.991403 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="986e2145b00a5a447ecb09e84f860b781baabf3cc2562b60d26d99a571cd2cc8" exitCode=0 Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.991453 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"986e2145b00a5a447ecb09e84f860b781baabf3cc2562b60d26d99a571cd2cc8"} Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.991487 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5"} Nov 22 08:46:01 crc kubenswrapper[4743]: I1122 08:46:01.998796 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.007791 4743 scope.go:117] "RemoveContainer" containerID="aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" Nov 22 08:46:02 crc kubenswrapper[4743]: E1122 08:46:02.008946 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8\": container with ID starting with aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8 not found: ID does not exist" containerID="aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.009164 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8"} err="failed to get container status \"aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8\": rpc error: code = NotFound desc = could not find container \"aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8\": container with ID starting with aa4d80805a1c83526f4d1a786c2012497b3c2920e132d3d5a0da8fd9766dc0e8 not found: ID does not exist" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.009330 4743 scope.go:117] "RemoveContainer" containerID="5b78c811f3b4d026db1f9c1117668378bf529a268e8a2883a781fcb01b039225" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.074839 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.095518 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.165215 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-run-httpd\") pod \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.165859 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdlbj\" (UniqueName: \"kubernetes.io/projected/58b7a46d-98c7-4e9e-94df-80d359fd68c7-kube-api-access-wdlbj\") pod \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.165746 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "58b7a46d-98c7-4e9e-94df-80d359fd68c7" (UID: "58b7a46d-98c7-4e9e-94df-80d359fd68c7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.166128 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-config-data\") pod \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.166523 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-scripts\") pod \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.166650 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-ceilometer-tls-certs\") pod \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.166740 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-sg-core-conf-yaml\") pod \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.166884 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-log-httpd\") pod \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.166990 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-combined-ca-bundle\") pod \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\" (UID: \"58b7a46d-98c7-4e9e-94df-80d359fd68c7\") " Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.167159 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "58b7a46d-98c7-4e9e-94df-80d359fd68c7" (UID: "58b7a46d-98c7-4e9e-94df-80d359fd68c7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.167531 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.167823 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/58b7a46d-98c7-4e9e-94df-80d359fd68c7-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.170324 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58b7a46d-98c7-4e9e-94df-80d359fd68c7-kube-api-access-wdlbj" (OuterVolumeSpecName: "kube-api-access-wdlbj") pod "58b7a46d-98c7-4e9e-94df-80d359fd68c7" (UID: "58b7a46d-98c7-4e9e-94df-80d359fd68c7"). InnerVolumeSpecName "kube-api-access-wdlbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.170592 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-scripts" (OuterVolumeSpecName: "scripts") pod "58b7a46d-98c7-4e9e-94df-80d359fd68c7" (UID: "58b7a46d-98c7-4e9e-94df-80d359fd68c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.191144 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "58b7a46d-98c7-4e9e-94df-80d359fd68c7" (UID: "58b7a46d-98c7-4e9e-94df-80d359fd68c7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.208376 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": dial tcp 10.217.0.202:8775: i/o timeout" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.208428 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": context deadline exceeded" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.223495 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "58b7a46d-98c7-4e9e-94df-80d359fd68c7" (UID: "58b7a46d-98c7-4e9e-94df-80d359fd68c7"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.226796 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58b7a46d-98c7-4e9e-94df-80d359fd68c7" (UID: "58b7a46d-98c7-4e9e-94df-80d359fd68c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.247521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-config-data" (OuterVolumeSpecName: "config-data") pod "58b7a46d-98c7-4e9e-94df-80d359fd68c7" (UID: "58b7a46d-98c7-4e9e-94df-80d359fd68c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.269879 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.269915 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.269925 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.269935 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdlbj\" (UniqueName: \"kubernetes.io/projected/58b7a46d-98c7-4e9e-94df-80d359fd68c7-kube-api-access-wdlbj\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.269945 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:02 crc kubenswrapper[4743]: I1122 08:46:02.269954 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58b7a46d-98c7-4e9e-94df-80d359fd68c7-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:02 crc kubenswrapper[4743]: E1122 08:46:02.270606 4743 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 22 08:46:02 crc kubenswrapper[4743]: E1122 08:46:02.270788 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts podName:9375da2b-3776-4c32-8afd-d1ed7b22b308 nodeName:}" failed. No retries permitted until 2025-11-22 08:46:06.270764304 +0000 UTC m=+1439.977125356 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts") pod "placement0984-account-delete-82zvj" (UID: "9375da2b-3776-4c32-8afd-d1ed7b22b308") : configmap "openstack-scripts" not found Nov 22 08:46:03 crc kubenswrapper[4743]: I1122 08:46:03.005394 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 08:46:03 crc kubenswrapper[4743]: I1122 08:46:03.045608 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:46:03 crc kubenswrapper[4743]: I1122 08:46:03.054633 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 08:46:03 crc kubenswrapper[4743]: E1122 08:46:03.126290 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:03 crc kubenswrapper[4743]: E1122 08:46:03.126950 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:03 crc kubenswrapper[4743]: E1122 08:46:03.127352 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:03 crc kubenswrapper[4743]: E1122 08:46:03.127432 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server" Nov 22 08:46:03 crc kubenswrapper[4743]: E1122 08:46:03.127904 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:03 crc kubenswrapper[4743]: E1122 08:46:03.129798 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:03 crc kubenswrapper[4743]: E1122 08:46:03.134106 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:03 crc kubenswrapper[4743]: E1122 08:46:03.134350 4743 prober.go:104] "Probe errored" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd" Nov 22 08:46:03 crc kubenswrapper[4743]: I1122 08:46:03.162241 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="500679c5-1691-4831-b5ec-3c6cce19c503" path="/var/lib/kubelet/pods/500679c5-1691-4831-b5ec-3c6cce19c503/volumes" Nov 22 08:46:03 crc kubenswrapper[4743]: I1122 08:46:03.163144 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" path="/var/lib/kubelet/pods/58b7a46d-98c7-4e9e-94df-80d359fd68c7/volumes" Nov 22 08:46:03 crc kubenswrapper[4743]: I1122 08:46:03.164077 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9817865-d957-42d3-8edb-6800e1075d23" path="/var/lib/kubelet/pods/b9817865-d957-42d3-8edb-6800e1075d23/volumes" Nov 22 08:46:03 crc kubenswrapper[4743]: I1122 08:46:03.165509 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3d63130-217d-400e-afc5-6b6bb3d56658" path="/var/lib/kubelet/pods/d3d63130-217d-400e-afc5-6b6bb3d56658/volumes" Nov 22 08:46:03 crc kubenswrapper[4743]: I1122 08:46:03.166693 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5fac46a-545d-4f30-a7ab-8f5e713e934d" path="/var/lib/kubelet/pods/e5fac46a-545d-4f30-a7ab-8f5e713e934d/volumes" Nov 22 08:46:03 crc kubenswrapper[4743]: I1122 08:46:03.168919 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" path="/var/lib/kubelet/pods/ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1/volumes" Nov 22 08:46:06 crc kubenswrapper[4743]: E1122 08:46:06.333400 4743 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 22 08:46:06 crc kubenswrapper[4743]: E1122 08:46:06.334545 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts podName:9375da2b-3776-4c32-8afd-d1ed7b22b308 nodeName:}" failed. No retries permitted until 2025-11-22 08:46:14.33452495 +0000 UTC m=+1448.040886002 (durationBeforeRetry 8s). 
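Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts") pod "placement0984-account-delete-82zvj" (UID: "9375da2b-3776-4c32-8afd-d1ed7b22b308") : configmap "openstack-scripts" not found

The failed mount above is retrying with a doubling delay: durationBeforeRetry was 4s at 08:46:02, is 8s here at 08:46:06, and reaches 16s at 08:46:14 below, the standard exponential-backoff shape for volume operations. A small sketch of that progression; the 4s starting point is taken from the log, while the cap is an assumption:

```go
// Sketch of the doubling retry delay visible in the mount errors
// ("durationBeforeRetry 4s" -> 8s -> 16s). The maximum is assumed.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 4 * time.Second      // first value seen in the log
	const maxDelay = 2 * time.Minute // assumed cap, not from the log
	for attempt := 1; attempt <= 5; attempt++ {
		fmt.Printf("attempt %d failed; no retries permitted for %s\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```

The retries will keep failing until the openstack-scripts configmap reappears or the placement0984-account-delete-82zvj pod (which is already being killed above) is removed.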
Nov 22 08:46:08 crc kubenswrapper[4743]: E1122 08:46:08.126372 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:08 crc kubenswrapper[4743]: E1122 08:46:08.128571 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:08 crc kubenswrapper[4743]: E1122 08:46:08.128646 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:08 crc kubenswrapper[4743]: E1122 08:46:08.130663 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:08 crc kubenswrapper[4743]: E1122 08:46:08.131108 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:08 crc kubenswrapper[4743]: E1122 08:46:08.131212 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server" Nov 22 08:46:08 crc kubenswrapper[4743]: E1122 08:46:08.132035 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:08 crc kubenswrapper[4743]: E1122 08:46:08.132097 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID:
container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd" Nov 22 08:46:13 crc kubenswrapper[4743]: E1122 08:46:13.125594 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:13 crc kubenswrapper[4743]: E1122 08:46:13.126848 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:13 crc kubenswrapper[4743]: E1122 08:46:13.126923 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:13 crc kubenswrapper[4743]: E1122 08:46:13.127186 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:13 crc kubenswrapper[4743]: E1122 08:46:13.127228 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server" Nov 22 08:46:13 crc kubenswrapper[4743]: E1122 08:46:13.132324 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:13 crc kubenswrapper[4743]: E1122 08:46:13.134362 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:13 crc kubenswrapper[4743]: E1122 08:46:13.134489 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
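pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd"

The two readiness probes above keep failing for different reasons: ovsdb-server's process has already exited, so ExecSync returns NotFound, while ovs-vswitchd is still stopping, so the runtime refuses to register a new exec PID. An illustrative sketch of that state split (the types and messages are stand-ins, not the CRI API):

```go
// Hedged sketch of why the two containers report different errors.
// containerState and execSync are made up for illustration only.
package main

import "fmt"

type containerState int

const (
	running containerState = iota
	stopping
	exited
)

// execSync models what the runtime answers when asked to run a probe
// script in a container that is in the given state.
func execSync(state containerState, cmd string) error {
	switch state {
	case exited:
		// ovsdb-server above: the process is already gone.
		return fmt.Errorf("NotFound: container is not created or running")
	case stopping:
		// ovs-vswitchd above: teardown is in progress.
		return fmt.Errorf("cannot register an exec PID: container is stopping")
	default:
		return nil // the readiness script would actually run here
	}
}

func main() {
	fmt.Println(execSync(exited, "/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"))
	fmt.Println(execSync(stopping, "/usr/local/bin/container-scripts/vswitchd_readiness.sh"))
}
```

Both probes repeat on the same cadence (08:46:03, 08, 13, 18 in this log) until the pod's containers are finally torn down.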
pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd" Nov 22 08:46:14 crc kubenswrapper[4743]: E1122 08:46:14.366096 4743 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Nov 22 08:46:14 crc kubenswrapper[4743]: E1122 08:46:14.366185 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts podName:9375da2b-3776-4c32-8afd-d1ed7b22b308 nodeName:}" failed. No retries permitted until 2025-11-22 08:46:30.366168898 +0000 UTC m=+1464.072529950 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts") pod "placement0984-account-delete-82zvj" (UID: "9375da2b-3776-4c32-8afd-d1ed7b22b308") : configmap "openstack-scripts" not found Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.035703 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.146242 4743 generic.go:334] "Generic (PLEG): container finished" podID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" containerID="f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424" exitCode=0 Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.146277 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568cf9dfc-ghfzl" event={"ID":"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48","Type":"ContainerDied","Data":"f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424"} Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.146303 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5568cf9dfc-ghfzl" event={"ID":"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48","Type":"ContainerDied","Data":"8faad5719e36efa859c72dde84b12b5cae9cc0bcb0b55d021f8b97425c658e9d"} Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.146319 4743 scope.go:117] "RemoveContainer" containerID="59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.146370 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5568cf9dfc-ghfzl" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.174892 4743 scope.go:117] "RemoveContainer" containerID="f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.176562 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-ovndb-tls-certs\") pod \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.176657 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-combined-ca-bundle\") pod \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.176697 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hrhs\" (UniqueName: \"kubernetes.io/projected/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-kube-api-access-7hrhs\") pod \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.176729 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-internal-tls-certs\") pod \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.176753 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-httpd-config\") pod \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.176785 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-config\") pod \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.177279 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-public-tls-certs\") pod \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\" (UID: \"fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48\") " Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.184720 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" (UID: "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.189284 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-kube-api-access-7hrhs" (OuterVolumeSpecName: "kube-api-access-7hrhs") pod "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" (UID: "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48"). 
InnerVolumeSpecName "kube-api-access-7hrhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.197441 4743 scope.go:117] "RemoveContainer" containerID="59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3" Nov 22 08:46:15 crc kubenswrapper[4743]: E1122 08:46:15.198006 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3\": container with ID starting with 59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3 not found: ID does not exist" containerID="59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.198053 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3"} err="failed to get container status \"59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3\": rpc error: code = NotFound desc = could not find container \"59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3\": container with ID starting with 59e86fdbf507dba327beaec84759043d11d102a3abb3931d091a4afb31ec3fc3 not found: ID does not exist" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.198085 4743 scope.go:117] "RemoveContainer" containerID="f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424" Nov 22 08:46:15 crc kubenswrapper[4743]: E1122 08:46:15.198493 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424\": container with ID starting with f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424 not found: ID does not exist" containerID="f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.198553 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424"} err="failed to get container status \"f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424\": rpc error: code = NotFound desc = could not find container \"f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424\": container with ID starting with f0552cae968565e0fc5419878b1747057005f33e8c628ca1ba7961154ba93424 not found: ID does not exist" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.215821 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" (UID: "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.216861 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" (UID: "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.218811 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-config" (OuterVolumeSpecName: "config") pod "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" (UID: "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.222180 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" (UID: "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.235082 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" (UID: "fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.279765 4743 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-public-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.279823 4743 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.279836 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.279848 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hrhs\" (UniqueName: \"kubernetes.io/projected/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-kube-api-access-7hrhs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.279875 4743 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.279885 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.279895 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48-config\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.484144 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5568cf9dfc-ghfzl"] Nov 22 08:46:15 crc kubenswrapper[4743]: I1122 08:46:15.490777 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5568cf9dfc-ghfzl"] Nov 22 
08:46:17 crc kubenswrapper[4743]: I1122 08:46:17.161032 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" path="/var/lib/kubelet/pods/fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48/volumes" Nov 22 08:46:18 crc kubenswrapper[4743]: E1122 08:46:18.125401 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:18 crc kubenswrapper[4743]: E1122 08:46:18.126021 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:18 crc kubenswrapper[4743]: E1122 08:46:18.126928 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Nov 22 08:46:18 crc kubenswrapper[4743]: E1122 08:46:18.127001 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server" Nov 22 08:46:18 crc kubenswrapper[4743]: E1122 08:46:18.127333 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:18 crc kubenswrapper[4743]: E1122 08:46:18.131311 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:18 crc kubenswrapper[4743]: E1122 08:46:18.133399 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Nov 22 08:46:18 crc kubenswrapper[4743]: E1122 08:46:18.133460 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit 
code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-mz9kc" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.169869 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mz9kc_03685c6a-5ae9-45cf-b66d-5210d4811bda/ovs-vswitchd/0.log" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.172035 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.209431 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-mz9kc_03685c6a-5ae9-45cf-b66d-5210d4811bda/ovs-vswitchd/0.log" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.210471 4743 generic.go:334] "Generic (PLEG): container finished" podID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" exitCode=137 Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.210508 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mz9kc" event={"ID":"03685c6a-5ae9-45cf-b66d-5210d4811bda","Type":"ContainerDied","Data":"0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df"} Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.210535 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-mz9kc" event={"ID":"03685c6a-5ae9-45cf-b66d-5210d4811bda","Type":"ContainerDied","Data":"e52c0670bd78e3fedbdaf421e24fb03c508396f93d62fc1f36c75c01b05f5630"} Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.210552 4743 scope.go:117] "RemoveContainer" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.210688 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-mz9kc" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.231558 4743 scope.go:117] "RemoveContainer" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.258432 4743 scope.go:117] "RemoveContainer" containerID="5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.287882 4743 scope.go:117] "RemoveContainer" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" Nov 22 08:46:22 crc kubenswrapper[4743]: E1122 08:46:22.288481 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df\": container with ID starting with 0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df not found: ID does not exist" containerID="0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.288549 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df"} err="failed to get container status \"0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df\": rpc error: code = NotFound desc = could not find container \"0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df\": container with ID starting with 0a43324c3cc0a4ee2af5b68bed088914833679f68bcfece6d3c8afdf836798df not found: ID does not exist" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.288608 4743 scope.go:117] "RemoveContainer" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" Nov 22 08:46:22 crc kubenswrapper[4743]: E1122 08:46:22.289090 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e\": container with ID starting with 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e not found: ID does not exist" containerID="6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.289165 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e"} err="failed to get container status \"6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e\": rpc error: code = NotFound desc = could not find container \"6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e\": container with ID starting with 6c178817705e502ff5e5e7c7588c686dcd057d43c4b5fd62fe833f120bb2bb4e not found: ID does not exist" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.289192 4743 scope.go:117] "RemoveContainer" containerID="5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024" Nov 22 08:46:22 crc kubenswrapper[4743]: E1122 08:46:22.289619 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024\": container with ID starting with 5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024 not found: ID does not exist" containerID="5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024" Nov 22 
08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.289653 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024"} err="failed to get container status \"5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024\": rpc error: code = NotFound desc = could not find container \"5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024\": container with ID starting with 5b619e1f2c89db1627000498842ad3c99e5be561b2d6b1d969ebe3b8f5728024 not found: ID does not exist" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294172 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03685c6a-5ae9-45cf-b66d-5210d4811bda-scripts\") pod \"03685c6a-5ae9-45cf-b66d-5210d4811bda\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294238 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8flq\" (UniqueName: \"kubernetes.io/projected/03685c6a-5ae9-45cf-b66d-5210d4811bda-kube-api-access-c8flq\") pod \"03685c6a-5ae9-45cf-b66d-5210d4811bda\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294342 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-log\") pod \"03685c6a-5ae9-45cf-b66d-5210d4811bda\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294371 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-lib\") pod \"03685c6a-5ae9-45cf-b66d-5210d4811bda\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294388 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-run\") pod \"03685c6a-5ae9-45cf-b66d-5210d4811bda\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294404 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-etc-ovs\") pod \"03685c6a-5ae9-45cf-b66d-5210d4811bda\" (UID: \"03685c6a-5ae9-45cf-b66d-5210d4811bda\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294471 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-lib" (OuterVolumeSpecName: "var-lib") pod "03685c6a-5ae9-45cf-b66d-5210d4811bda" (UID: "03685c6a-5ae9-45cf-b66d-5210d4811bda"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294476 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-log" (OuterVolumeSpecName: "var-log") pod "03685c6a-5ae9-45cf-b66d-5210d4811bda" (UID: "03685c6a-5ae9-45cf-b66d-5210d4811bda"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294489 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-run" (OuterVolumeSpecName: "var-run") pod "03685c6a-5ae9-45cf-b66d-5210d4811bda" (UID: "03685c6a-5ae9-45cf-b66d-5210d4811bda"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294607 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "03685c6a-5ae9-45cf-b66d-5210d4811bda" (UID: "03685c6a-5ae9-45cf-b66d-5210d4811bda"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294909 4743 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-log\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294930 4743 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-lib\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294939 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-var-run\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.294948 4743 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/03685c6a-5ae9-45cf-b66d-5210d4811bda-etc-ovs\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.296062 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03685c6a-5ae9-45cf-b66d-5210d4811bda-scripts" (OuterVolumeSpecName: "scripts") pod "03685c6a-5ae9-45cf-b66d-5210d4811bda" (UID: "03685c6a-5ae9-45cf-b66d-5210d4811bda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.300353 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03685c6a-5ae9-45cf-b66d-5210d4811bda-kube-api-access-c8flq" (OuterVolumeSpecName: "kube-api-access-c8flq") pod "03685c6a-5ae9-45cf-b66d-5210d4811bda" (UID: "03685c6a-5ae9-45cf-b66d-5210d4811bda"). InnerVolumeSpecName "kube-api-access-c8flq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.396865 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03685c6a-5ae9-45cf-b66d-5210d4811bda-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.396909 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8flq\" (UniqueName: \"kubernetes.io/projected/03685c6a-5ae9-45cf-b66d-5210d4811bda-kube-api-access-c8flq\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.537548 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-mz9kc"] Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.543212 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-mz9kc"] Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.731018 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.896124 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.904527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") pod \"1638fe70-d5cb-4edc-9513-e5ae475c0909\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.904671 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-lock\") pod \"1638fe70-d5cb-4edc-9513-e5ae475c0909\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.904726 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlm8c\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-kube-api-access-tlm8c\") pod \"1638fe70-d5cb-4edc-9513-e5ae475c0909\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.904807 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"1638fe70-d5cb-4edc-9513-e5ae475c0909\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.904854 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-cache\") pod \"1638fe70-d5cb-4edc-9513-e5ae475c0909\" (UID: \"1638fe70-d5cb-4edc-9513-e5ae475c0909\") " Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.905262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-lock" (OuterVolumeSpecName: "lock") pod "1638fe70-d5cb-4edc-9513-e5ae475c0909" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.905743 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-cache" (OuterVolumeSpecName: "cache") pod "1638fe70-d5cb-4edc-9513-e5ae475c0909" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.909211 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-kube-api-access-tlm8c" (OuterVolumeSpecName: "kube-api-access-tlm8c") pod "1638fe70-d5cb-4edc-9513-e5ae475c0909" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909"). InnerVolumeSpecName "kube-api-access-tlm8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.909722 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1638fe70-d5cb-4edc-9513-e5ae475c0909" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:22 crc kubenswrapper[4743]: I1122 08:46:22.909753 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "swift") pod "1638fe70-d5cb-4edc-9513-e5ae475c0909" (UID: "1638fe70-d5cb-4edc-9513-e5ae475c0909"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.005660 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-combined-ca-bundle\") pod \"145d3340-8ded-4082-b9c8-7b1a21390097\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006033 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/145d3340-8ded-4082-b9c8-7b1a21390097-etc-machine-id\") pod \"145d3340-8ded-4082-b9c8-7b1a21390097\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006085 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data\") pod \"145d3340-8ded-4082-b9c8-7b1a21390097\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006106 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data-custom\") pod \"145d3340-8ded-4082-b9c8-7b1a21390097\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006180 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/145d3340-8ded-4082-b9c8-7b1a21390097-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "145d3340-8ded-4082-b9c8-7b1a21390097" (UID: "145d3340-8ded-4082-b9c8-7b1a21390097"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006225 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndv6n\" (UniqueName: \"kubernetes.io/projected/145d3340-8ded-4082-b9c8-7b1a21390097-kube-api-access-ndv6n\") pod \"145d3340-8ded-4082-b9c8-7b1a21390097\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006251 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-scripts\") pod \"145d3340-8ded-4082-b9c8-7b1a21390097\" (UID: \"145d3340-8ded-4082-b9c8-7b1a21390097\") " Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006823 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/145d3340-8ded-4082-b9c8-7b1a21390097-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006845 4743 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-etc-swift\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006854 4743 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-lock\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006863 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlm8c\" (UniqueName: \"kubernetes.io/projected/1638fe70-d5cb-4edc-9513-e5ae475c0909-kube-api-access-tlm8c\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006885 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.006893 4743 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1638fe70-d5cb-4edc-9513-e5ae475c0909-cache\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.009293 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "145d3340-8ded-4082-b9c8-7b1a21390097" (UID: "145d3340-8ded-4082-b9c8-7b1a21390097"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.009421 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-scripts" (OuterVolumeSpecName: "scripts") pod "145d3340-8ded-4082-b9c8-7b1a21390097" (UID: "145d3340-8ded-4082-b9c8-7b1a21390097"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.010529 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145d3340-8ded-4082-b9c8-7b1a21390097-kube-api-access-ndv6n" (OuterVolumeSpecName: "kube-api-access-ndv6n") pod "145d3340-8ded-4082-b9c8-7b1a21390097" (UID: "145d3340-8ded-4082-b9c8-7b1a21390097"). InnerVolumeSpecName "kube-api-access-ndv6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.023540 4743 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.044359 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "145d3340-8ded-4082-b9c8-7b1a21390097" (UID: "145d3340-8ded-4082-b9c8-7b1a21390097"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.072528 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data" (OuterVolumeSpecName: "config-data") pod "145d3340-8ded-4082-b9c8-7b1a21390097" (UID: "145d3340-8ded-4082-b9c8-7b1a21390097"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.108521 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.108557 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndv6n\" (UniqueName: \"kubernetes.io/projected/145d3340-8ded-4082-b9c8-7b1a21390097-kube-api-access-ndv6n\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.108570 4743 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.108602 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.108611 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.108622 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/145d3340-8ded-4082-b9c8-7b1a21390097-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.162875 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" path="/var/lib/kubelet/pods/03685c6a-5ae9-45cf-b66d-5210d4811bda/volumes" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.229549 4743 generic.go:334] "Generic (PLEG): 
container finished" podID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerID="c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed" exitCode=137 Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.229609 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed"} Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.230026 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1638fe70-d5cb-4edc-9513-e5ae475c0909","Type":"ContainerDied","Data":"0f90ac6c9f8ede09876ab45a6508579f9dab5b53603c758d13a7dfc4e43bef69"} Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.230064 4743 scope.go:117] "RemoveContainer" containerID="c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.229680 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.233052 4743 generic.go:334] "Generic (PLEG): container finished" podID="145d3340-8ded-4082-b9c8-7b1a21390097" containerID="253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826" exitCode=137 Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.233118 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.233105 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"145d3340-8ded-4082-b9c8-7b1a21390097","Type":"ContainerDied","Data":"253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826"} Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.233193 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"145d3340-8ded-4082-b9c8-7b1a21390097","Type":"ContainerDied","Data":"1962cb16879bf483bc8e74cf26e36f96f2fed16beaf2b0e74f674eb51c6b683e"} Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.261040 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.263923 4743 scope.go:117] "RemoveContainer" containerID="18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.270281 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.275377 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.280755 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.290242 4743 scope.go:117] "RemoveContainer" containerID="a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.309185 4743 scope.go:117] "RemoveContainer" containerID="993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.329715 4743 scope.go:117] "RemoveContainer" containerID="ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.356060 4743 
scope.go:117] "RemoveContainer" containerID="f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.373047 4743 scope.go:117] "RemoveContainer" containerID="bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.397339 4743 scope.go:117] "RemoveContainer" containerID="da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.416102 4743 scope.go:117] "RemoveContainer" containerID="5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.438825 4743 scope.go:117] "RemoveContainer" containerID="174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.460435 4743 scope.go:117] "RemoveContainer" containerID="f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.481633 4743 scope.go:117] "RemoveContainer" containerID="863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.501642 4743 scope.go:117] "RemoveContainer" containerID="08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.520260 4743 scope.go:117] "RemoveContainer" containerID="f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.542415 4743 scope.go:117] "RemoveContainer" containerID="bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.562325 4743 scope.go:117] "RemoveContainer" containerID="c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.563032 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed\": container with ID starting with c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed not found: ID does not exist" containerID="c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.563071 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed"} err="failed to get container status \"c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed\": rpc error: code = NotFound desc = could not find container \"c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed\": container with ID starting with c927e14480e99e5fa3aae16518edf72706fbe6e0db7cc7bf41ff9df35681ceed not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.563100 4743 scope.go:117] "RemoveContainer" containerID="18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.563700 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe\": container with ID starting with 18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe not found: ID does not exist" 
containerID="18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.563764 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe"} err="failed to get container status \"18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe\": rpc error: code = NotFound desc = could not find container \"18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe\": container with ID starting with 18016d212d31dcd8e9019771e050a3c2f2c8b98d61ac29c013a7c0d29f0c9abe not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.563798 4743 scope.go:117] "RemoveContainer" containerID="a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.564146 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f\": container with ID starting with a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f not found: ID does not exist" containerID="a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.564202 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f"} err="failed to get container status \"a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f\": rpc error: code = NotFound desc = could not find container \"a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f\": container with ID starting with a95195ffff5992838a98524eb90743acf2d28f10f758a03c12aed8acb2d6e42f not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.564219 4743 scope.go:117] "RemoveContainer" containerID="993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.564567 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291\": container with ID starting with 993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291 not found: ID does not exist" containerID="993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.564617 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291"} err="failed to get container status \"993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291\": rpc error: code = NotFound desc = could not find container \"993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291\": container with ID starting with 993807971e3fe38a51adbd1b219bfc433b036611015f9ffcf036047443df9291 not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.564660 4743 scope.go:117] "RemoveContainer" containerID="ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.565105 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776\": container with ID starting with ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776 not found: ID does not exist" containerID="ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.565131 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776"} err="failed to get container status \"ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776\": rpc error: code = NotFound desc = could not find container \"ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776\": container with ID starting with ec9feb1e36903530f51e4081f62931d995df7433e4c983a4e4000fd683661776 not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.565145 4743 scope.go:117] "RemoveContainer" containerID="f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.565532 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b\": container with ID starting with f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b not found: ID does not exist" containerID="f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.565565 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b"} err="failed to get container status \"f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b\": rpc error: code = NotFound desc = could not find container \"f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b\": container with ID starting with f3626494247e8a2e92e0f6feeee5a699a3fd40344fc02568ca3ac59cc83b616b not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.565607 4743 scope.go:117] "RemoveContainer" containerID="bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.565931 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84\": container with ID starting with bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84 not found: ID does not exist" containerID="bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.565960 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84"} err="failed to get container status \"bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84\": rpc error: code = NotFound desc = could not find container \"bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84\": container with ID starting with bab2db134206d47ae9bd6b30f04f683e92994550a36ef0ed7b0a94786f7c1f84 not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.566005 4743 scope.go:117] "RemoveContainer" containerID="da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb" Nov 22 08:46:23 crc 
kubenswrapper[4743]: E1122 08:46:23.566612 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb\": container with ID starting with da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb not found: ID does not exist" containerID="da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.566646 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb"} err="failed to get container status \"da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb\": rpc error: code = NotFound desc = could not find container \"da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb\": container with ID starting with da59663fd29d23a13ce99b0f8eba923d7ac3ae1ec7cc5b5f29f1fa56cac439bb not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.566668 4743 scope.go:117] "RemoveContainer" containerID="5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.566932 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4\": container with ID starting with 5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4 not found: ID does not exist" containerID="5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.566953 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4"} err="failed to get container status \"5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4\": rpc error: code = NotFound desc = could not find container \"5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4\": container with ID starting with 5509987d5d5ced9977a07fb9c6a62d6b0dae600a659e5ff215e68baa439b16d4 not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.566968 4743 scope.go:117] "RemoveContainer" containerID="174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.567724 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b\": container with ID starting with 174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b not found: ID does not exist" containerID="174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.567757 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b"} err="failed to get container status \"174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b\": rpc error: code = NotFound desc = could not find container \"174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b\": container with ID starting with 174f838214762b90232d8a45fbcdeeaeaac51eb2bc0ad0e1c786832dbb96539b not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: 
I1122 08:46:23.567802 4743 scope.go:117] "RemoveContainer" containerID="f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.568246 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3\": container with ID starting with f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3 not found: ID does not exist" containerID="f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.568280 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3"} err="failed to get container status \"f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3\": rpc error: code = NotFound desc = could not find container \"f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3\": container with ID starting with f33b36b26e29f5ea2598f30cf3a310aacff9df030f39cf92bdbeb5b74f93a4d3 not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.568302 4743 scope.go:117] "RemoveContainer" containerID="863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.569516 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6\": container with ID starting with 863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6 not found: ID does not exist" containerID="863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.569558 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6"} err="failed to get container status \"863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6\": rpc error: code = NotFound desc = could not find container \"863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6\": container with ID starting with 863a5332cece1e7ce5ae15e2fa474c8c71526ebcab45a6b55cd30386bf1917d6 not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.569598 4743 scope.go:117] "RemoveContainer" containerID="08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.570081 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e\": container with ID starting with 08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e not found: ID does not exist" containerID="08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.570132 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e"} err="failed to get container status \"08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e\": rpc error: code = NotFound desc = could not find container \"08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e\": container 
with ID starting with 08cfe1f9097fa25403bbf13e96d1685d14f831926ca3e60b7bdf1ab3df90261e not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.570172 4743 scope.go:117] "RemoveContainer" containerID="f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.570602 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded\": container with ID starting with f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded not found: ID does not exist" containerID="f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.570629 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded"} err="failed to get container status \"f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded\": rpc error: code = NotFound desc = could not find container \"f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded\": container with ID starting with f652cefcf9320804b59ad536155f0f3d9bcff2bdf7eaec7fd0cc89efc0d51ded not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.570645 4743 scope.go:117] "RemoveContainer" containerID="bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.570973 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf\": container with ID starting with bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf not found: ID does not exist" containerID="bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.571013 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf"} err="failed to get container status \"bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf\": rpc error: code = NotFound desc = could not find container \"bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf\": container with ID starting with bc67388ea964c640cbbf1d345cf0d38dacfa81970af84458b56994de4cc50fbf not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.571058 4743 scope.go:117] "RemoveContainer" containerID="fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.588334 4743 scope.go:117] "RemoveContainer" containerID="253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.608343 4743 scope.go:117] "RemoveContainer" containerID="fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.608807 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af\": container with ID starting with fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af not found: ID does not exist" 
containerID="fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.608864 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af"} err="failed to get container status \"fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af\": rpc error: code = NotFound desc = could not find container \"fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af\": container with ID starting with fddbe94d6f9c3286cd45ff25a15d0d930618663ac4eff680a3c92f472d8332af not found: ID does not exist" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.608891 4743 scope.go:117] "RemoveContainer" containerID="253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826" Nov 22 08:46:23 crc kubenswrapper[4743]: E1122 08:46:23.609243 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826\": container with ID starting with 253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826 not found: ID does not exist" containerID="253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826" Nov 22 08:46:23 crc kubenswrapper[4743]: I1122 08:46:23.609287 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826"} err="failed to get container status \"253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826\": rpc error: code = NotFound desc = could not find container \"253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826\": container with ID starting with 253a1530f6e401d4ea120ec7cf85cf94cc5d0c8844ceb014f1e8200ce768e826 not found: ID does not exist" Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.079385 4743 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5987ad61-2878-4efc-98ca-ea29b123f26e"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5987ad61-2878-4efc-98ca-ea29b123f26e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5987ad61_2878_4efc_98ca_ea29b123f26e.slice" Nov 22 08:46:25 crc kubenswrapper[4743]: E1122 08:46:25.079683 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod5987ad61-2878-4efc-98ca-ea29b123f26e] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod5987ad61-2878-4efc-98ca-ea29b123f26e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5987ad61_2878_4efc_98ca_ea29b123f26e.slice" pod="openstack/openstackclient" podUID="5987ad61-2878-4efc-98ca-ea29b123f26e" Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.085557 4743 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod3a18c86e-9d86-49ee-918f-76de17000e18"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod3a18c86e-9d86-49ee-918f-76de17000e18] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3a18c86e_9d86_49ee_918f_76de17000e18.slice" Nov 22 08:46:25 crc kubenswrapper[4743]: E1122 08:46:25.085683 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod3a18c86e-9d86-49ee-918f-76de17000e18] : unable to destroy cgroup 
Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.079385 4743 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5987ad61-2878-4efc-98ca-ea29b123f26e"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5987ad61-2878-4efc-98ca-ea29b123f26e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5987ad61_2878_4efc_98ca_ea29b123f26e.slice"
Nov 22 08:46:25 crc kubenswrapper[4743]: E1122 08:46:25.079683 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod5987ad61-2878-4efc-98ca-ea29b123f26e] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod5987ad61-2878-4efc-98ca-ea29b123f26e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5987ad61_2878_4efc_98ca_ea29b123f26e.slice" pod="openstack/openstackclient" podUID="5987ad61-2878-4efc-98ca-ea29b123f26e"
Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.085557 4743 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod3a18c86e-9d86-49ee-918f-76de17000e18"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod3a18c86e-9d86-49ee-918f-76de17000e18] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3a18c86e_9d86_49ee_918f_76de17000e18.slice"
Nov 22 08:46:25 crc kubenswrapper[4743]: E1122 08:46:25.085683 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod3a18c86e-9d86-49ee-918f-76de17000e18] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod3a18c86e-9d86-49ee-918f-76de17000e18] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3a18c86e_9d86_49ee_918f_76de17000e18.slice" pod="openstack/ovsdbserver-sb-0" podUID="3a18c86e-9d86-49ee-918f-76de17000e18"
Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.160550 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="145d3340-8ded-4082-b9c8-7b1a21390097" path="/var/lib/kubelet/pods/145d3340-8ded-4082-b9c8-7b1a21390097/volumes"
Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.161180 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" path="/var/lib/kubelet/pods/1638fe70-d5cb-4edc-9513-e5ae475c0909/volumes"
Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.203220 4743 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podaab079ae-b574-40f3-8df0-7deff1356e09"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podaab079ae-b574-40f3-8df0-7deff1356e09] : Timed out while waiting for systemd to remove kubepods-besteffort-podaab079ae_b574_40f3_8df0_7deff1356e09.slice"
Nov 22 08:46:25 crc kubenswrapper[4743]: E1122 08:46:25.203343 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podaab079ae-b574-40f3-8df0-7deff1356e09] : unable to destroy cgroup paths for cgroup [kubepods besteffort podaab079ae-b574-40f3-8df0-7deff1356e09] : Timed out while waiting for systemd to remove kubepods-besteffort-podaab079ae_b574_40f3_8df0_7deff1356e09.slice" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" podUID="aab079ae-b574-40f3-8df0-7deff1356e09"
Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.206656 4743 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod870c700d-9095-4781-ab16-4cce25d24ed2"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod870c700d-9095-4781-ab16-4cce25d24ed2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod870c700d_9095_4781_ab16_4cce25d24ed2.slice"
Nov 22 08:46:25 crc kubenswrapper[4743]: E1122 08:46:25.206702 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod870c700d-9095-4781-ab16-4cce25d24ed2] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod870c700d-9095-4781-ab16-4cce25d24ed2] : Timed out while waiting for systemd to remove kubepods-besteffort-pod870c700d_9095_4781_ab16_4cce25d24ed2.slice" pod="openstack/ovn-controller-metrics-7qctt" podUID="870c700d-9095-4781-ab16-4cce25d24ed2"
Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.263460 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.264209 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-7qctt"
Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.264714 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
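The "Failed to delete cgroup paths" / "Error syncing pod, skipping" pairs above mean systemd did not confirm removal of the pods' transient .slice units within the kubelet's wait window; the pod workers retry, and the "Cleaned up orphaned pod volumes dir" lines indicate the teardown does eventually complete. If the timeout kept recurring, asking systemd directly whether the slices are still loaded would distinguish a slow teardown from a leaked cgroup. A rough sketch meant to be run on the node itself (systemctl in PATH is assumed; the slice names are copied from the entries above):

#!/usr/bin/env python3
# check_slices.py -- is systemd still tracking the pod slices the kubelet
# timed out on? LoadState "not-found" means the slice is already gone.
import subprocess

SLICES = [
    "kubepods-besteffort-pod5987ad61_2878_4efc_98ca_ea29b123f26e.slice",
    "kubepods-besteffort-pod3a18c86e_9d86_49ee_918f_76de17000e18.slice",
    "kubepods-besteffort-podaab079ae_b574_40f3_8df0_7deff1356e09.slice",
    "kubepods-besteffort-pod870c700d_9095_4781_ab16_4cce25d24ed2.slice",
]

for unit in SLICES:
    out = subprocess.run(
        ["systemctl", "show", unit, "--property=LoadState,ActiveState"],
        capture_output=True, text=True,
    ).stdout.strip().replace("\n", " ")
    print(f"{unit}: {out}")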
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-zqvbz" Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.297255 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-7qctt"] Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.305081 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-7qctt"] Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.317080 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.324259 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.331097 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-zqvbz"] Nov 22 08:46:25 crc kubenswrapper[4743]: I1122 08:46:25.337891 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-zqvbz"] Nov 22 08:46:27 crc kubenswrapper[4743]: I1122 08:46:27.160415 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a18c86e-9d86-49ee-918f-76de17000e18" path="/var/lib/kubelet/pods/3a18c86e-9d86-49ee-918f-76de17000e18/volumes" Nov 22 08:46:27 crc kubenswrapper[4743]: I1122 08:46:27.161079 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="870c700d-9095-4781-ab16-4cce25d24ed2" path="/var/lib/kubelet/pods/870c700d-9095-4781-ab16-4cce25d24ed2/volumes" Nov 22 08:46:27 crc kubenswrapper[4743]: I1122 08:46:27.161748 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab079ae-b574-40f3-8df0-7deff1356e09" path="/var/lib/kubelet/pods/aab079ae-b574-40f3-8df0-7deff1356e09/volumes" Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.854155 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6d1b00d_147b_4865_b659_59d06f360797.slice/crio-conmon-bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30ee548a_8838_4d52_867b_4dfdb6c4f641.slice/crio-conmon-d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5310c975_ef7b_4161_ab2e_5ee94b709f9d.slice/crio-9d5ed428f2c04f357aa903343aea3a22402d7ece9dcdf137788ad34286fe9ec5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5310c975_ef7b_4161_ab2e_5ee94b709f9d.slice/crio-conmon-9d5ed428f2c04f357aa903343aea3a22402d7ece9dcdf137788ad34286fe9ec5.scope\": RecentStats: unable to find data in memory cache]" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.885097 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d78kg"] Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888223 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-expirer" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888252 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-expirer" Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 
08:46:28.888284 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerName="ovsdbserver-nb" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888291 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerName="ovsdbserver-nb" Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888297 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-metadata" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888303 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-metadata" Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888313 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888319 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd" Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888329 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861e40f8-c596-40a1-b192-2fa51f567b55" containerName="memcached" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888334 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="861e40f8-c596-40a1-b192-2fa51f567b55" containerName="memcached" Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888365 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d96211ff-f7ba-4e26-ae39-43c8062e2277" containerName="mariadb-account-delete" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888371 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d96211ff-f7ba-4e26-ae39-43c8062e2277" containerName="mariadb-account-delete" Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888378 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fac46a-545d-4f30-a7ab-8f5e713e934d" containerName="setup-container" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888383 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fac46a-545d-4f30-a7ab-8f5e713e934d" containerName="setup-container" Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888389 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888395 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server" Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888407 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11c59cd3-7ee4-43f3-83ce-9d22824473d7" containerName="nova-cell1-conductor-conductor" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888413 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="11c59cd3-7ee4-43f3-83ce-9d22824473d7" containerName="nova-cell1-conductor-conductor" Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888457 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" containerName="nova-api-api" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888466 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" containerName="nova-api-api" 
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888474 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server-init"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888481 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server-init"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888492 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="ceilometer-notification-agent"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888497 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="ceilometer-notification-agent"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888532 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" containerName="placement-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888544 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" containerName="placement-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888552 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d63130-217d-400e-afc5-6b6bb3d56658" containerName="galera"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888560 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d63130-217d-400e-afc5-6b6bb3d56658" containerName="galera"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888570 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61760fb-827b-4199-bfdb-52698c7b4824" containerName="glance-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888628 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61760fb-827b-4199-bfdb-52698c7b4824" containerName="glance-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888638 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" containerName="barbican-worker"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888643 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" containerName="barbican-worker"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888655 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-reaper"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.888660 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-reaper"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.888672 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.893724 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.893794 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-auditor"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.893803 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-auditor"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.893829 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" containerName="nova-api-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.893835 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" containerName="nova-api-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.893857 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="ceilometer-central-agent"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.893864 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="ceilometer-central-agent"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.893873 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" containerName="proxy-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.893879 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" containerName="proxy-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.893893 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerName="barbican-api-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.893898 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerName="barbican-api-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.893911 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7be7b8b-96eb-40fb-98b2-bc33e2154343" containerName="nova-cell1-novncproxy-novncproxy"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.893916 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7be7b8b-96eb-40fb-98b2-bc33e2154343" containerName="nova-cell1-novncproxy-novncproxy"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.893932 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerName="cinder-api-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.893939 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerName="cinder-api-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.893949 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145d3340-8ded-4082-b9c8-7b1a21390097" containerName="cinder-scheduler"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.893956 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="145d3340-8ded-4082-b9c8-7b1a21390097" containerName="cinder-scheduler"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894530 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-replicator"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894540 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-replicator"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894551 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-auditor"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894561 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-auditor"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894635 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="500679c5-1691-4831-b5ec-3c6cce19c503" containerName="nova-cell0-conductor-conductor"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894643 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="500679c5-1691-4831-b5ec-3c6cce19c503" containerName="nova-cell0-conductor-conductor"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894650 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab079ae-b574-40f3-8df0-7deff1356e09" containerName="dnsmasq-dns"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894657 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab079ae-b574-40f3-8df0-7deff1356e09" containerName="dnsmasq-dns"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894670 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894679 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894696 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" containerName="placement-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894703 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" containerName="placement-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894714 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-auditor"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894721 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-auditor"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894734 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerName="cinder-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894741 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerName="cinder-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894749 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a18c86e-9d86-49ee-918f-76de17000e18" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894756 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a18c86e-9d86-49ee-918f-76de17000e18" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894767 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe5d70f-5277-4803-ae45-de61d0eefe27" containerName="barbican-keystone-listener-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894774 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe5d70f-5277-4803-ae45-de61d0eefe27" containerName="barbican-keystone-listener-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894784 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145d3340-8ded-4082-b9c8-7b1a21390097" containerName="probe"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894792 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="145d3340-8ded-4082-b9c8-7b1a21390097" containerName="probe"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894805 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" containerName="glance-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894811 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" containerName="glance-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894823 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5db10427-8546-4dea-b849-36bb02c837bd" containerName="ovn-controller"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894831 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db10427-8546-4dea-b849-36bb02c837bd" containerName="ovn-controller"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894843 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" containerName="glance-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894850 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" containerName="glance-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894860 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29734ea4-591c-478e-8030-55fcbac72d3a" containerName="mysql-bootstrap"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894868 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="29734ea4-591c-478e-8030-55fcbac72d3a" containerName="mysql-bootstrap"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894876 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-updater"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894885 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-updater"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894897 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b21104-eefe-4583-9af8-731d561b78c2" containerName="keystone-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894904 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b21104-eefe-4583-9af8-731d561b78c2" containerName="keystone-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894918 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894925 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894936 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894944 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894954 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fac46a-545d-4f30-a7ab-8f5e713e934d" containerName="rabbitmq"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894962 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fac46a-545d-4f30-a7ab-8f5e713e934d" containerName="rabbitmq"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894971 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="rsync"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894978 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="rsync"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.894987 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c61760fb-827b-4199-bfdb-52698c7b4824" containerName="glance-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.894994 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c61760fb-827b-4199-bfdb-52698c7b4824" containerName="glance-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895006 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7936e330-2138-4624-b319-902f6a4941ec" containerName="mariadb-account-delete"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895013 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7936e330-2138-4624-b319-902f6a4941ec" containerName="mariadb-account-delete"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895022 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="proxy-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895029 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="proxy-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895038 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fe5d70f-5277-4803-ae45-de61d0eefe27" containerName="barbican-keystone-listener"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895045 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fe5d70f-5277-4803-ae45-de61d0eefe27" containerName="barbican-keystone-listener"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895059 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a18c86e-9d86-49ee-918f-76de17000e18" containerName="ovsdbserver-sb"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895065 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a18c86e-9d86-49ee-918f-76de17000e18" containerName="ovsdbserver-sb"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895076 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" containerName="barbican-worker-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895083 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" containerName="barbican-worker-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895096 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="sg-core"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895103 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="sg-core"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895113 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895121 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895132 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-replicator"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895140 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-replicator"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895154 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="870c700d-9095-4781-ab16-4cce25d24ed2" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895161 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="870c700d-9095-4781-ab16-4cce25d24ed2" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895170 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" containerName="neutron-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895177 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" containerName="neutron-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895191 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" containerName="proxy-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895199 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" containerName="proxy-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895214 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" containerName="setup-container"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895222 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" containerName="setup-container"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895236 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerName="barbican-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895243 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerName="barbican-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895254 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d63130-217d-400e-afc5-6b6bb3d56658" containerName="mysql-bootstrap"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895261 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d63130-217d-400e-afc5-6b6bb3d56658" containerName="mysql-bootstrap"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895271 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="ovn-northd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895278 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="ovn-northd"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895290 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab079ae-b574-40f3-8df0-7deff1356e09" containerName="init"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895298 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab079ae-b574-40f3-8df0-7deff1356e09" containerName="init"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895308 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3a93a60-b315-4de2-96d7-d23c9cedbc9c" containerName="kube-state-metrics"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895315 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3a93a60-b315-4de2-96d7-d23c9cedbc9c" containerName="kube-state-metrics"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895330 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-updater"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895338 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-updater"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895348 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="swift-recon-cron"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895355 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="swift-recon-cron"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895367 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29734ea4-591c-478e-8030-55fcbac72d3a" containerName="galera"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895375 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="29734ea4-591c-478e-8030-55fcbac72d3a" containerName="galera"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895385 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" containerName="neutron-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895391 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" containerName="neutron-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895402 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-replicator"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895409 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-replicator"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895419 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da8955a2-6deb-440c-97e3-f2420aa5fae8" containerName="nova-scheduler-scheduler"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895426 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8955a2-6deb-440c-97e3-f2420aa5fae8" containerName="nova-scheduler-scheduler"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895437 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" containerName="rabbitmq"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895443 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" containerName="rabbitmq"
Nov 22 08:46:28 crc kubenswrapper[4743]: E1122 08:46:28.895453 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.895460 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896740 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff6b12a5-cc5e-4ff8-b4eb-ab76fb4d36c1" containerName="rabbitmq"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896758 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" containerName="glance-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896771 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerName="ovsdbserver-nb"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896784 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="da8955a2-6deb-440c-97e3-f2420aa5fae8" containerName="nova-scheduler-scheduler"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896795 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-auditor"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896807 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d96211ff-f7ba-4e26-ae39-43c8062e2277" containerName="mariadb-account-delete"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896819 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" containerName="barbican-worker-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896829 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896836 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="145d3340-8ded-4082-b9c8-7b1a21390097" containerName="probe"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896848 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61760fb-827b-4199-bfdb-52698c7b4824" containerName="glance-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896859 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="swift-recon-cron"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896872 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="861e40f8-c596-40a1-b192-2fa51f567b55" containerName="memcached"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896883 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-updater"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896893 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" containerName="proxy-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896902 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca6d95c-89d6-4b49-bf28-2606b9b5c05e" containerName="glance-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896912 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerName="barbican-api-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896923 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c61760fb-827b-4199-bfdb-52698c7b4824" containerName="glance-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896939 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-replicator"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896953 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d63130-217d-400e-afc5-6b6bb3d56658" containerName="galera"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896962 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-expirer"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896975 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovsdb-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.896988 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5db10427-8546-4dea-b849-36bb02c837bd" containerName="ovn-controller"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898070 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-auditor"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898088 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe5d70f-5277-4803-ae45-de61d0eefe27" containerName="barbican-keystone-listener-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898099 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerName="cinder-api-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898112 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="ceilometer-central-agent"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898127 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-auditor"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898140 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-replicator"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898152 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898163 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" containerName="nova-api-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898175 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898186 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f765fcd-87e8-4d2d-a82d-6ba04aa8a00d" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898197 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" containerName="neutron-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898205 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="container-updater"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898215 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="11c59cd3-7ee4-43f3-83ce-9d22824473d7" containerName="nova-cell1-conductor-conductor"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898226 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fac46a-545d-4f30-a7ab-8f5e713e934d" containerName="rabbitmq"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898234 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="ceilometer-notification-agent"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898242 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-reaper"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898250 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="object-replicator"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898257 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="178ccbe4-360f-4a0d-b97c-edf5b8b8dcba" containerName="proxy-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898268 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab079ae-b574-40f3-8df0-7deff1356e09" containerName="dnsmasq-dns"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898279 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3a93a60-b315-4de2-96d7-d23c9cedbc9c" containerName="kube-state-metrics"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898288 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcd06b6d-e1b1-44dd-b2a6-8d2a8cca4d48" containerName="neutron-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898301 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="account-server"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898314 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="145d3340-8ded-4082-b9c8-7b1a21390097" containerName="cinder-scheduler"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898368 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a18c86e-9d86-49ee-918f-76de17000e18" containerName="ovsdbserver-sb"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898378 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="29734ea4-591c-478e-8030-55fcbac72d3a" containerName="galera"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898387 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898399 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7936e330-2138-4624-b319-902f6a4941ec" containerName="mariadb-account-delete"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898566 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9817865-d957-42d3-8edb-6800e1075d23" containerName="ovn-northd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898593 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" containerName="placement-log"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898604 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="db905ec2-675e-48ea-a051-ed3d78c35797" containerName="nova-api-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898705 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="proxy-httpd"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898726 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b24dd85-d686-4fb0-be74-7aca0b03255c" containerName="nova-metadata-metadata"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898738 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fe5d70f-5277-4803-ae45-de61d0eefe27" containerName="barbican-keystone-listener"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898749 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b21104-eefe-4583-9af8-731d561b78c2" containerName="keystone-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898790 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="500679c5-1691-4831-b5ec-3c6cce19c503" containerName="nova-cell0-conductor-conductor"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898801 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7be7b8b-96eb-40fb-98b2-bc33e2154343" containerName="nova-cell1-novncproxy-novncproxy"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898811 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="58b7a46d-98c7-4e9e-94df-80d359fd68c7" containerName="sg-core"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898819 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc034ce8-656e-4c88-92f1-18f384ae1a18" containerName="barbican-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898831 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="870c700d-9095-4781-ab16-4cce25d24ed2" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898854 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="89d8e638-b97a-4273-9391-5e0c7dd1bfb1" containerName="barbican-worker"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898865 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a18c86e-9d86-49ee-918f-76de17000e18" containerName="openstack-network-exporter"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898873 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1638fe70-d5cb-4edc-9513-e5ae475c0909" containerName="rsync"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898881 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="abdfc89d-bd20-4fae-b6f6-ee4d1729e8ca" containerName="placement-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898890 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="29bf9036-d8fc-43f7-9153-f133a723c6df" containerName="cinder-api"
Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.898902 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="03685c6a-5ae9-45cf-b66d-5210d4811bda" containerName="ovs-vswitchd"
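The alternating E1122 cpu_manager.go:410 / I1122 state_mem.go:107 lines and the I1122 memory_manager.go:354 run above are the CPU and memory managers discarding per-container allocation state for pods that no longer exist, triggered by the first admission after the mass deletion (the SyncLoop ADD for community-operators-d78kg); the removals themselves succeed. Grouping the entries by podUID shows which pods were swept. A small sketch under the same flattened-journal and file-name assumptions as above:

#!/usr/bin/env python3
# stale_state.py -- group RemoveStaleState entries by pod UID.
import re
from collections import defaultdict

PAT = re.compile(
    r'(?:cpu_manager|memory_manager)\.go:\d+\] "RemoveStaleState[^"]*"'
    r' podUID="([0-9a-f-]{36})" containerName="([^"]+)"'
)

pods = defaultdict(set)
with open("kubelet-journal.log", encoding="utf-8") as fh:
    for line in fh:
        for uid, name in PAT.findall(line):
            pods[uid].add(name)

for uid, names in sorted(pods.items()):
    print(f"{uid}: {len(names)} containers ({', '.join(sorted(names))})")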
Need to start a new one" pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.910107 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d78kg"] Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.930177 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance4a60-account-delete-j4cg4" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.996314 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-766xd\" (UniqueName: \"kubernetes.io/projected/eb0d8b6b-810b-4643-89f6-c00a308a72fa-kube-api-access-766xd\") pod \"community-operators-d78kg\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.996402 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-catalog-content\") pod \"community-operators-d78kg\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:28 crc kubenswrapper[4743]: I1122 08:46:28.996435 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-utilities\") pod \"community-operators-d78kg\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.016023 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron8dc4-account-delete-rtl4b" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.040454 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi1d8b-account-delete-scjlt" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.086374 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell0ad52-account-delete-9r4lw" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.097680 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9zdc\" (UniqueName: \"kubernetes.io/projected/f48bbac5-2782-4c1e-b74b-520f0457f9ac-kube-api-access-s9zdc\") pod \"f48bbac5-2782-4c1e-b74b-520f0457f9ac\" (UID: \"f48bbac5-2782-4c1e-b74b-520f0457f9ac\") " Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.097848 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48bbac5-2782-4c1e-b74b-520f0457f9ac-operator-scripts\") pod \"f48bbac5-2782-4c1e-b74b-520f0457f9ac\" (UID: \"f48bbac5-2782-4c1e-b74b-520f0457f9ac\") " Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.098108 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-766xd\" (UniqueName: \"kubernetes.io/projected/eb0d8b6b-810b-4643-89f6-c00a308a72fa-kube-api-access-766xd\") pod \"community-operators-d78kg\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.098156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-catalog-content\") pod \"community-operators-d78kg\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.098177 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-utilities\") pod \"community-operators-d78kg\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.098762 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-utilities\") pod \"community-operators-d78kg\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.098895 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f48bbac5-2782-4c1e-b74b-520f0457f9ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f48bbac5-2782-4c1e-b74b-520f0457f9ac" (UID: "f48bbac5-2782-4c1e-b74b-520f0457f9ac"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.101081 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-catalog-content\") pod \"community-operators-d78kg\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.110739 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f48bbac5-2782-4c1e-b74b-520f0457f9ac-kube-api-access-s9zdc" (OuterVolumeSpecName: "kube-api-access-s9zdc") pod "f48bbac5-2782-4c1e-b74b-520f0457f9ac" (UID: "f48bbac5-2782-4c1e-b74b-520f0457f9ac"). InnerVolumeSpecName "kube-api-access-s9zdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.120347 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-766xd\" (UniqueName: \"kubernetes.io/projected/eb0d8b6b-810b-4643-89f6-c00a308a72fa-kube-api-access-766xd\") pod \"community-operators-d78kg\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.198806 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdrrv\" (UniqueName: \"kubernetes.io/projected/f6d1b00d-147b-4865-b659-59d06f360797-kube-api-access-qdrrv\") pod \"f6d1b00d-147b-4865-b659-59d06f360797\" (UID: \"f6d1b00d-147b-4865-b659-59d06f360797\") " Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.198864 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d1b00d-147b-4865-b659-59d06f360797-operator-scripts\") pod \"f6d1b00d-147b-4865-b659-59d06f360797\" (UID: \"f6d1b00d-147b-4865-b659-59d06f360797\") " Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.198918 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk88b\" (UniqueName: \"kubernetes.io/projected/30ee548a-8838-4d52-867b-4dfdb6c4f641-kube-api-access-dk88b\") pod \"30ee548a-8838-4d52-867b-4dfdb6c4f641\" (UID: \"30ee548a-8838-4d52-867b-4dfdb6c4f641\") " Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.199007 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fca29fd-c34f-4954-960f-b5ca0812d5b0-operator-scripts\") pod \"5fca29fd-c34f-4954-960f-b5ca0812d5b0\" (UID: \"5fca29fd-c34f-4954-960f-b5ca0812d5b0\") " Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.199033 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmgxv\" (UniqueName: \"kubernetes.io/projected/5fca29fd-c34f-4954-960f-b5ca0812d5b0-kube-api-access-gmgxv\") pod \"5fca29fd-c34f-4954-960f-b5ca0812d5b0\" (UID: \"5fca29fd-c34f-4954-960f-b5ca0812d5b0\") " Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.199082 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30ee548a-8838-4d52-867b-4dfdb6c4f641-operator-scripts\") pod \"30ee548a-8838-4d52-867b-4dfdb6c4f641\" (UID: \"30ee548a-8838-4d52-867b-4dfdb6c4f641\") " Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.199448 4743 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48bbac5-2782-4c1e-b74b-520f0457f9ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.199474 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9zdc\" (UniqueName: \"kubernetes.io/projected/f48bbac5-2782-4c1e-b74b-520f0457f9ac-kube-api-access-s9zdc\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.199472 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6d1b00d-147b-4865-b659-59d06f360797-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6d1b00d-147b-4865-b659-59d06f360797" (UID: "f6d1b00d-147b-4865-b659-59d06f360797"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.199625 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fca29fd-c34f-4954-960f-b5ca0812d5b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5fca29fd-c34f-4954-960f-b5ca0812d5b0" (UID: "5fca29fd-c34f-4954-960f-b5ca0812d5b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.199850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30ee548a-8838-4d52-867b-4dfdb6c4f641-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30ee548a-8838-4d52-867b-4dfdb6c4f641" (UID: "30ee548a-8838-4d52-867b-4dfdb6c4f641"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.201998 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fca29fd-c34f-4954-960f-b5ca0812d5b0-kube-api-access-gmgxv" (OuterVolumeSpecName: "kube-api-access-gmgxv") pod "5fca29fd-c34f-4954-960f-b5ca0812d5b0" (UID: "5fca29fd-c34f-4954-960f-b5ca0812d5b0"). InnerVolumeSpecName "kube-api-access-gmgxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.202079 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6d1b00d-147b-4865-b659-59d06f360797-kube-api-access-qdrrv" (OuterVolumeSpecName: "kube-api-access-qdrrv") pod "f6d1b00d-147b-4865-b659-59d06f360797" (UID: "f6d1b00d-147b-4865-b659-59d06f360797"). InnerVolumeSpecName "kube-api-access-qdrrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.202072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30ee548a-8838-4d52-867b-4dfdb6c4f641-kube-api-access-dk88b" (OuterVolumeSpecName: "kube-api-access-dk88b") pod "30ee548a-8838-4d52-867b-4dfdb6c4f641" (UID: "30ee548a-8838-4d52-867b-4dfdb6c4f641"). InnerVolumeSpecName "kube-api-access-dk88b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.248268 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.300334 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdrrv\" (UniqueName: \"kubernetes.io/projected/f6d1b00d-147b-4865-b659-59d06f360797-kube-api-access-qdrrv\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.300367 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6d1b00d-147b-4865-b659-59d06f360797-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.300376 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk88b\" (UniqueName: \"kubernetes.io/projected/30ee548a-8838-4d52-867b-4dfdb6c4f641-kube-api-access-dk88b\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.300385 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5fca29fd-c34f-4954-960f-b5ca0812d5b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.300394 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmgxv\" (UniqueName: \"kubernetes.io/projected/5fca29fd-c34f-4954-960f-b5ca0812d5b0-kube-api-access-gmgxv\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.300404 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30ee548a-8838-4d52-867b-4dfdb6c4f641-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.321164 4743 generic.go:334] "Generic (PLEG): container finished" podID="30ee548a-8838-4d52-867b-4dfdb6c4f641" containerID="d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818" exitCode=137 Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.321253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi1d8b-account-delete-scjlt" event={"ID":"30ee548a-8838-4d52-867b-4dfdb6c4f641","Type":"ContainerDied","Data":"d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818"} Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.321282 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi1d8b-account-delete-scjlt" event={"ID":"30ee548a-8838-4d52-867b-4dfdb6c4f641","Type":"ContainerDied","Data":"55d8bb28b2a06d9eeab5de7f880cb8bbdefa5f0995dd6971a51f9f0b5040e274"} Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.321302 4743 scope.go:117] "RemoveContainer" containerID="d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.321450 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi1d8b-account-delete-scjlt" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.338292 4743 generic.go:334] "Generic (PLEG): container finished" podID="5fca29fd-c34f-4954-960f-b5ca0812d5b0" containerID="8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5" exitCode=137 Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.338376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0ad52-account-delete-9r4lw" event={"ID":"5fca29fd-c34f-4954-960f-b5ca0812d5b0","Type":"ContainerDied","Data":"8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5"} Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.338410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell0ad52-account-delete-9r4lw" event={"ID":"5fca29fd-c34f-4954-960f-b5ca0812d5b0","Type":"ContainerDied","Data":"6d6eeaaa805b28bfc333c74829288c5cc15dca9e40890f87a5ab4f98de096a46"} Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.338477 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell0ad52-account-delete-9r4lw" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.352066 4743 generic.go:334] "Generic (PLEG): container finished" podID="f48bbac5-2782-4c1e-b74b-520f0457f9ac" containerID="76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d" exitCode=137 Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.352140 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance4a60-account-delete-j4cg4" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.352160 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4a60-account-delete-j4cg4" event={"ID":"f48bbac5-2782-4c1e-b74b-520f0457f9ac","Type":"ContainerDied","Data":"76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d"} Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.357848 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance4a60-account-delete-j4cg4" event={"ID":"f48bbac5-2782-4c1e-b74b-520f0457f9ac","Type":"ContainerDied","Data":"c88d4928e4941bbba968e503dcd1361302276773fdecc26ed008af20ac9e5165"} Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.372702 4743 generic.go:334] "Generic (PLEG): container finished" podID="f6d1b00d-147b-4865-b659-59d06f360797" containerID="bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9" exitCode=137 Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.372848 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron8dc4-account-delete-rtl4b" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.373644 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi1d8b-account-delete-scjlt"] Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.373686 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron8dc4-account-delete-rtl4b" event={"ID":"f6d1b00d-147b-4865-b659-59d06f360797","Type":"ContainerDied","Data":"bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9"} Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.373710 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron8dc4-account-delete-rtl4b" event={"ID":"f6d1b00d-147b-4865-b659-59d06f360797","Type":"ContainerDied","Data":"e0b17a045985a2ee9326dd2c57115606fefec5c4be6421e1f11446a84efc107c"} Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.378931 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi1d8b-account-delete-scjlt"] Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.395905 4743 generic.go:334] "Generic (PLEG): container finished" podID="5310c975-ef7b-4161-ab2e-5ee94b709f9d" containerID="9d5ed428f2c04f357aa903343aea3a22402d7ece9dcdf137788ad34286fe9ec5" exitCode=137 Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.395943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican675b-account-delete-bdj7v" event={"ID":"5310c975-ef7b-4161-ab2e-5ee94b709f9d","Type":"ContainerDied","Data":"9d5ed428f2c04f357aa903343aea3a22402d7ece9dcdf137788ad34286fe9ec5"} Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.400310 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance4a60-account-delete-j4cg4"] Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.405360 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance4a60-account-delete-j4cg4"] Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.411158 4743 scope.go:117] "RemoveContainer" containerID="d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818" Nov 22 08:46:29 crc kubenswrapper[4743]: E1122 08:46:29.421728 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818\": container with ID starting with d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818 not found: ID does not exist" containerID="d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.421776 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818"} err="failed to get container status \"d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818\": rpc error: code = NotFound desc = could not find container \"d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818\": container with ID starting with d12996ee4d4bb65bbd1487713d988967856007d225eb0c1f67d3dbc1f8dd6818 not found: ID does not exist" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.421804 4743 scope.go:117] "RemoveContainer" containerID="8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.433706 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell0ad52-account-delete-9r4lw"] Nov 22 08:46:29 
crc kubenswrapper[4743]: I1122 08:46:29.449652 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell0ad52-account-delete-9r4lw"] Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.462388 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron8dc4-account-delete-rtl4b"] Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.473344 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron8dc4-account-delete-rtl4b"] Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.512216 4743 scope.go:117] "RemoveContainer" containerID="8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5" Nov 22 08:46:29 crc kubenswrapper[4743]: E1122 08:46:29.522387 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5\": container with ID starting with 8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5 not found: ID does not exist" containerID="8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.522445 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5"} err="failed to get container status \"8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5\": rpc error: code = NotFound desc = could not find container \"8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5\": container with ID starting with 8bc08446c74495671ee38a6aef364f767a2af22130edb024b88a5b3e944f5fd5 not found: ID does not exist" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.522475 4743 scope.go:117] "RemoveContainer" containerID="76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.551416 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican675b-account-delete-bdj7v" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.578644 4743 scope.go:117] "RemoveContainer" containerID="76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d" Nov 22 08:46:29 crc kubenswrapper[4743]: E1122 08:46:29.579005 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d\": container with ID starting with 76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d not found: ID does not exist" containerID="76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.579027 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d"} err="failed to get container status \"76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d\": rpc error: code = NotFound desc = could not find container \"76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d\": container with ID starting with 76f7ccc7a22bd009be02364adf34ad42a61311c6524af764414368553a404f8d not found: ID does not exist" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.579046 4743 scope.go:117] "RemoveContainer" containerID="bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.628911 4743 scope.go:117] "RemoveContainer" containerID="bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9" Nov 22 08:46:29 crc kubenswrapper[4743]: E1122 08:46:29.629383 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9\": container with ID starting with bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9 not found: ID does not exist" containerID="bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.629424 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9"} err="failed to get container status \"bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9\": rpc error: code = NotFound desc = could not find container \"bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9\": container with ID starting with bd6ba22094377577994d39bcf02837d20ed01b849d358fc9f5b444bdc298bdc9 not found: ID does not exist" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.707551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5310c975-ef7b-4161-ab2e-5ee94b709f9d-operator-scripts\") pod \"5310c975-ef7b-4161-ab2e-5ee94b709f9d\" (UID: \"5310c975-ef7b-4161-ab2e-5ee94b709f9d\") " Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.708295 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5310c975-ef7b-4161-ab2e-5ee94b709f9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5310c975-ef7b-4161-ab2e-5ee94b709f9d" (UID: "5310c975-ef7b-4161-ab2e-5ee94b709f9d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.708469 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlsc6\" (UniqueName: \"kubernetes.io/projected/5310c975-ef7b-4161-ab2e-5ee94b709f9d-kube-api-access-dlsc6\") pod \"5310c975-ef7b-4161-ab2e-5ee94b709f9d\" (UID: \"5310c975-ef7b-4161-ab2e-5ee94b709f9d\") " Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.708751 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5310c975-ef7b-4161-ab2e-5ee94b709f9d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.713793 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5310c975-ef7b-4161-ab2e-5ee94b709f9d-kube-api-access-dlsc6" (OuterVolumeSpecName: "kube-api-access-dlsc6") pod "5310c975-ef7b-4161-ab2e-5ee94b709f9d" (UID: "5310c975-ef7b-4161-ab2e-5ee94b709f9d"). InnerVolumeSpecName "kube-api-access-dlsc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.809760 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlsc6\" (UniqueName: \"kubernetes.io/projected/5310c975-ef7b-4161-ab2e-5ee94b709f9d-kube-api-access-dlsc6\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:29 crc kubenswrapper[4743]: I1122 08:46:29.847241 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d78kg"] Nov 22 08:46:29 crc kubenswrapper[4743]: W1122 08:46:29.862713 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb0d8b6b_810b_4643_89f6_c00a308a72fa.slice/crio-eeb155d44f85a7f03d90ffb143e9217e9d734b2a7fad5573423727469470b644 WatchSource:0}: Error finding container eeb155d44f85a7f03d90ffb143e9217e9d734b2a7fad5573423727469470b644: Status 404 returned error can't find the container with id eeb155d44f85a7f03d90ffb143e9217e9d734b2a7fad5573423727469470b644 Nov 22 08:46:30 crc kubenswrapper[4743]: I1122 08:46:30.404875 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican675b-account-delete-bdj7v"
Nov 22 08:46:30 crc kubenswrapper[4743]: I1122 08:46:30.404941 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican675b-account-delete-bdj7v" event={"ID":"5310c975-ef7b-4161-ab2e-5ee94b709f9d","Type":"ContainerDied","Data":"94ab297637887ec62b68254745a2399c55743ad3c0e45375143ff38f5eae56ed"}
Nov 22 08:46:30 crc kubenswrapper[4743]: I1122 08:46:30.405000 4743 scope.go:117] "RemoveContainer" containerID="9d5ed428f2c04f357aa903343aea3a22402d7ece9dcdf137788ad34286fe9ec5"
Nov 22 08:46:30 crc kubenswrapper[4743]: I1122 08:46:30.413414 4743 generic.go:334] "Generic (PLEG): container finished" podID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerID="74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37" exitCode=0
Nov 22 08:46:30 crc kubenswrapper[4743]: I1122 08:46:30.413462 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d78kg" event={"ID":"eb0d8b6b-810b-4643-89f6-c00a308a72fa","Type":"ContainerDied","Data":"74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37"}
Nov 22 08:46:30 crc kubenswrapper[4743]: I1122 08:46:30.413487 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d78kg" event={"ID":"eb0d8b6b-810b-4643-89f6-c00a308a72fa","Type":"ContainerStarted","Data":"eeb155d44f85a7f03d90ffb143e9217e9d734b2a7fad5573423727469470b644"}
Nov 22 08:46:30 crc kubenswrapper[4743]: E1122 08:46:30.418640 4743 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Nov 22 08:46:30 crc kubenswrapper[4743]: E1122 08:46:30.418692 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts podName:9375da2b-3776-4c32-8afd-d1ed7b22b308 nodeName:}" failed. No retries permitted until 2025-11-22 08:47:02.418679624 +0000 UTC m=+1496.125040676 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts") pod "placement0984-account-delete-82zvj" (UID: "9375da2b-3776-4c32-8afd-d1ed7b22b308") : configmap "openstack-scripts" not found
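The two E1122 entries directly above are one failure chain: the configmap volume plugin asked the API server for openstack/openstack-scripts, got NotFound, and the volume manager parked the MountVolume.SetUp operation in exponential backoff (32s here) instead of retrying hot. A minimal client-go sketch of the same lookup, assuming in-cluster credentials; the kubelet itself goes through its cached ConfigMap manager rather than a bare GET like this:

package main

import (
    "context"
    "fmt"

    apierrors "k8s.io/apimachinery/pkg/api/errors"
    metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    "k8s.io/client-go/kubernetes"
    "k8s.io/client-go/rest"
)

func main() {
    cfg, err := rest.InClusterConfig() // assumes this runs inside the cluster
    if err != nil {
        panic(err)
    }
    cs := kubernetes.NewForConfigOrDie(cfg)

    // Namespace and name taken from the log entry above.
    _, err = cs.CoreV1().ConfigMaps("openstack").Get(context.TODO(), "openstack-scripts", metav1.GetOptions{})
    if apierrors.IsNotFound(err) {
        // The condition the kubelet logs as: configmap "openstack-scripts" not found
        fmt.Println("configmap missing; the mount will keep failing until it is recreated")
    } else if err != nil {
        panic(err)
    }
}

The failure is expected here: the openstack-scripts ConfigMap was deleted while placement0984-account-delete-82zvj still referenced it, so the retry scheduled for 08:47:02 can only succeed if the object reappears.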
Nov 22 08:46:30 crc kubenswrapper[4743]: I1122 08:46:30.426640 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 08:46:30 crc kubenswrapper[4743]: I1122 08:46:30.450303 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican675b-account-delete-bdj7v"]
Nov 22 08:46:30 crc kubenswrapper[4743]: I1122 08:46:30.456050 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican675b-account-delete-bdj7v"]
Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.163391 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30ee548a-8838-4d52-867b-4dfdb6c4f641" path="/var/lib/kubelet/pods/30ee548a-8838-4d52-867b-4dfdb6c4f641/volumes"
Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.163912 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5310c975-ef7b-4161-ab2e-5ee94b709f9d" path="/var/lib/kubelet/pods/5310c975-ef7b-4161-ab2e-5ee94b709f9d/volumes"
Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.164813 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fca29fd-c34f-4954-960f-b5ca0812d5b0" path="/var/lib/kubelet/pods/5fca29fd-c34f-4954-960f-b5ca0812d5b0/volumes"
Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.165330 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f48bbac5-2782-4c1e-b74b-520f0457f9ac" path="/var/lib/kubelet/pods/f48bbac5-2782-4c1e-b74b-520f0457f9ac/volumes"
Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.167798 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6d1b00d-147b-4865-b659-59d06f360797" path="/var/lib/kubelet/pods/f6d1b00d-147b-4865-b659-59d06f360797/volumes"
Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.425525 4743 generic.go:334] "Generic (PLEG): container finished" podID="9375da2b-3776-4c32-8afd-d1ed7b22b308" containerID="6ef5c4847a495226bbd00d72b1c06fd2666d37d7018b465aac2d955130276623" exitCode=137
Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.425598 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement0984-account-delete-82zvj" event={"ID":"9375da2b-3776-4c32-8afd-d1ed7b22b308","Type":"ContainerDied","Data":"6ef5c4847a495226bbd00d72b1c06fd2666d37d7018b465aac2d955130276623"}
Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.757540 4743 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/placement0984-account-delete-82zvj" Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.853745 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts\") pod \"9375da2b-3776-4c32-8afd-d1ed7b22b308\" (UID: \"9375da2b-3776-4c32-8afd-d1ed7b22b308\") " Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.853828 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8tq9\" (UniqueName: \"kubernetes.io/projected/9375da2b-3776-4c32-8afd-d1ed7b22b308-kube-api-access-g8tq9\") pod \"9375da2b-3776-4c32-8afd-d1ed7b22b308\" (UID: \"9375da2b-3776-4c32-8afd-d1ed7b22b308\") " Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.854521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9375da2b-3776-4c32-8afd-d1ed7b22b308" (UID: "9375da2b-3776-4c32-8afd-d1ed7b22b308"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.859201 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9375da2b-3776-4c32-8afd-d1ed7b22b308-kube-api-access-g8tq9" (OuterVolumeSpecName: "kube-api-access-g8tq9") pod "9375da2b-3776-4c32-8afd-d1ed7b22b308" (UID: "9375da2b-3776-4c32-8afd-d1ed7b22b308"). InnerVolumeSpecName "kube-api-access-g8tq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.955637 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9375da2b-3776-4c32-8afd-d1ed7b22b308-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:31 crc kubenswrapper[4743]: I1122 08:46:31.955677 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8tq9\" (UniqueName: \"kubernetes.io/projected/9375da2b-3776-4c32-8afd-d1ed7b22b308-kube-api-access-g8tq9\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:32 crc kubenswrapper[4743]: I1122 08:46:32.445216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement0984-account-delete-82zvj" event={"ID":"9375da2b-3776-4c32-8afd-d1ed7b22b308","Type":"ContainerDied","Data":"5831c8101ffb619513c175100e0c23fc1f2a816be37dfd715e1f552d2a6971be"} Nov 22 08:46:32 crc kubenswrapper[4743]: I1122 08:46:32.445252 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement0984-account-delete-82zvj"
Nov 22 08:46:32 crc kubenswrapper[4743]: I1122 08:46:32.445314 4743 scope.go:117] "RemoveContainer" containerID="6ef5c4847a495226bbd00d72b1c06fd2666d37d7018b465aac2d955130276623"
Nov 22 08:46:32 crc kubenswrapper[4743]: I1122 08:46:32.480432 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement0984-account-delete-82zvj"]
Nov 22 08:46:32 crc kubenswrapper[4743]: I1122 08:46:32.501973 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement0984-account-delete-82zvj"]
Nov 22 08:46:33 crc kubenswrapper[4743]: I1122 08:46:33.160127 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9375da2b-3776-4c32-8afd-d1ed7b22b308" path="/var/lib/kubelet/pods/9375da2b-3776-4c32-8afd-d1ed7b22b308/volumes"
Nov 22 08:46:33 crc kubenswrapper[4743]: I1122 08:46:33.454446 4743 generic.go:334] "Generic (PLEG): container finished" podID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerID="d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052" exitCode=0
Nov 22 08:46:33 crc kubenswrapper[4743]: I1122 08:46:33.454491 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d78kg" event={"ID":"eb0d8b6b-810b-4643-89f6-c00a308a72fa","Type":"ContainerDied","Data":"d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052"}
Nov 22 08:46:35 crc kubenswrapper[4743]: I1122 08:46:35.478767 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d78kg" event={"ID":"eb0d8b6b-810b-4643-89f6-c00a308a72fa","Type":"ContainerStarted","Data":"0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736"}
Nov 22 08:46:35 crc kubenswrapper[4743]: I1122 08:46:35.496936 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d78kg" podStartSLOduration=3.395485951 podStartE2EDuration="7.496918784s" podCreationTimestamp="2025-11-22 08:46:28 +0000 UTC" firstStartedPulling="2025-11-22 08:46:30.42584294 +0000 UTC m=+1464.132203992" lastFinishedPulling="2025-11-22 08:46:34.527275773 +0000 UTC m=+1468.233636825" observedRunningTime="2025-11-22 08:46:35.495942556 +0000 UTC m=+1469.202303628" watchObservedRunningTime="2025-11-22 08:46:35.496918784 +0000 UTC m=+1469.203279836"
Nov 22 08:46:39 crc kubenswrapper[4743]: I1122 08:46:39.248724 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d78kg"
Nov 22 08:46:39 crc kubenswrapper[4743]: I1122 08:46:39.250673 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d78kg"
Nov 22 08:46:39 crc kubenswrapper[4743]: I1122 08:46:39.292199 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d78kg"
Nov 22 08:46:39 crc kubenswrapper[4743]: I1122 08:46:39.556847 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d78kg"
Nov 22 08:46:39 crc kubenswrapper[4743]: I1122 08:46:39.597207 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d78kg"]
Nov 22 08:46:41 crc kubenswrapper[4743]: I1122 08:46:41.539643 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d78kg" podUID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerName="registry-server" containerID="cri-o://0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736" gracePeriod=2
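gracePeriod=2 in the entry above is the window the kubelet gives registry-server between SIGTERM and SIGKILL. The same grace period can be requested on the API call that triggered this SyncLoop DELETE; a hedged sketch reusing cs, context, and metav1 from the earlier ConfigMap sketch (the 2s here may equally come from the pod spec's terminationGracePeriodSeconds rather than the client):

// Delete the catalog pod, allowing 2 seconds before escalation to SIGKILL.
grace := int64(2)
if err := cs.CoreV1().Pods("openshift-marketplace").Delete(
    context.TODO(),
    "community-operators-d78kg",
    metav1.DeleteOptions{GracePeriodSeconds: &grace},
); err != nil {
    panic(err)
}

A container that outlives its grace period is killed, which is why the account-delete containers earlier in the log exit with code 137 (128 + 9, i.e. SIGKILL).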
containerName="registry-server" containerID="cri-o://0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736" gracePeriod=2 Nov 22 08:46:41 crc kubenswrapper[4743]: I1122 08:46:41.902478 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d78kg" Nov 22 08:46:41 crc kubenswrapper[4743]: I1122 08:46:41.919207 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-catalog-content\") pod \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " Nov 22 08:46:41 crc kubenswrapper[4743]: I1122 08:46:41.919314 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-utilities\") pod \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " Nov 22 08:46:41 crc kubenswrapper[4743]: I1122 08:46:41.919366 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-766xd\" (UniqueName: \"kubernetes.io/projected/eb0d8b6b-810b-4643-89f6-c00a308a72fa-kube-api-access-766xd\") pod \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\" (UID: \"eb0d8b6b-810b-4643-89f6-c00a308a72fa\") " Nov 22 08:46:41 crc kubenswrapper[4743]: I1122 08:46:41.920507 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-utilities" (OuterVolumeSpecName: "utilities") pod "eb0d8b6b-810b-4643-89f6-c00a308a72fa" (UID: "eb0d8b6b-810b-4643-89f6-c00a308a72fa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:46:41 crc kubenswrapper[4743]: I1122 08:46:41.925926 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb0d8b6b-810b-4643-89f6-c00a308a72fa-kube-api-access-766xd" (OuterVolumeSpecName: "kube-api-access-766xd") pod "eb0d8b6b-810b-4643-89f6-c00a308a72fa" (UID: "eb0d8b6b-810b-4643-89f6-c00a308a72fa"). InnerVolumeSpecName "kube-api-access-766xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 08:46:41 crc kubenswrapper[4743]: I1122 08:46:41.992176 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb0d8b6b-810b-4643-89f6-c00a308a72fa" (UID: "eb0d8b6b-810b-4643-89f6-c00a308a72fa"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.021262 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.021339 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-766xd\" (UniqueName: \"kubernetes.io/projected/eb0d8b6b-810b-4643-89f6-c00a308a72fa-kube-api-access-766xd\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.021365 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb0d8b6b-810b-4643-89f6-c00a308a72fa-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.550041 4743 generic.go:334] "Generic (PLEG): container finished" podID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerID="0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736" exitCode=0 Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.550095 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d78kg" event={"ID":"eb0d8b6b-810b-4643-89f6-c00a308a72fa","Type":"ContainerDied","Data":"0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736"} Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.550413 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d78kg" event={"ID":"eb0d8b6b-810b-4643-89f6-c00a308a72fa","Type":"ContainerDied","Data":"eeb155d44f85a7f03d90ffb143e9217e9d734b2a7fad5573423727469470b644"} Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.550442 4743 scope.go:117] "RemoveContainer" containerID="0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736" Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.550152 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d78kg"
Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.587952 4743 scope.go:117] "RemoveContainer" containerID="d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052"
Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.590461 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d78kg"]
Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.595092 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d78kg"]
Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.609949 4743 scope.go:117] "RemoveContainer" containerID="74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37"
Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.635144 4743 scope.go:117] "RemoveContainer" containerID="0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736"
Nov 22 08:46:42 crc kubenswrapper[4743]: E1122 08:46:42.635644 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736\": container with ID starting with 0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736 not found: ID does not exist" containerID="0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736"
Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.635687 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736"} err="failed to get container status \"0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736\": rpc error: code = NotFound desc = could not find container \"0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736\": container with ID starting with 0115ce437528f91e37b05d55580d8438567b911d81c8708a4edff87bed6c8736 not found: ID does not exist"
Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.635713 4743 scope.go:117] "RemoveContainer" containerID="d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052"
Nov 22 08:46:42 crc kubenswrapper[4743]: E1122 08:46:42.636258 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052\": container with ID starting with d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052 not found: ID does not exist" containerID="d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052"
Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.636311 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052"} err="failed to get container status \"d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052\": rpc error: code = NotFound desc = could not find container \"d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052\": container with ID starting with d81af5f7504fa51407e21e47f901cbd9a606e549bff23130ac573d91bdd09052 not found: ID does not exist"
Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.636346 4743 scope.go:117] "RemoveContainer" containerID="74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37"
Nov 22 08:46:42 crc kubenswrapper[4743]: E1122 08:46:42.636669 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37\": container with ID starting with 74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37 not found: ID does not exist" containerID="74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37"
Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.636711 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37"} err="failed to get container status \"74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37\": rpc error: code = NotFound desc = could not find container \"74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37\": container with ID starting with 74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37 not found: ID does not exist"
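Each RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triple above is benign: the container had already been removed, and CRI-O answers NotFound, which the kubelet logs and then ignores. On the client side of a gRPC CRI call that case is distinguishable from a real failure by its status code; a small self-contained sketch (grpcErr is a stand-in for a runtime response, not taken from the log):

package main

import (
    "fmt"

    "google.golang.org/grpc/codes"
    "google.golang.org/grpc/status"
)

// alreadyGone reports whether a CRI RPC failed only because the
// container no longer exists, the NotFound case logged above.
func alreadyGone(err error) bool {
    return status.Code(err) == codes.NotFound
}

func main() {
    grpcErr := status.Error(codes.NotFound, "could not find container") // stand-in
    fmt.Println(alreadyGone(grpcErr))                                   // true: treat the delete as already done
}

Treating NotFound as success keeps container deletion idempotent, which is exactly the behavior visible here: the error is logged at E-level and the sync loop moves on.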
failed" err="rpc error: code = NotFound desc = could not find container \"74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37\": container with ID starting with 74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37 not found: ID does not exist" containerID="74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37" Nov 22 08:46:42 crc kubenswrapper[4743]: I1122 08:46:42.636711 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37"} err="failed to get container status \"74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37\": rpc error: code = NotFound desc = could not find container \"74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37\": container with ID starting with 74dd37b78e12a9654681c334d9cc8e527e06dac846a0ea6f8bd97e25655c8b37 not found: ID does not exist" Nov 22 08:46:43 crc kubenswrapper[4743]: I1122 08:46:43.162862 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" path="/var/lib/kubelet/pods/eb0d8b6b-810b-4643-89f6-c00a308a72fa/volumes" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.729749 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mwb4v"] Nov 22 08:46:48 crc kubenswrapper[4743]: E1122 08:46:48.730548 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5310c975-ef7b-4161-ab2e-5ee94b709f9d" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730563 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5310c975-ef7b-4161-ab2e-5ee94b709f9d" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: E1122 08:46:48.730624 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerName="registry-server" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730635 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerName="registry-server" Nov 22 08:46:48 crc kubenswrapper[4743]: E1122 08:46:48.730642 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerName="extract-utilities" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730651 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerName="extract-utilities" Nov 22 08:46:48 crc kubenswrapper[4743]: E1122 08:46:48.730660 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fca29fd-c34f-4954-960f-b5ca0812d5b0" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730666 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fca29fd-c34f-4954-960f-b5ca0812d5b0" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: E1122 08:46:48.730680 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6d1b00d-147b-4865-b659-59d06f360797" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730686 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6d1b00d-147b-4865-b659-59d06f360797" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: E1122 08:46:48.730699 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerName="extract-content" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730704 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerName="extract-content" Nov 22 08:46:48 crc kubenswrapper[4743]: E1122 08:46:48.730716 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9375da2b-3776-4c32-8afd-d1ed7b22b308" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730722 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9375da2b-3776-4c32-8afd-d1ed7b22b308" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: E1122 08:46:48.730731 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30ee548a-8838-4d52-867b-4dfdb6c4f641" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730737 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="30ee548a-8838-4d52-867b-4dfdb6c4f641" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: E1122 08:46:48.730744 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f48bbac5-2782-4c1e-b74b-520f0457f9ac" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730751 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f48bbac5-2782-4c1e-b74b-520f0457f9ac" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730875 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fca29fd-c34f-4954-960f-b5ca0812d5b0" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730893 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9375da2b-3776-4c32-8afd-d1ed7b22b308" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730904 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="30ee548a-8838-4d52-867b-4dfdb6c4f641" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730911 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6d1b00d-147b-4865-b659-59d06f360797" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730916 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0d8b6b-810b-4643-89f6-c00a308a72fa" containerName="registry-server" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730925 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f48bbac5-2782-4c1e-b74b-520f0457f9ac" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.730934 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5310c975-ef7b-4161-ab2e-5ee94b709f9d" containerName="mariadb-account-delete" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.731980 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.747123 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwb4v"] Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.813898 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-catalog-content\") pod \"certified-operators-mwb4v\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") " pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.814038 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7cbz\" (UniqueName: \"kubernetes.io/projected/781c48f4-faf6-4a95-b3fb-ab4f5427e525-kube-api-access-h7cbz\") pod \"certified-operators-mwb4v\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") " pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.814138 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-utilities\") pod \"certified-operators-mwb4v\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") " pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.915407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-utilities\") pod \"certified-operators-mwb4v\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") " pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.915948 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-utilities\") pod \"certified-operators-mwb4v\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") " pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.916103 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-catalog-content\") pod \"certified-operators-mwb4v\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") " pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.916183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7cbz\" (UniqueName: \"kubernetes.io/projected/781c48f4-faf6-4a95-b3fb-ab4f5427e525-kube-api-access-h7cbz\") pod \"certified-operators-mwb4v\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") " pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.916503 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-catalog-content\") pod \"certified-operators-mwb4v\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") " pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:48 crc kubenswrapper[4743]: I1122 08:46:48.934203 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h7cbz\" (UniqueName: \"kubernetes.io/projected/781c48f4-faf6-4a95-b3fb-ab4f5427e525-kube-api-access-h7cbz\") pod \"certified-operators-mwb4v\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") " pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:49 crc kubenswrapper[4743]: I1122 08:46:49.090328 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:49 crc kubenswrapper[4743]: I1122 08:46:49.617482 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mwb4v"] Nov 22 08:46:49 crc kubenswrapper[4743]: W1122 08:46:49.630812 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod781c48f4_faf6_4a95_b3fb_ab4f5427e525.slice/crio-9bc724cec8eb845e7c6e9d5698d285123004f300980a1129cf3a525e02950db9 WatchSource:0}: Error finding container 9bc724cec8eb845e7c6e9d5698d285123004f300980a1129cf3a525e02950db9: Status 404 returned error can't find the container with id 9bc724cec8eb845e7c6e9d5698d285123004f300980a1129cf3a525e02950db9 Nov 22 08:46:50 crc kubenswrapper[4743]: I1122 08:46:50.619216 4743 generic.go:334] "Generic (PLEG): container finished" podID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerID="4d6d62125b1d1ac81ae0bc02075269ca14bb6e629f377f5e5aa59f7ccb4cf6ea" exitCode=0 Nov 22 08:46:50 crc kubenswrapper[4743]: I1122 08:46:50.619328 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwb4v" event={"ID":"781c48f4-faf6-4a95-b3fb-ab4f5427e525","Type":"ContainerDied","Data":"4d6d62125b1d1ac81ae0bc02075269ca14bb6e629f377f5e5aa59f7ccb4cf6ea"} Nov 22 08:46:50 crc kubenswrapper[4743]: I1122 08:46:50.619657 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwb4v" event={"ID":"781c48f4-faf6-4a95-b3fb-ab4f5427e525","Type":"ContainerStarted","Data":"9bc724cec8eb845e7c6e9d5698d285123004f300980a1129cf3a525e02950db9"} Nov 22 08:46:51 crc kubenswrapper[4743]: I1122 08:46:51.631292 4743 generic.go:334] "Generic (PLEG): container finished" podID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerID="20b4ca9a25214d8c73e1b721a363be51e5626cc576729620fb0180afefb865f5" exitCode=0 Nov 22 08:46:51 crc kubenswrapper[4743]: I1122 08:46:51.631394 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwb4v" event={"ID":"781c48f4-faf6-4a95-b3fb-ab4f5427e525","Type":"ContainerDied","Data":"20b4ca9a25214d8c73e1b721a363be51e5626cc576729620fb0180afefb865f5"} Nov 22 08:46:52 crc kubenswrapper[4743]: I1122 08:46:52.644563 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwb4v" event={"ID":"781c48f4-faf6-4a95-b3fb-ab4f5427e525","Type":"ContainerStarted","Data":"d55d48e3106a99420762b9939960ffbcb32dc863ae9e41e9f444e3420b0e6176"} Nov 22 08:46:52 crc kubenswrapper[4743]: I1122 08:46:52.668634 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mwb4v" podStartSLOduration=3.259075621 podStartE2EDuration="4.668603414s" podCreationTimestamp="2025-11-22 08:46:48 +0000 UTC" firstStartedPulling="2025-11-22 08:46:50.622470479 +0000 UTC m=+1484.328831531" lastFinishedPulling="2025-11-22 08:46:52.031998272 +0000 UTC m=+1485.738359324" observedRunningTime="2025-11-22 08:46:52.660671515 +0000 UTC 
m=+1486.367032577" watchObservedRunningTime="2025-11-22 08:46:52.668603414 +0000 UTC m=+1486.374964476" Nov 22 08:46:59 crc kubenswrapper[4743]: I1122 08:46:59.091995 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:59 crc kubenswrapper[4743]: I1122 08:46:59.093565 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:59 crc kubenswrapper[4743]: I1122 08:46:59.147723 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:46:59 crc kubenswrapper[4743]: I1122 08:46:59.743253 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:47:00 crc kubenswrapper[4743]: I1122 08:47:00.091128 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwb4v"] Nov 22 08:47:01 crc kubenswrapper[4743]: I1122 08:47:01.712315 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mwb4v" podUID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerName="registry-server" containerID="cri-o://d55d48e3106a99420762b9939960ffbcb32dc863ae9e41e9f444e3420b0e6176" gracePeriod=2 Nov 22 08:47:03 crc kubenswrapper[4743]: I1122 08:47:03.737232 4743 generic.go:334] "Generic (PLEG): container finished" podID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerID="d55d48e3106a99420762b9939960ffbcb32dc863ae9e41e9f444e3420b0e6176" exitCode=0 Nov 22 08:47:03 crc kubenswrapper[4743]: I1122 08:47:03.737297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwb4v" event={"ID":"781c48f4-faf6-4a95-b3fb-ab4f5427e525","Type":"ContainerDied","Data":"d55d48e3106a99420762b9939960ffbcb32dc863ae9e41e9f444e3420b0e6176"} Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.374012 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mwb4v"
Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.465829 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-catalog-content\") pod \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") "
Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.465919 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-utilities\") pod \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") "
Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.465968 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7cbz\" (UniqueName: \"kubernetes.io/projected/781c48f4-faf6-4a95-b3fb-ab4f5427e525-kube-api-access-h7cbz\") pod \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\" (UID: \"781c48f4-faf6-4a95-b3fb-ab4f5427e525\") "
Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.467270 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-utilities" (OuterVolumeSpecName: "utilities") pod "781c48f4-faf6-4a95-b3fb-ab4f5427e525" (UID: "781c48f4-faf6-4a95-b3fb-ab4f5427e525"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.472597 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/781c48f4-faf6-4a95-b3fb-ab4f5427e525-kube-api-access-h7cbz" (OuterVolumeSpecName: "kube-api-access-h7cbz") pod "781c48f4-faf6-4a95-b3fb-ab4f5427e525" (UID: "781c48f4-faf6-4a95-b3fb-ab4f5427e525"). InnerVolumeSpecName "kube-api-access-h7cbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.567909 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.567946 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7cbz\" (UniqueName: \"kubernetes.io/projected/781c48f4-faf6-4a95-b3fb-ab4f5427e525-kube-api-access-h7cbz\") on node \"crc\" DevicePath \"\""
Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.754283 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mwb4v" event={"ID":"781c48f4-faf6-4a95-b3fb-ab4f5427e525","Type":"ContainerDied","Data":"9bc724cec8eb845e7c6e9d5698d285123004f300980a1129cf3a525e02950db9"}
Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.754323 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mwb4v"
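The teardown above is cheap because utilities, catalog-content, and the service account token are pod-scoped volumes (emptyDir and projected): they live under /var/lib/kubelet/pods/<uid>/volumes and have no device to detach, hence DevicePath "". A sketch of how the two emptyDir volumes are declared, with names taken from the log; the real pod spec is generated by the marketplace operator, so this is illustrative only:

package main

import corev1 "k8s.io/api/core/v1"

// Pod-lifetime scratch volumes; removed with the pod, as in the
// "Cleaned up orphaned pod volumes dir" entry below.
var catalogVolumes = []corev1.Volume{
    {Name: "utilities", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    {Name: "catalog-content", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
}

func main() { _ = catalogVolumes }

Note the ordering: catalog-content's TearDown only completes at 08:47:06, well after the other two, presumably because the extracted catalog data takes longer to remove.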
Need to start a new one" pod="openshift-marketplace/certified-operators-mwb4v" Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.754337 4743 scope.go:117] "RemoveContainer" containerID="d55d48e3106a99420762b9939960ffbcb32dc863ae9e41e9f444e3420b0e6176" Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.774886 4743 scope.go:117] "RemoveContainer" containerID="20b4ca9a25214d8c73e1b721a363be51e5626cc576729620fb0180afefb865f5" Nov 22 08:47:05 crc kubenswrapper[4743]: I1122 08:47:05.793551 4743 scope.go:117] "RemoveContainer" containerID="4d6d62125b1d1ac81ae0bc02075269ca14bb6e629f377f5e5aa59f7ccb4cf6ea" Nov 22 08:47:06 crc kubenswrapper[4743]: I1122 08:47:06.186761 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "781c48f4-faf6-4a95-b3fb-ab4f5427e525" (UID: "781c48f4-faf6-4a95-b3fb-ab4f5427e525"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 08:47:06 crc kubenswrapper[4743]: I1122 08:47:06.279325 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/781c48f4-faf6-4a95-b3fb-ab4f5427e525-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 08:47:06 crc kubenswrapper[4743]: I1122 08:47:06.385047 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mwb4v"] Nov 22 08:47:06 crc kubenswrapper[4743]: I1122 08:47:06.391290 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mwb4v"] Nov 22 08:47:07 crc kubenswrapper[4743]: I1122 08:47:07.161296 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" path="/var/lib/kubelet/pods/781c48f4-faf6-4a95-b3fb-ab4f5427e525/volumes" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.277938 4743 scope.go:117] "RemoveContainer" containerID="a9167b92a9dde989a74ae857841a3d9207fe61f9c8d20e0a6521ebbba621df1f" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.304527 4743 scope.go:117] "RemoveContainer" containerID="6d69d74bde61491735c233b8f8505c03c211704fff1bdd9e4f6579ecc321af8e" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.341161 4743 scope.go:117] "RemoveContainer" containerID="65b882681867eb01a06888b0955cb4a26c937522146e082819e8320ea62fe3d5" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.373717 4743 scope.go:117] "RemoveContainer" containerID="a9937bd5c1410e351ab5ec4ba137b45596917cb622464a85d33d573baae61398" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.402925 4743 scope.go:117] "RemoveContainer" containerID="1492488c837aa8f04f7299b1d413786a94dd865f945e841bffe9f473e3ac4bf5" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.424756 4743 scope.go:117] "RemoveContainer" containerID="59bb33e985a8fb36c339adbe9885bf0826891ad476e2ede03cf3c29fdf19037b" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.445185 4743 scope.go:117] "RemoveContainer" containerID="1fbbbdfbb859c9da828f408196fc0aa8d1393484e6831a83bb031fda26468ddc" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.478547 4743 scope.go:117] "RemoveContainer" containerID="f2886216dda914ee00b1675fd8db61d4ea910f9773dfec2868ea4686645bbea2" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.495846 4743 scope.go:117] "RemoveContainer" containerID="c2cd99ce8d0b171935d2ad05dc8e366d1a81b0cd4e4bf014f079a93f8d17c5ad" Nov 22 08:47:28 
crc kubenswrapper[4743]: I1122 08:47:28.518333 4743 scope.go:117] "RemoveContainer" containerID="570cc25ff76bbe41a98fb2a034cfe25b67f71d87b0c7e1df88da843ee55eeb93" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.546263 4743 scope.go:117] "RemoveContainer" containerID="9d3af77ffcdc657327ec7c448b5a0a750fd3a193887ca244cc6a7b3498d8993a" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.568021 4743 scope.go:117] "RemoveContainer" containerID="5215c7383d75012abf3d0d94618fad8a23559b994de0167f56986ac7a14b929d" Nov 22 08:47:28 crc kubenswrapper[4743]: I1122 08:47:28.588445 4743 scope.go:117] "RemoveContainer" containerID="21ecc6b3d20c284ef43b926c5c463d821ee2edacb2a3220709a25fbf01483ae9" Nov 22 08:48:01 crc kubenswrapper[4743]: I1122 08:48:01.241416 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:48:01 crc kubenswrapper[4743]: I1122 08:48:01.242038 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:48:28 crc kubenswrapper[4743]: I1122 08:48:28.856693 4743 scope.go:117] "RemoveContainer" containerID="402d9421e3dbd2b5e063a6bf98415c17b6ebaa5cf0bc4fe18813a071b338fe93" Nov 22 08:48:28 crc kubenswrapper[4743]: I1122 08:48:28.878088 4743 scope.go:117] "RemoveContainer" containerID="522839b5fdebb84c1fea1d0e014e5145ae5940ceb284fa8c95f021daffa542f0" Nov 22 08:48:28 crc kubenswrapper[4743]: I1122 08:48:28.909942 4743 scope.go:117] "RemoveContainer" containerID="849fd71ccf74f0eb1a2e281f40b05417d8904c997955819096f6193efd1ab257" Nov 22 08:48:28 crc kubenswrapper[4743]: I1122 08:48:28.936345 4743 scope.go:117] "RemoveContainer" containerID="9610dc8f6778c28876a91471c6c7ed36b0828b18a27d7acf62fa3c2dcf65b6ca" Nov 22 08:48:28 crc kubenswrapper[4743]: I1122 08:48:28.980984 4743 scope.go:117] "RemoveContainer" containerID="02c08be1a06f176f363240d80ae5a5c4fa7036c63f2fb50020e992f965215d10" Nov 22 08:48:29 crc kubenswrapper[4743]: I1122 08:48:29.003108 4743 scope.go:117] "RemoveContainer" containerID="5b088da74bbd6737792507a6ad04b9744e4130e90691d8f27ec2c054e8c7fc8d" Nov 22 08:48:29 crc kubenswrapper[4743]: I1122 08:48:29.037272 4743 scope.go:117] "RemoveContainer" containerID="7c16c6e79f2fba2bf718be7b393d3ab67b2c838e9819d990799299a5fe9b826a" Nov 22 08:48:29 crc kubenswrapper[4743]: I1122 08:48:29.076466 4743 scope.go:117] "RemoveContainer" containerID="d65252f2cbf9207547c52ed982045e996c05e604dc4138895706f868b51d4c6c" Nov 22 08:48:29 crc kubenswrapper[4743]: I1122 08:48:29.097370 4743 scope.go:117] "RemoveContainer" containerID="9634df27808c5b6900e16e550218eae167007f2f968cbec651ad9d5729aaf28a" Nov 22 08:48:29 crc kubenswrapper[4743]: I1122 08:48:29.119477 4743 scope.go:117] "RemoveContainer" containerID="d86df506e2e6495d0d0573fb95792122de57a18bc48b48390f2fddae46e7a46f" Nov 22 08:48:29 crc kubenswrapper[4743]: I1122 08:48:29.161540 4743 scope.go:117] "RemoveContainer" containerID="bfd31ce4d61a607de637c6db8af6011fdd9c45b14e183e2552977504481ffd47" Nov 22 08:48:29 crc kubenswrapper[4743]: I1122 08:48:29.181767 4743 scope.go:117] "RemoveContainer" 
containerID="6ed030084c63587910484888925d6585aae009c6384bcc1c9a46d6aea22044b6" Nov 22 08:48:31 crc kubenswrapper[4743]: I1122 08:48:31.241015 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:48:31 crc kubenswrapper[4743]: I1122 08:48:31.241075 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:49:01 crc kubenswrapper[4743]: I1122 08:49:01.242015 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 08:49:01 crc kubenswrapper[4743]: I1122 08:49:01.242671 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 08:49:01 crc kubenswrapper[4743]: I1122 08:49:01.242726 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 08:49:01 crc kubenswrapper[4743]: I1122 08:49:01.243428 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 08:49:01 crc kubenswrapper[4743]: I1122 08:49:01.243479 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" gracePeriod=600 Nov 22 08:49:01 crc kubenswrapper[4743]: E1122 08:49:01.387812 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:49:01 crc kubenswrapper[4743]: I1122 08:49:01.732129 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" exitCode=0 Nov 22 08:49:01 crc kubenswrapper[4743]: I1122 08:49:01.732168 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" 
event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5"} Nov 22 08:49:01 crc kubenswrapper[4743]: I1122 08:49:01.732237 4743 scope.go:117] "RemoveContainer" containerID="986e2145b00a5a447ecb09e84f860b781baabf3cc2562b60d26d99a571cd2cc8" Nov 22 08:49:01 crc kubenswrapper[4743]: I1122 08:49:01.732852 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:49:01 crc kubenswrapper[4743]: E1122 08:49:01.733149 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:49:14 crc kubenswrapper[4743]: I1122 08:49:14.151626 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:49:14 crc kubenswrapper[4743]: E1122 08:49:14.152418 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:49:28 crc kubenswrapper[4743]: I1122 08:49:28.151699 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:49:28 crc kubenswrapper[4743]: E1122 08:49:28.152494 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.381393 4743 scope.go:117] "RemoveContainer" containerID="12838b3e542aa21904acab03d6b27d30ec54f1471909fca6df88ff3e1aee935d" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.403902 4743 scope.go:117] "RemoveContainer" containerID="7711ec056fa213f2eee796483c27379cd7a134fa6030ba1e23b52a3457a46cec" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.423747 4743 scope.go:117] "RemoveContainer" containerID="cbc06588fe9760520abd9ff10252f044f013bc9dd64e11f663750ea38a18547b" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.440020 4743 scope.go:117] "RemoveContainer" containerID="69a331217c6e9870990cf0477268ec07b586afa72d1bd546c97e364e672bdc27" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.463917 4743 scope.go:117] "RemoveContainer" containerID="1353a3a23996d88dd42b52f243bc37775202d4377fb8072678af08d9a69e8b34" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.482221 4743 scope.go:117] "RemoveContainer" containerID="b628f200e7a2bedf0d8e8e4b5953a55be8ad9ac041c328380e5595675b2b42f4" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.497822 4743 scope.go:117] "RemoveContainer" 
containerID="c35c08056d876c19f4f4c85186c089afae3fdfa5b6c85d74b609121293b9cccf" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.528043 4743 scope.go:117] "RemoveContainer" containerID="ff8003a3594d25ec03aad9438f8a8b6e3c4495c012f444863e724569495817e4" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.546794 4743 scope.go:117] "RemoveContainer" containerID="596f95b1d0cc9abb230b4c2a4a8c4b0c1af12cc6eed82f9960b7ca6e13289379" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.566752 4743 scope.go:117] "RemoveContainer" containerID="ed4a51a510556630bce53eb084d8e72e4a5c86ebf6ea25c0ba40d17086b1eead" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.586417 4743 scope.go:117] "RemoveContainer" containerID="8cd97805271c2689ffeedd2861c6862a07880da94fdba3980f8515ddcf5802bb" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.618673 4743 scope.go:117] "RemoveContainer" containerID="98e216804cd00368cfd172ea133b9f2e2806dd9d0733496c7ada731c9979c7c3" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.664771 4743 scope.go:117] "RemoveContainer" containerID="c4858ef9316fe50db2369ef4f71a3b1345ef98bcbaf371bd82d3cde2ff7b09b8" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.691125 4743 scope.go:117] "RemoveContainer" containerID="4c285ff99f16c40f8e9fb6688027dc480aaea87e547c44567abd12812584f0a8" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.715868 4743 scope.go:117] "RemoveContainer" containerID="4d28bbe29fc448e35abc64e8a734311fe6a648e6bd6b421e1355130eba81ef7e" Nov 22 08:49:29 crc kubenswrapper[4743]: I1122 08:49:29.739824 4743 scope.go:117] "RemoveContainer" containerID="b87c8deb0c6f1f3c1134e38ff7289f1edfe2b60e90ae6b47a46057bcb212868c" Nov 22 08:49:40 crc kubenswrapper[4743]: I1122 08:49:40.151774 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:49:40 crc kubenswrapper[4743]: E1122 08:49:40.152522 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:49:54 crc kubenswrapper[4743]: I1122 08:49:54.151855 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:49:54 crc kubenswrapper[4743]: E1122 08:49:54.152557 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:50:07 crc kubenswrapper[4743]: I1122 08:50:07.155933 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:50:07 crc kubenswrapper[4743]: E1122 08:50:07.156553 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:50:20 crc kubenswrapper[4743]: I1122 08:50:20.152423 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:50:20 crc kubenswrapper[4743]: E1122 08:50:20.153293 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:50:29 crc kubenswrapper[4743]: I1122 08:50:29.930897 4743 scope.go:117] "RemoveContainer" containerID="1e3e8fa38a54921afda25d5c5650246d86727b8836380c7c504af6b09ba4a0eb" Nov 22 08:50:29 crc kubenswrapper[4743]: I1122 08:50:29.963599 4743 scope.go:117] "RemoveContainer" containerID="83164b2e658bb9ac77208bcdab8d7ea5bcddd9ddb221ef2bb7c6d22ed509bf07" Nov 22 08:50:30 crc kubenswrapper[4743]: I1122 08:50:30.001254 4743 scope.go:117] "RemoveContainer" containerID="7305244bd79cd85c2c92eab84566fc7d97bcd7fde2ff9a55e6572d0e121cf472" Nov 22 08:50:30 crc kubenswrapper[4743]: I1122 08:50:30.039325 4743 scope.go:117] "RemoveContainer" containerID="e06fe6ab41a32e8d8623a2af9e10dd42dc4c4ab2b15a0a739bcf375a4c618b9c" Nov 22 08:50:34 crc kubenswrapper[4743]: I1122 08:50:34.151663 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:50:34 crc kubenswrapper[4743]: E1122 08:50:34.152490 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:50:45 crc kubenswrapper[4743]: I1122 08:50:45.151985 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:50:45 crc kubenswrapper[4743]: E1122 08:50:45.152865 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:50:59 crc kubenswrapper[4743]: I1122 08:50:59.152172 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:50:59 crc kubenswrapper[4743]: E1122 08:50:59.152987 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:51:12 crc kubenswrapper[4743]: I1122 08:51:12.152077 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:51:12 crc kubenswrapper[4743]: E1122 08:51:12.152799 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:51:24 crc kubenswrapper[4743]: I1122 08:51:24.150975 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:51:24 crc kubenswrapper[4743]: E1122 08:51:24.151821 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.135677 4743 scope.go:117] "RemoveContainer" containerID="d5f358c51f3837120bf2786591f156a51d70cdabcf793d05895bd486bf90bd29" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.155740 4743 scope.go:117] "RemoveContainer" containerID="f6a47a6042cdcc435e43f807a46da87bc4a2b5807e85c840896d368eabd6740d" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.184346 4743 scope.go:117] "RemoveContainer" containerID="31240114f37ac66a6ac0ee75966656d89b98f5b65714a820dfeae421393b0b13" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.208915 4743 scope.go:117] "RemoveContainer" containerID="37fbdedccc446a670cae08eb9632f568f7a7eccdeada9e4e6146ebf59a36b2e3" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.229862 4743 scope.go:117] "RemoveContainer" containerID="4b24ccee2f20c0c9bff9ed0577c5b13a5e4c322c8c14f5cae7487c6ed9272a36" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.244762 4743 scope.go:117] "RemoveContainer" containerID="a87505d83945ff1d2017945a039358590dde4a277ab690f7938e248ef7eb6722" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.262386 4743 scope.go:117] "RemoveContainer" containerID="3a8d852d3f689474ae890d019ad247d38067084ae22ca4aca5bee80eb525bd90" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.278324 4743 scope.go:117] "RemoveContainer" containerID="d286017c65a937c25c9d60c902e5c3e3d08c52047202b26fc6832bb9d0b90d7b" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.293903 4743 scope.go:117] "RemoveContainer" containerID="0fc045e12216ec39ccb7f9f0e77c526341966c30dccbf1e54a1b50e33279ef52" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.309650 4743 scope.go:117] "RemoveContainer" containerID="bd056c88a18e8d56a5bd78a0aae1c581bd242d465d94e8b629291f01de7a8525" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.345336 4743 scope.go:117] "RemoveContainer" containerID="f437f9d019295d0f78778e5e7033bf6f20cb2b01e6b07ff80c52b784e7ef5faa" Nov 22 08:51:30 crc kubenswrapper[4743]: I1122 08:51:30.364867 4743 scope.go:117] "RemoveContainer" 
containerID="4f25a424241601e91504248ab884e7a0f9860edb39f6eab7afdb79fa3b729315" Nov 22 08:51:38 crc kubenswrapper[4743]: I1122 08:51:38.151118 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:51:38 crc kubenswrapper[4743]: E1122 08:51:38.151896 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:51:53 crc kubenswrapper[4743]: I1122 08:51:53.151677 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:51:53 crc kubenswrapper[4743]: E1122 08:51:53.152542 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:52:04 crc kubenswrapper[4743]: I1122 08:52:04.153312 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:52:04 crc kubenswrapper[4743]: E1122 08:52:04.153949 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:52:19 crc kubenswrapper[4743]: I1122 08:52:19.152205 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:52:19 crc kubenswrapper[4743]: E1122 08:52:19.152900 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:52:30 crc kubenswrapper[4743]: I1122 08:52:30.485857 4743 scope.go:117] "RemoveContainer" containerID="e3311847d040f49ee2f688d4b62c7bd813884fd1d9ee40438d2d48f3bbbd5240" Nov 22 08:52:32 crc kubenswrapper[4743]: I1122 08:52:32.151857 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:52:32 crc kubenswrapper[4743]: E1122 08:52:32.152394 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:52:44 crc kubenswrapper[4743]: I1122 08:52:44.151725 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:52:44 crc kubenswrapper[4743]: E1122 08:52:44.152570 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:52:58 crc kubenswrapper[4743]: I1122 08:52:58.152593 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:52:58 crc kubenswrapper[4743]: E1122 08:52:58.153353 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:53:10 crc kubenswrapper[4743]: I1122 08:53:10.151032 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:53:10 crc kubenswrapper[4743]: E1122 08:53:10.151716 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:53:25 crc kubenswrapper[4743]: I1122 08:53:25.151837 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:53:25 crc kubenswrapper[4743]: E1122 08:53:25.152644 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:53:40 crc kubenswrapper[4743]: I1122 08:53:40.151901 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:53:40 crc kubenswrapper[4743]: E1122 08:53:40.152613 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:53:54 crc kubenswrapper[4743]: I1122 08:53:54.152043 4743 
scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:53:54 crc kubenswrapper[4743]: E1122 08:53:54.152764 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 08:54:07 crc kubenswrapper[4743]: I1122 08:54:07.156513 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5" Nov 22 08:54:08 crc kubenswrapper[4743]: I1122 08:54:08.088076 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"8942efd4903f2b0d2e78730494ebbc46c37bc276bc4d1b8bfab93e628b8157ff"} Nov 22 08:55:25 crc kubenswrapper[4743]: I1122 08:55:25.870741 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6r4l2"] Nov 22 08:55:25 crc kubenswrapper[4743]: E1122 08:55:25.871876 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerName="extract-utilities" Nov 22 08:55:25 crc kubenswrapper[4743]: I1122 08:55:25.871891 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerName="extract-utilities" Nov 22 08:55:25 crc kubenswrapper[4743]: E1122 08:55:25.871908 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerName="registry-server" Nov 22 08:55:25 crc kubenswrapper[4743]: I1122 08:55:25.871937 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerName="registry-server" Nov 22 08:55:25 crc kubenswrapper[4743]: E1122 08:55:25.871955 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerName="extract-content" Nov 22 08:55:25 crc kubenswrapper[4743]: I1122 08:55:25.871963 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerName="extract-content" Nov 22 08:55:25 crc kubenswrapper[4743]: I1122 08:55:25.872165 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="781c48f4-faf6-4a95-b3fb-ab4f5427e525" containerName="registry-server" Nov 22 08:55:25 crc kubenswrapper[4743]: I1122 08:55:25.873414 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:25 crc kubenswrapper[4743]: I1122 08:55:25.877439 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6r4l2"] Nov 22 08:55:25 crc kubenswrapper[4743]: I1122 08:55:25.914868 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n575\" (UniqueName: \"kubernetes.io/projected/2263583b-69f7-43c4-821d-f174c42c9ab9-kube-api-access-4n575\") pod \"redhat-marketplace-6r4l2\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") " pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:25 crc kubenswrapper[4743]: I1122 08:55:25.914942 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-utilities\") pod \"redhat-marketplace-6r4l2\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") " pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:25 crc kubenswrapper[4743]: I1122 08:55:25.914961 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-catalog-content\") pod \"redhat-marketplace-6r4l2\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") " pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:26 crc kubenswrapper[4743]: I1122 08:55:26.015753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n575\" (UniqueName: \"kubernetes.io/projected/2263583b-69f7-43c4-821d-f174c42c9ab9-kube-api-access-4n575\") pod \"redhat-marketplace-6r4l2\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") " pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:26 crc kubenswrapper[4743]: I1122 08:55:26.015815 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-utilities\") pod \"redhat-marketplace-6r4l2\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") " pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:26 crc kubenswrapper[4743]: I1122 08:55:26.015833 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-catalog-content\") pod \"redhat-marketplace-6r4l2\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") " pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:26 crc kubenswrapper[4743]: I1122 08:55:26.016311 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-catalog-content\") pod \"redhat-marketplace-6r4l2\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") " pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:26 crc kubenswrapper[4743]: I1122 08:55:26.016992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-utilities\") pod \"redhat-marketplace-6r4l2\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") " pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:26 crc kubenswrapper[4743]: I1122 08:55:26.038892 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4n575\" (UniqueName: \"kubernetes.io/projected/2263583b-69f7-43c4-821d-f174c42c9ab9-kube-api-access-4n575\") pod \"redhat-marketplace-6r4l2\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") " pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:26 crc kubenswrapper[4743]: I1122 08:55:26.204487 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:26 crc kubenswrapper[4743]: I1122 08:55:26.646980 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6r4l2"] Nov 22 08:55:26 crc kubenswrapper[4743]: I1122 08:55:26.707840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r4l2" event={"ID":"2263583b-69f7-43c4-821d-f174c42c9ab9","Type":"ContainerStarted","Data":"b43a50b7fe9eec492beb06eaf21f7cb8463cd3af8e3b04e35afd04351e42d4a4"} Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.260777 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dm4k8"] Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.263383 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.272020 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dm4k8"] Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.437099 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7lnb\" (UniqueName: \"kubernetes.io/projected/a34fc030-3ed9-47d7-86ff-bca89184999a-kube-api-access-p7lnb\") pod \"redhat-operators-dm4k8\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") " pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.437181 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-catalog-content\") pod \"redhat-operators-dm4k8\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") " pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.437804 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-utilities\") pod \"redhat-operators-dm4k8\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") " pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.539761 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-utilities\") pod \"redhat-operators-dm4k8\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") " pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.539910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7lnb\" (UniqueName: \"kubernetes.io/projected/a34fc030-3ed9-47d7-86ff-bca89184999a-kube-api-access-p7lnb\") pod \"redhat-operators-dm4k8\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") " pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: 
I1122 08:55:27.539937 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-catalog-content\") pod \"redhat-operators-dm4k8\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") " pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.540433 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-utilities\") pod \"redhat-operators-dm4k8\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") " pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.540488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-catalog-content\") pod \"redhat-operators-dm4k8\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") " pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.561026 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7lnb\" (UniqueName: \"kubernetes.io/projected/a34fc030-3ed9-47d7-86ff-bca89184999a-kube-api-access-p7lnb\") pod \"redhat-operators-dm4k8\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") " pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.582751 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.718128 4743 generic.go:334] "Generic (PLEG): container finished" podID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerID="5056bf435d54a8c5ebd20fe0a92a7a3a5cf55fdc27ad873d2c6538569017134a" exitCode=0 Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.718177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r4l2" event={"ID":"2263583b-69f7-43c4-821d-f174c42c9ab9","Type":"ContainerDied","Data":"5056bf435d54a8c5ebd20fe0a92a7a3a5cf55fdc27ad873d2c6538569017134a"} Nov 22 08:55:27 crc kubenswrapper[4743]: I1122 08:55:27.720087 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 08:55:28 crc kubenswrapper[4743]: I1122 08:55:28.059546 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dm4k8"] Nov 22 08:55:28 crc kubenswrapper[4743]: I1122 08:55:28.728603 4743 generic.go:334] "Generic (PLEG): container finished" podID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerID="aaa3ca9a5876e15ff81e88c31ad0fa28105434905e9a368bf5b8a07b20dd0d85" exitCode=0 Nov 22 08:55:28 crc kubenswrapper[4743]: I1122 08:55:28.728703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dm4k8" event={"ID":"a34fc030-3ed9-47d7-86ff-bca89184999a","Type":"ContainerDied","Data":"aaa3ca9a5876e15ff81e88c31ad0fa28105434905e9a368bf5b8a07b20dd0d85"} Nov 22 08:55:28 crc kubenswrapper[4743]: I1122 08:55:28.729659 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dm4k8" event={"ID":"a34fc030-3ed9-47d7-86ff-bca89184999a","Type":"ContainerStarted","Data":"c036351cd1660f5ebbcd2212f22c175404ed803cc4f77dd70ba93ebc92ce3696"} Nov 22 08:55:29 crc kubenswrapper[4743]: I1122 
08:55:29.744926 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r4l2" event={"ID":"2263583b-69f7-43c4-821d-f174c42c9ab9","Type":"ContainerStarted","Data":"0d2261cef3b38886e050330e8ebd99a9319c65d6d6408aa6f7369f8f102cf5b0"} Nov 22 08:55:30 crc kubenswrapper[4743]: I1122 08:55:30.756014 4743 generic.go:334] "Generic (PLEG): container finished" podID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerID="0d2261cef3b38886e050330e8ebd99a9319c65d6d6408aa6f7369f8f102cf5b0" exitCode=0 Nov 22 08:55:30 crc kubenswrapper[4743]: I1122 08:55:30.756074 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r4l2" event={"ID":"2263583b-69f7-43c4-821d-f174c42c9ab9","Type":"ContainerDied","Data":"0d2261cef3b38886e050330e8ebd99a9319c65d6d6408aa6f7369f8f102cf5b0"} Nov 22 08:55:31 crc kubenswrapper[4743]: I1122 08:55:31.764835 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r4l2" event={"ID":"2263583b-69f7-43c4-821d-f174c42c9ab9","Type":"ContainerStarted","Data":"26c66ff75a7d9dda195439870d76d52a8c90ad19f4969930b081eab3c74cf51a"} Nov 22 08:55:31 crc kubenswrapper[4743]: I1122 08:55:31.767310 4743 generic.go:334] "Generic (PLEG): container finished" podID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerID="945ca2162e2ebc665663c1eb78844799f592504b4c4f050655139a2f2fe0c901" exitCode=0 Nov 22 08:55:31 crc kubenswrapper[4743]: I1122 08:55:31.767341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dm4k8" event={"ID":"a34fc030-3ed9-47d7-86ff-bca89184999a","Type":"ContainerDied","Data":"945ca2162e2ebc665663c1eb78844799f592504b4c4f050655139a2f2fe0c901"} Nov 22 08:55:31 crc kubenswrapper[4743]: I1122 08:55:31.784312 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6r4l2" podStartSLOduration=3.344392239 podStartE2EDuration="6.784290992s" podCreationTimestamp="2025-11-22 08:55:25 +0000 UTC" firstStartedPulling="2025-11-22 08:55:27.719841663 +0000 UTC m=+2001.426202715" lastFinishedPulling="2025-11-22 08:55:31.159740416 +0000 UTC m=+2004.866101468" observedRunningTime="2025-11-22 08:55:31.784241821 +0000 UTC m=+2005.490602873" watchObservedRunningTime="2025-11-22 08:55:31.784290992 +0000 UTC m=+2005.490652044" Nov 22 08:55:32 crc kubenswrapper[4743]: I1122 08:55:32.779376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dm4k8" event={"ID":"a34fc030-3ed9-47d7-86ff-bca89184999a","Type":"ContainerStarted","Data":"3182b110b06da337ee0c6174e2f8a7d5465aed4bfc4e8137b665004b2f9d0f9a"} Nov 22 08:55:32 crc kubenswrapper[4743]: I1122 08:55:32.804817 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dm4k8" podStartSLOduration=2.03154505 podStartE2EDuration="5.804789919s" podCreationTimestamp="2025-11-22 08:55:27 +0000 UTC" firstStartedPulling="2025-11-22 08:55:28.730536276 +0000 UTC m=+2002.436897328" lastFinishedPulling="2025-11-22 08:55:32.503781105 +0000 UTC m=+2006.210142197" observedRunningTime="2025-11-22 08:55:32.798795836 +0000 UTC m=+2006.505156888" watchObservedRunningTime="2025-11-22 08:55:32.804789919 +0000 UTC m=+2006.511151001" Nov 22 08:55:36 crc kubenswrapper[4743]: I1122 08:55:36.206072 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:36 crc 
kubenswrapper[4743]: I1122 08:55:36.206437 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:36 crc kubenswrapper[4743]: I1122 08:55:36.258864 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:36 crc kubenswrapper[4743]: I1122 08:55:36.846643 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6r4l2" Nov 22 08:55:37 crc kubenswrapper[4743]: I1122 08:55:37.583185 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:37 crc kubenswrapper[4743]: I1122 08:55:37.583269 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dm4k8" Nov 22 08:55:38 crc kubenswrapper[4743]: I1122 08:55:38.043152 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6r4l2"] Nov 22 08:55:38 crc kubenswrapper[4743]: I1122 08:55:38.628420 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dm4k8" podUID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerName="registry-server" probeResult="failure" output=< Nov 22 08:55:38 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 08:55:38 crc kubenswrapper[4743]: > Nov 22 08:55:38 crc kubenswrapper[4743]: I1122 08:55:38.817532 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6r4l2" podUID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerName="registry-server" containerID="cri-o://26c66ff75a7d9dda195439870d76d52a8c90ad19f4969930b081eab3c74cf51a" gracePeriod=2 Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.375257 4743 generic.go:334] "Generic (PLEG): container finished" podID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerID="26c66ff75a7d9dda195439870d76d52a8c90ad19f4969930b081eab3c74cf51a" exitCode=0 Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.375313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r4l2" event={"ID":"2263583b-69f7-43c4-821d-f174c42c9ab9","Type":"ContainerDied","Data":"26c66ff75a7d9dda195439870d76d52a8c90ad19f4969930b081eab3c74cf51a"} Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.729720 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6r4l2"
Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.892632 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-utilities\") pod \"2263583b-69f7-43c4-821d-f174c42c9ab9\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") "
Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.892775 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n575\" (UniqueName: \"kubernetes.io/projected/2263583b-69f7-43c4-821d-f174c42c9ab9-kube-api-access-4n575\") pod \"2263583b-69f7-43c4-821d-f174c42c9ab9\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") "
Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.892870 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-catalog-content\") pod \"2263583b-69f7-43c4-821d-f174c42c9ab9\" (UID: \"2263583b-69f7-43c4-821d-f174c42c9ab9\") "
Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.893626 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-utilities" (OuterVolumeSpecName: "utilities") pod "2263583b-69f7-43c4-821d-f174c42c9ab9" (UID: "2263583b-69f7-43c4-821d-f174c42c9ab9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.901216 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2263583b-69f7-43c4-821d-f174c42c9ab9-kube-api-access-4n575" (OuterVolumeSpecName: "kube-api-access-4n575") pod "2263583b-69f7-43c4-821d-f174c42c9ab9" (UID: "2263583b-69f7-43c4-821d-f174c42c9ab9"). InnerVolumeSpecName "kube-api-access-4n575". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.925323 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2263583b-69f7-43c4-821d-f174c42c9ab9" (UID: "2263583b-69f7-43c4-821d-f174c42c9ab9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.995053 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.995089 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2263583b-69f7-43c4-821d-f174c42c9ab9-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 08:55:44 crc kubenswrapper[4743]: I1122 08:55:44.995099 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n575\" (UniqueName: \"kubernetes.io/projected/2263583b-69f7-43c4-821d-f174c42c9ab9-kube-api-access-4n575\") on node \"crc\" DevicePath \"\""
Nov 22 08:55:45 crc kubenswrapper[4743]: I1122 08:55:45.386188 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6r4l2" event={"ID":"2263583b-69f7-43c4-821d-f174c42c9ab9","Type":"ContainerDied","Data":"b43a50b7fe9eec492beb06eaf21f7cb8463cd3af8e3b04e35afd04351e42d4a4"}
Nov 22 08:55:45 crc kubenswrapper[4743]: I1122 08:55:45.386239 4743 scope.go:117] "RemoveContainer" containerID="26c66ff75a7d9dda195439870d76d52a8c90ad19f4969930b081eab3c74cf51a"
Nov 22 08:55:45 crc kubenswrapper[4743]: I1122 08:55:45.386264 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6r4l2"
Nov 22 08:55:45 crc kubenswrapper[4743]: I1122 08:55:45.411214 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6r4l2"]
Nov 22 08:55:45 crc kubenswrapper[4743]: I1122 08:55:45.418224 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6r4l2"]
Nov 22 08:55:45 crc kubenswrapper[4743]: I1122 08:55:45.419158 4743 scope.go:117] "RemoveContainer" containerID="0d2261cef3b38886e050330e8ebd99a9319c65d6d6408aa6f7369f8f102cf5b0"
Nov 22 08:55:45 crc kubenswrapper[4743]: I1122 08:55:45.437276 4743 scope.go:117] "RemoveContainer" containerID="5056bf435d54a8c5ebd20fe0a92a7a3a5cf55fdc27ad873d2c6538569017134a"
Nov 22 08:55:47 crc kubenswrapper[4743]: I1122 08:55:47.165825 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2263583b-69f7-43c4-821d-f174c42c9ab9" path="/var/lib/kubelet/pods/2263583b-69f7-43c4-821d-f174c42c9ab9/volumes"
Nov 22 08:55:47 crc kubenswrapper[4743]: I1122 08:55:47.655813 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dm4k8"
Nov 22 08:55:47 crc kubenswrapper[4743]: I1122 08:55:47.701545 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dm4k8"
Nov 22 08:55:47 crc kubenswrapper[4743]: I1122 08:55:47.964523 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dm4k8"]
Nov 22 08:55:49 crc kubenswrapper[4743]: I1122 08:55:49.422036 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dm4k8" podUID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerName="registry-server" containerID="cri-o://3182b110b06da337ee0c6174e2f8a7d5465aed4bfc4e8137b665004b2f9d0f9a" gracePeriod=2
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.434729 4743 generic.go:334] "Generic (PLEG): container finished" podID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerID="3182b110b06da337ee0c6174e2f8a7d5465aed4bfc4e8137b665004b2f9d0f9a" exitCode=0
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.434780 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dm4k8" event={"ID":"a34fc030-3ed9-47d7-86ff-bca89184999a","Type":"ContainerDied","Data":"3182b110b06da337ee0c6174e2f8a7d5465aed4bfc4e8137b665004b2f9d0f9a"}
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.628672 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dm4k8"
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.818287 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-utilities\") pod \"a34fc030-3ed9-47d7-86ff-bca89184999a\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") "
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.818342 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-catalog-content\") pod \"a34fc030-3ed9-47d7-86ff-bca89184999a\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") "
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.818814 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7lnb\" (UniqueName: \"kubernetes.io/projected/a34fc030-3ed9-47d7-86ff-bca89184999a-kube-api-access-p7lnb\") pod \"a34fc030-3ed9-47d7-86ff-bca89184999a\" (UID: \"a34fc030-3ed9-47d7-86ff-bca89184999a\") "
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.819677 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-utilities" (OuterVolumeSpecName: "utilities") pod "a34fc030-3ed9-47d7-86ff-bca89184999a" (UID: "a34fc030-3ed9-47d7-86ff-bca89184999a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.823902 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a34fc030-3ed9-47d7-86ff-bca89184999a-kube-api-access-p7lnb" (OuterVolumeSpecName: "kube-api-access-p7lnb") pod "a34fc030-3ed9-47d7-86ff-bca89184999a" (UID: "a34fc030-3ed9-47d7-86ff-bca89184999a"). InnerVolumeSpecName "kube-api-access-p7lnb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.920185 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.920217 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7lnb\" (UniqueName: \"kubernetes.io/projected/a34fc030-3ed9-47d7-86ff-bca89184999a-kube-api-access-p7lnb\") on node \"crc\" DevicePath \"\""
Nov 22 08:55:50 crc kubenswrapper[4743]: I1122 08:55:50.922606 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a34fc030-3ed9-47d7-86ff-bca89184999a" (UID: "a34fc030-3ed9-47d7-86ff-bca89184999a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:55:51 crc kubenswrapper[4743]: I1122 08:55:51.021295 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a34fc030-3ed9-47d7-86ff-bca89184999a-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 08:55:51 crc kubenswrapper[4743]: I1122 08:55:51.445702 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dm4k8" event={"ID":"a34fc030-3ed9-47d7-86ff-bca89184999a","Type":"ContainerDied","Data":"c036351cd1660f5ebbcd2212f22c175404ed803cc4f77dd70ba93ebc92ce3696"}
Nov 22 08:55:51 crc kubenswrapper[4743]: I1122 08:55:51.445774 4743 scope.go:117] "RemoveContainer" containerID="3182b110b06da337ee0c6174e2f8a7d5465aed4bfc4e8137b665004b2f9d0f9a"
Nov 22 08:55:51 crc kubenswrapper[4743]: I1122 08:55:51.447530 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dm4k8"
Nov 22 08:55:51 crc kubenswrapper[4743]: I1122 08:55:51.480356 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dm4k8"]
Nov 22 08:55:51 crc kubenswrapper[4743]: I1122 08:55:51.486772 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dm4k8"]
Nov 22 08:55:51 crc kubenswrapper[4743]: I1122 08:55:51.496877 4743 scope.go:117] "RemoveContainer" containerID="945ca2162e2ebc665663c1eb78844799f592504b4c4f050655139a2f2fe0c901"
Nov 22 08:55:51 crc kubenswrapper[4743]: I1122 08:55:51.519083 4743 scope.go:117] "RemoveContainer" containerID="aaa3ca9a5876e15ff81e88c31ad0fa28105434905e9a368bf5b8a07b20dd0d85"
Nov 22 08:55:53 crc kubenswrapper[4743]: I1122 08:55:53.166906 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a34fc030-3ed9-47d7-86ff-bca89184999a" path="/var/lib/kubelet/pods/a34fc030-3ed9-47d7-86ff-bca89184999a/volumes"
Nov 22 08:56:31 crc kubenswrapper[4743]: I1122 08:56:31.240826 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 08:56:31 crc kubenswrapper[4743]: I1122 08:56:31.241490 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 08:57:01 crc kubenswrapper[4743]: I1122 08:57:01.241823 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 08:57:01 crc kubenswrapper[4743]: I1122 08:57:01.242254 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.907025 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8l9jp"]
Nov 22 08:57:05 crc kubenswrapper[4743]: E1122 08:57:05.907704 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerName="extract-utilities"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.907720 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerName="extract-utilities"
Nov 22 08:57:05 crc kubenswrapper[4743]: E1122 08:57:05.907740 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerName="extract-utilities"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.907747 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerName="extract-utilities"
Nov 22 08:57:05 crc kubenswrapper[4743]: E1122 08:57:05.907773 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerName="extract-content"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.907781 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerName="extract-content"
Nov 22 08:57:05 crc kubenswrapper[4743]: E1122 08:57:05.907794 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerName="registry-server"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.907801 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerName="registry-server"
Nov 22 08:57:05 crc kubenswrapper[4743]: E1122 08:57:05.907823 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerName="registry-server"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.907830 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerName="registry-server"
Nov 22 08:57:05 crc kubenswrapper[4743]: E1122 08:57:05.907841 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerName="extract-content"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.907848 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerName="extract-content"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.908018 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2263583b-69f7-43c4-821d-f174c42c9ab9" containerName="registry-server"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.908036 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34fc030-3ed9-47d7-86ff-bca89184999a" containerName="registry-server"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.909431 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:05 crc kubenswrapper[4743]: I1122 08:57:05.921829 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8l9jp"]
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.016540 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrz4h\" (UniqueName: \"kubernetes.io/projected/97833e66-df49-41c7-8659-75a260f5d418-kube-api-access-nrz4h\") pod \"community-operators-8l9jp\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") " pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.016621 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-utilities\") pod \"community-operators-8l9jp\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") " pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.016651 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-catalog-content\") pod \"community-operators-8l9jp\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") " pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.118026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrz4h\" (UniqueName: \"kubernetes.io/projected/97833e66-df49-41c7-8659-75a260f5d418-kube-api-access-nrz4h\") pod \"community-operators-8l9jp\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") " pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.118076 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-utilities\") pod \"community-operators-8l9jp\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") " pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.118105 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-catalog-content\") pod \"community-operators-8l9jp\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") " pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.118770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-catalog-content\") pod \"community-operators-8l9jp\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") " pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.118969 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-utilities\") pod \"community-operators-8l9jp\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") " pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.140426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrz4h\" (UniqueName: \"kubernetes.io/projected/97833e66-df49-41c7-8659-75a260f5d418-kube-api-access-nrz4h\") pod \"community-operators-8l9jp\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") " pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.243502 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:06 crc kubenswrapper[4743]: I1122 08:57:06.677776 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8l9jp"]
Nov 22 08:57:07 crc kubenswrapper[4743]: I1122 08:57:07.025523 4743 generic.go:334] "Generic (PLEG): container finished" podID="97833e66-df49-41c7-8659-75a260f5d418" containerID="2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43" exitCode=0
Nov 22 08:57:07 crc kubenswrapper[4743]: I1122 08:57:07.025606 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l9jp" event={"ID":"97833e66-df49-41c7-8659-75a260f5d418","Type":"ContainerDied","Data":"2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43"}
Nov 22 08:57:07 crc kubenswrapper[4743]: I1122 08:57:07.025886 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l9jp" event={"ID":"97833e66-df49-41c7-8659-75a260f5d418","Type":"ContainerStarted","Data":"ff8d9e7865c6ed0581bb37a97200da6016f55c102e6e543b80f02ea39646564f"}
Nov 22 08:57:08 crc kubenswrapper[4743]: I1122 08:57:08.037362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l9jp" event={"ID":"97833e66-df49-41c7-8659-75a260f5d418","Type":"ContainerStarted","Data":"224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e"}
Nov 22 08:57:09 crc kubenswrapper[4743]: I1122 08:57:09.048167 4743 generic.go:334] "Generic (PLEG): container finished" podID="97833e66-df49-41c7-8659-75a260f5d418" containerID="224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e" exitCode=0
Nov 22 08:57:09 crc kubenswrapper[4743]: I1122 08:57:09.048226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l9jp" event={"ID":"97833e66-df49-41c7-8659-75a260f5d418","Type":"ContainerDied","Data":"224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e"}
Nov 22 08:57:10 crc kubenswrapper[4743]: I1122 08:57:10.056705 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l9jp" event={"ID":"97833e66-df49-41c7-8659-75a260f5d418","Type":"ContainerStarted","Data":"f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5"}
Nov 22 08:57:10 crc kubenswrapper[4743]: I1122 08:57:10.076455 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8l9jp" podStartSLOduration=2.65132951 podStartE2EDuration="5.076435479s" podCreationTimestamp="2025-11-22 08:57:05 +0000 UTC" firstStartedPulling="2025-11-22 08:57:07.026946375 +0000 UTC m=+2100.733307417" lastFinishedPulling="2025-11-22 08:57:09.452052334 +0000 UTC m=+2103.158413386" observedRunningTime="2025-11-22 08:57:10.070523768 +0000 UTC m=+2103.776884840" watchObservedRunningTime="2025-11-22 08:57:10.076435479 +0000 UTC m=+2103.782796541"
Nov 22 08:57:16 crc kubenswrapper[4743]: I1122 08:57:16.244146 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:16 crc kubenswrapper[4743]: I1122 08:57:16.244925 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:16 crc kubenswrapper[4743]: I1122 08:57:16.305244 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:17 crc kubenswrapper[4743]: I1122 08:57:17.174855 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:17 crc kubenswrapper[4743]: I1122 08:57:17.217060 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8l9jp"]
Nov 22 08:57:19 crc kubenswrapper[4743]: I1122 08:57:19.125133 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8l9jp" podUID="97833e66-df49-41c7-8659-75a260f5d418" containerName="registry-server" containerID="cri-o://f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5" gracePeriod=2
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.098226 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.139633 4743 generic.go:334] "Generic (PLEG): container finished" podID="97833e66-df49-41c7-8659-75a260f5d418" containerID="f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5" exitCode=0
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.139681 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l9jp" event={"ID":"97833e66-df49-41c7-8659-75a260f5d418","Type":"ContainerDied","Data":"f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5"}
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.139711 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8l9jp" event={"ID":"97833e66-df49-41c7-8659-75a260f5d418","Type":"ContainerDied","Data":"ff8d9e7865c6ed0581bb37a97200da6016f55c102e6e543b80f02ea39646564f"}
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.139709 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8l9jp"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.139731 4743 scope.go:117] "RemoveContainer" containerID="f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.161690 4743 scope.go:117] "RemoveContainer" containerID="224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.182169 4743 scope.go:117] "RemoveContainer" containerID="2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.201589 4743 scope.go:117] "RemoveContainer" containerID="f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5"
Nov 22 08:57:20 crc kubenswrapper[4743]: E1122 08:57:20.202078 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5\": container with ID starting with f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5 not found: ID does not exist" containerID="f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.202118 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5"} err="failed to get container status \"f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5\": rpc error: code = NotFound desc = could not find container \"f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5\": container with ID starting with f78b6e2b0e5de4ffa3975acee7c02d831f6bcc74c430a676e471f355e9e89ff5 not found: ID does not exist"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.202144 4743 scope.go:117] "RemoveContainer" containerID="224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e"
Nov 22 08:57:20 crc kubenswrapper[4743]: E1122 08:57:20.202570 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e\": container with ID starting with 224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e not found: ID does not exist" containerID="224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.202609 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e"} err="failed to get container status \"224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e\": rpc error: code = NotFound desc = could not find container \"224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e\": container with ID starting with 224b9033b299a6bdda2a1d7384e25eed63e1874b070807882de95a439689970e not found: ID does not exist"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.202625 4743 scope.go:117] "RemoveContainer" containerID="2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43"
Nov 22 08:57:20 crc kubenswrapper[4743]: E1122 08:57:20.203001 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43\": container with ID starting with 2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43 not found: ID does not exist" containerID="2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.203025 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43"} err="failed to get container status \"2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43\": rpc error: code = NotFound desc = could not find container \"2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43\": container with ID starting with 2743b8361b5727f23b20083f77d2e794cecdc7e785909e5556c61091d8d97d43 not found: ID does not exist"
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.215665 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrz4h\" (UniqueName: \"kubernetes.io/projected/97833e66-df49-41c7-8659-75a260f5d418-kube-api-access-nrz4h\") pod \"97833e66-df49-41c7-8659-75a260f5d418\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") "
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.215783 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-catalog-content\") pod \"97833e66-df49-41c7-8659-75a260f5d418\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") "
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.215934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-utilities\") pod \"97833e66-df49-41c7-8659-75a260f5d418\" (UID: \"97833e66-df49-41c7-8659-75a260f5d418\") "
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.216974 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-utilities" (OuterVolumeSpecName: "utilities") pod "97833e66-df49-41c7-8659-75a260f5d418" (UID: "97833e66-df49-41c7-8659-75a260f5d418"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.222020 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97833e66-df49-41c7-8659-75a260f5d418-kube-api-access-nrz4h" (OuterVolumeSpecName: "kube-api-access-nrz4h") pod "97833e66-df49-41c7-8659-75a260f5d418" (UID: "97833e66-df49-41c7-8659-75a260f5d418"). InnerVolumeSpecName "kube-api-access-nrz4h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.277724 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97833e66-df49-41c7-8659-75a260f5d418" (UID: "97833e66-df49-41c7-8659-75a260f5d418"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.320309 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.320338 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97833e66-df49-41c7-8659-75a260f5d418-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.320348 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrz4h\" (UniqueName: \"kubernetes.io/projected/97833e66-df49-41c7-8659-75a260f5d418-kube-api-access-nrz4h\") on node \"crc\" DevicePath \"\""
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.482218 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8l9jp"]
Nov 22 08:57:20 crc kubenswrapper[4743]: I1122 08:57:20.488559 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8l9jp"]
Nov 22 08:57:21 crc kubenswrapper[4743]: I1122 08:57:21.161423 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97833e66-df49-41c7-8659-75a260f5d418" path="/var/lib/kubelet/pods/97833e66-df49-41c7-8659-75a260f5d418/volumes"
Nov 22 08:57:31 crc kubenswrapper[4743]: I1122 08:57:31.241143 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 08:57:31 crc kubenswrapper[4743]: I1122 08:57:31.241663 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 08:57:31 crc kubenswrapper[4743]: I1122 08:57:31.241735 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p"
Nov 22 08:57:31 crc kubenswrapper[4743]: I1122 08:57:31.243510 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8942efd4903f2b0d2e78730494ebbc46c37bc276bc4d1b8bfab93e628b8157ff"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 08:57:31 crc kubenswrapper[4743]: I1122 08:57:31.243626 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://8942efd4903f2b0d2e78730494ebbc46c37bc276bc4d1b8bfab93e628b8157ff" gracePeriod=600
Nov 22 08:57:32 crc kubenswrapper[4743]: I1122 08:57:32.257807 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="8942efd4903f2b0d2e78730494ebbc46c37bc276bc4d1b8bfab93e628b8157ff" exitCode=0
Nov 22 08:57:32 crc kubenswrapper[4743]: I1122 08:57:32.257895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"8942efd4903f2b0d2e78730494ebbc46c37bc276bc4d1b8bfab93e628b8157ff"}
Nov 22 08:57:32 crc kubenswrapper[4743]: I1122 08:57:32.258653 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467"}
Nov 22 08:57:32 crc kubenswrapper[4743]: I1122 08:57:32.258701 4743 scope.go:117] "RemoveContainer" containerID="a838e60dee751ffc953155e115c7b30a98544a658b8f15ed4873369b392283e5"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.657065 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7mfz2"]
Nov 22 08:57:34 crc kubenswrapper[4743]: E1122 08:57:34.657782 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97833e66-df49-41c7-8659-75a260f5d418" containerName="registry-server"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.657803 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="97833e66-df49-41c7-8659-75a260f5d418" containerName="registry-server"
Nov 22 08:57:34 crc kubenswrapper[4743]: E1122 08:57:34.657814 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97833e66-df49-41c7-8659-75a260f5d418" containerName="extract-content"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.657822 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="97833e66-df49-41c7-8659-75a260f5d418" containerName="extract-content"
Nov 22 08:57:34 crc kubenswrapper[4743]: E1122 08:57:34.657832 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97833e66-df49-41c7-8659-75a260f5d418" containerName="extract-utilities"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.657839 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="97833e66-df49-41c7-8659-75a260f5d418" containerName="extract-utilities"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.658031 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="97833e66-df49-41c7-8659-75a260f5d418" containerName="registry-server"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.659236 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.681775 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7mfz2"]
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.728198 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64g2q\" (UniqueName: \"kubernetes.io/projected/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-kube-api-access-64g2q\") pod \"certified-operators-7mfz2\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") " pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.728261 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-catalog-content\") pod \"certified-operators-7mfz2\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") " pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.728339 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-utilities\") pod \"certified-operators-7mfz2\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") " pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.829154 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64g2q\" (UniqueName: \"kubernetes.io/projected/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-kube-api-access-64g2q\") pod \"certified-operators-7mfz2\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") " pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.829215 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-catalog-content\") pod \"certified-operators-7mfz2\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") " pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.829249 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-utilities\") pod \"certified-operators-7mfz2\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") " pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.829810 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-utilities\") pod \"certified-operators-7mfz2\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") " pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.829873 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-catalog-content\") pod \"certified-operators-7mfz2\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") " pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.849061 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64g2q\" (UniqueName: \"kubernetes.io/projected/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-kube-api-access-64g2q\") pod \"certified-operators-7mfz2\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") " pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:34 crc kubenswrapper[4743]: I1122 08:57:34.979765 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:35 crc kubenswrapper[4743]: I1122 08:57:35.221970 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7mfz2"]
Nov 22 08:57:35 crc kubenswrapper[4743]: I1122 08:57:35.284156 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mfz2" event={"ID":"0fa0f620-508a-4fd3-8f7a-8c73925f31d6","Type":"ContainerStarted","Data":"c4054a0231ac37b4ca0c6747a9b5b6d26d3a0df2a4cf1b5ba04b96290d8fddfa"}
Nov 22 08:57:36 crc kubenswrapper[4743]: I1122 08:57:36.294724 4743 generic.go:334] "Generic (PLEG): container finished" podID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerID="1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446" exitCode=0
Nov 22 08:57:36 crc kubenswrapper[4743]: I1122 08:57:36.294816 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mfz2" event={"ID":"0fa0f620-508a-4fd3-8f7a-8c73925f31d6","Type":"ContainerDied","Data":"1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446"}
Nov 22 08:57:37 crc kubenswrapper[4743]: I1122 08:57:37.305044 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mfz2" event={"ID":"0fa0f620-508a-4fd3-8f7a-8c73925f31d6","Type":"ContainerStarted","Data":"8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef"}
Nov 22 08:57:38 crc kubenswrapper[4743]: I1122 08:57:38.315220 4743 generic.go:334] "Generic (PLEG): container finished" podID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerID="8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef" exitCode=0
Nov 22 08:57:38 crc kubenswrapper[4743]: I1122 08:57:38.315300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mfz2" event={"ID":"0fa0f620-508a-4fd3-8f7a-8c73925f31d6","Type":"ContainerDied","Data":"8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef"}
Nov 22 08:57:40 crc kubenswrapper[4743]: I1122 08:57:40.331918 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mfz2" event={"ID":"0fa0f620-508a-4fd3-8f7a-8c73925f31d6","Type":"ContainerStarted","Data":"dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392"}
Nov 22 08:57:40 crc kubenswrapper[4743]: I1122 08:57:40.355389 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7mfz2" podStartSLOduration=3.076581339 podStartE2EDuration="6.355372468s" podCreationTimestamp="2025-11-22 08:57:34 +0000 UTC" firstStartedPulling="2025-11-22 08:57:36.31004368 +0000 UTC m=+2130.016404772" lastFinishedPulling="2025-11-22 08:57:39.588834859 +0000 UTC m=+2133.295195901" observedRunningTime="2025-11-22 08:57:40.349727356 +0000 UTC m=+2134.056088428" watchObservedRunningTime="2025-11-22 08:57:40.355372468 +0000 UTC m=+2134.061733510"
Nov 22 08:57:44 crc kubenswrapper[4743]: I1122 08:57:44.980504 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:44 crc kubenswrapper[4743]: I1122 08:57:44.981064 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:45 crc kubenswrapper[4743]: I1122 08:57:45.020735 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:45 crc kubenswrapper[4743]: I1122 08:57:45.413314 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:45 crc kubenswrapper[4743]: I1122 08:57:45.465856 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7mfz2"]
Nov 22 08:57:47 crc kubenswrapper[4743]: I1122 08:57:47.387804 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7mfz2" podUID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerName="registry-server" containerID="cri-o://dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392" gracePeriod=2
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.379613 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.409735 4743 generic.go:334] "Generic (PLEG): container finished" podID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerID="dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392" exitCode=0
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.409778 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mfz2" event={"ID":"0fa0f620-508a-4fd3-8f7a-8c73925f31d6","Type":"ContainerDied","Data":"dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392"}
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.409805 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7mfz2" event={"ID":"0fa0f620-508a-4fd3-8f7a-8c73925f31d6","Type":"ContainerDied","Data":"c4054a0231ac37b4ca0c6747a9b5b6d26d3a0df2a4cf1b5ba04b96290d8fddfa"}
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.409779 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7mfz2"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.409824 4743 scope.go:117] "RemoveContainer" containerID="dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.429946 4743 scope.go:117] "RemoveContainer" containerID="8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.445364 4743 scope.go:117] "RemoveContainer" containerID="1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.471422 4743 scope.go:117] "RemoveContainer" containerID="dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392"
Nov 22 08:57:48 crc kubenswrapper[4743]: E1122 08:57:48.471838 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392\": container with ID starting with dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392 not found: ID does not exist" containerID="dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.471879 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392"} err="failed to get container status \"dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392\": rpc error: code = NotFound desc = could not find container \"dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392\": container with ID starting with dae51a8fb04e8cb2045027e460a183f0c0bf760732e34dfd1509ddfb3d7fd392 not found: ID does not exist"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.471909 4743 scope.go:117] "RemoveContainer" containerID="8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef"
Nov 22 08:57:48 crc kubenswrapper[4743]: E1122 08:57:48.472336 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef\": container with ID starting with 8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef not found: ID does not exist" containerID="8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.472359 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef"} err="failed to get container status \"8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef\": rpc error: code = NotFound desc = could not find container \"8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef\": container with ID starting with 8147e80bc9cad6678f010cfc6c9abdb6fbdcfeb1d0d2ee1f46492e1cc4817aef not found: ID does not exist"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.472378 4743 scope.go:117] "RemoveContainer" containerID="1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446"
Nov 22 08:57:48 crc kubenswrapper[4743]: E1122 08:57:48.472600 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446\": container with ID starting with 1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446 not found: ID does not exist" containerID="1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.472627 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446"} err="failed to get container status \"1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446\": rpc error: code = NotFound desc = could not find container \"1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446\": container with ID starting with 1e956ba7f05a7a414acf560592003f4c9b35a51b261393879fba2c0f0d6c8446 not found: ID does not exist"
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.509859 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-utilities\") pod \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") "
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.509917 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64g2q\" (UniqueName: \"kubernetes.io/projected/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-kube-api-access-64g2q\") pod \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") "
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.509960 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-catalog-content\") pod \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\" (UID: \"0fa0f620-508a-4fd3-8f7a-8c73925f31d6\") "
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.510815 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-utilities" (OuterVolumeSpecName: "utilities") pod "0fa0f620-508a-4fd3-8f7a-8c73925f31d6" (UID: "0fa0f620-508a-4fd3-8f7a-8c73925f31d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.518394 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-kube-api-access-64g2q" (OuterVolumeSpecName: "kube-api-access-64g2q") pod "0fa0f620-508a-4fd3-8f7a-8c73925f31d6" (UID: "0fa0f620-508a-4fd3-8f7a-8c73925f31d6"). InnerVolumeSpecName "kube-api-access-64g2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.611199 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 08:57:48 crc kubenswrapper[4743]: I1122 08:57:48.611227 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64g2q\" (UniqueName: \"kubernetes.io/projected/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-kube-api-access-64g2q\") on node \"crc\" DevicePath \"\""
Nov 22 08:57:49 crc kubenswrapper[4743]: I1122 08:57:49.833084 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fa0f620-508a-4fd3-8f7a-8c73925f31d6" (UID: "0fa0f620-508a-4fd3-8f7a-8c73925f31d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 08:57:49 crc kubenswrapper[4743]: I1122 08:57:49.930476 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa0f620-508a-4fd3-8f7a-8c73925f31d6-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 08:57:49 crc kubenswrapper[4743]: I1122 08:57:49.941454 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7mfz2"]
Nov 22 08:57:49 crc kubenswrapper[4743]: I1122 08:57:49.948009 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7mfz2"]
Nov 22 08:57:51 crc kubenswrapper[4743]: I1122 08:57:51.168428 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" path="/var/lib/kubelet/pods/0fa0f620-508a-4fd3-8f7a-8c73925f31d6/volumes"
Nov 22 08:59:31 crc kubenswrapper[4743]: I1122 08:59:31.240853 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 08:59:31 crc kubenswrapper[4743]: I1122 08:59:31.241734 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.149976 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"]
Nov 22 09:00:00 crc kubenswrapper[4743]: E1122 09:00:00.150764 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerName="extract-content"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.150776 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerName="extract-content"
Nov 22 09:00:00 crc kubenswrapper[4743]: E1122 09:00:00.150785 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerName="extract-utilities"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.150791 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerName="extract-utilities"
Nov 22 09:00:00 crc kubenswrapper[4743]: E1122 09:00:00.150803 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerName="registry-server"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.150809 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerName="registry-server"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.150946 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa0f620-508a-4fd3-8f7a-8c73925f31d6" containerName="registry-server"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.151402 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.153448 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.154277 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.163029 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"]
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.328241 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7efdba6-7144-4590-855a-3b93a8edd588-secret-volume\") pod \"collect-profiles-29396700-8m5c6\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.328317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7efdba6-7144-4590-855a-3b93a8edd588-config-volume\") pod \"collect-profiles-29396700-8m5c6\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.328380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brc68\" (UniqueName: \"kubernetes.io/projected/c7efdba6-7144-4590-855a-3b93a8edd588-kube-api-access-brc68\") pod \"collect-profiles-29396700-8m5c6\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.429438 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7efdba6-7144-4590-855a-3b93a8edd588-secret-volume\") pod \"collect-profiles-29396700-8m5c6\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.429534 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7efdba6-7144-4590-855a-3b93a8edd588-config-volume\") pod \"collect-profiles-29396700-8m5c6\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.429619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brc68\" (UniqueName: \"kubernetes.io/projected/c7efdba6-7144-4590-855a-3b93a8edd588-kube-api-access-brc68\") pod \"collect-profiles-29396700-8m5c6\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.430964 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7efdba6-7144-4590-855a-3b93a8edd588-config-volume\") pod \"collect-profiles-29396700-8m5c6\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.441613 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7efdba6-7144-4590-855a-3b93a8edd588-secret-volume\") pod \"collect-profiles-29396700-8m5c6\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.447253 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brc68\" (UniqueName: \"kubernetes.io/projected/c7efdba6-7144-4590-855a-3b93a8edd588-kube-api-access-brc68\") pod \"collect-profiles-29396700-8m5c6\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.477338 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:00 crc kubenswrapper[4743]: I1122 09:00:00.895611 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"]
Nov 22 09:00:01 crc kubenswrapper[4743]: I1122 09:00:01.241739 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 09:00:01 crc kubenswrapper[4743]: I1122 09:00:01.241803 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:00:01 crc kubenswrapper[4743]: I1122 09:00:01.495220 4743 generic.go:334] "Generic (PLEG): container finished" podID="c7efdba6-7144-4590-855a-3b93a8edd588" containerID="7d80e9fa94f3a615cb8b38f8b50e944d3b11f628ab538f3277f8eb46ed6bc1eb" exitCode=0
Nov 22 09:00:01 crc kubenswrapper[4743]: I1122 09:00:01.495260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6" event={"ID":"c7efdba6-7144-4590-855a-3b93a8edd588","Type":"ContainerDied","Data":"7d80e9fa94f3a615cb8b38f8b50e944d3b11f628ab538f3277f8eb46ed6bc1eb"}
Nov 22 09:00:01 crc kubenswrapper[4743]: I1122 09:00:01.495284 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6" event={"ID":"c7efdba6-7144-4590-855a-3b93a8edd588","Type":"ContainerStarted","Data":"b33cf8b411aef61c0532f9278cd29281bbd4ff7eb3fc947c5b8a961e99512fe4"}
Nov 22 09:00:02 crc kubenswrapper[4743]: I1122 09:00:02.765146 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"
Nov 22 09:00:02 crc kubenswrapper[4743]: I1122 09:00:02.963799 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7efdba6-7144-4590-855a-3b93a8edd588-config-volume\") pod \"c7efdba6-7144-4590-855a-3b93a8edd588\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") "
Nov 22 09:00:02 crc kubenswrapper[4743]: I1122 09:00:02.963929 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brc68\" (UniqueName: \"kubernetes.io/projected/c7efdba6-7144-4590-855a-3b93a8edd588-kube-api-access-brc68\") pod \"c7efdba6-7144-4590-855a-3b93a8edd588\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") "
Nov 22 09:00:02 crc kubenswrapper[4743]: I1122 09:00:02.964024 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7efdba6-7144-4590-855a-3b93a8edd588-secret-volume\") pod \"c7efdba6-7144-4590-855a-3b93a8edd588\" (UID: \"c7efdba6-7144-4590-855a-3b93a8edd588\") "
Nov 22 09:00:02 crc kubenswrapper[4743]: I1122 09:00:02.965206 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7efdba6-7144-4590-855a-3b93a8edd588-config-volume" (OuterVolumeSpecName: "config-volume") pod "c7efdba6-7144-4590-855a-3b93a8edd588" (UID: "c7efdba6-7144-4590-855a-3b93a8edd588"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:00:02 crc kubenswrapper[4743]: I1122 09:00:02.976831 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7efdba6-7144-4590-855a-3b93a8edd588-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c7efdba6-7144-4590-855a-3b93a8edd588" (UID: "c7efdba6-7144-4590-855a-3b93a8edd588"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:00:02 crc kubenswrapper[4743]: I1122 09:00:02.976830 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7efdba6-7144-4590-855a-3b93a8edd588-kube-api-access-brc68" (OuterVolumeSpecName: "kube-api-access-brc68") pod "c7efdba6-7144-4590-855a-3b93a8edd588" (UID: "c7efdba6-7144-4590-855a-3b93a8edd588"). InnerVolumeSpecName "kube-api-access-brc68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:00:03 crc kubenswrapper[4743]: I1122 09:00:03.065462 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brc68\" (UniqueName: \"kubernetes.io/projected/c7efdba6-7144-4590-855a-3b93a8edd588-kube-api-access-brc68\") on node \"crc\" DevicePath \"\"" Nov 22 09:00:03 crc kubenswrapper[4743]: I1122 09:00:03.065874 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c7efdba6-7144-4590-855a-3b93a8edd588-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:00:03 crc kubenswrapper[4743]: I1122 09:00:03.065894 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7efdba6-7144-4590-855a-3b93a8edd588-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:00:03 crc kubenswrapper[4743]: I1122 09:00:03.513936 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6" event={"ID":"c7efdba6-7144-4590-855a-3b93a8edd588","Type":"ContainerDied","Data":"b33cf8b411aef61c0532f9278cd29281bbd4ff7eb3fc947c5b8a961e99512fe4"} Nov 22 09:00:03 crc kubenswrapper[4743]: I1122 09:00:03.514017 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6" Nov 22 09:00:03 crc kubenswrapper[4743]: I1122 09:00:03.514061 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b33cf8b411aef61c0532f9278cd29281bbd4ff7eb3fc947c5b8a961e99512fe4" Nov 22 09:00:03 crc kubenswrapper[4743]: I1122 09:00:03.850403 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt"] Nov 22 09:00:03 crc kubenswrapper[4743]: I1122 09:00:03.855668 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396655-zxpkt"] Nov 22 09:00:05 crc kubenswrapper[4743]: I1122 09:00:05.159893 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5564388b-e6dd-409f-a137-b34700967f4a" path="/var/lib/kubelet/pods/5564388b-e6dd-409f-a137-b34700967f4a/volumes" Nov 22 09:00:30 crc kubenswrapper[4743]: I1122 09:00:30.704547 4743 scope.go:117] "RemoveContainer" containerID="9aadc3a0986e123bcf0a362e1c9ed8598762bf3912aa6ade14a91b173e16fc04" Nov 22 09:00:31 crc kubenswrapper[4743]: I1122 09:00:31.241601 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:00:31 crc kubenswrapper[4743]: I1122 09:00:31.242001 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:00:31 crc kubenswrapper[4743]: I1122 09:00:31.242056 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 09:00:31 crc kubenswrapper[4743]: I1122 09:00:31.242709 4743 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:00:31 crc kubenswrapper[4743]: I1122 09:00:31.242758 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" gracePeriod=600 Nov 22 09:00:31 crc kubenswrapper[4743]: E1122 09:00:31.374336 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:00:31 crc kubenswrapper[4743]: I1122 09:00:31.739986 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" exitCode=0 Nov 22 09:00:31 crc kubenswrapper[4743]: I1122 09:00:31.740020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467"} Nov 22 09:00:31 crc kubenswrapper[4743]: I1122 09:00:31.740110 4743 scope.go:117] "RemoveContainer" containerID="8942efd4903f2b0d2e78730494ebbc46c37bc276bc4d1b8bfab93e628b8157ff" Nov 22 09:00:31 crc kubenswrapper[4743]: I1122 09:00:31.740735 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:00:31 crc kubenswrapper[4743]: E1122 09:00:31.741100 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:00:43 crc kubenswrapper[4743]: I1122 09:00:43.151445 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:00:43 crc kubenswrapper[4743]: E1122 09:00:43.152180 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:00:55 crc kubenswrapper[4743]: I1122 09:00:55.151489 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:00:55 crc kubenswrapper[4743]: E1122 09:00:55.152268 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:01:10 crc kubenswrapper[4743]: I1122 09:01:10.151829 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:01:10 crc kubenswrapper[4743]: E1122 09:01:10.152545 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:01:22 crc kubenswrapper[4743]: I1122 09:01:22.151842 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:01:22 crc kubenswrapper[4743]: E1122 09:01:22.152686 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:01:37 crc kubenswrapper[4743]: I1122 09:01:37.161669 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:01:37 crc kubenswrapper[4743]: E1122 09:01:37.162511 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:01:52 crc kubenswrapper[4743]: I1122 09:01:52.151629 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:01:52 crc kubenswrapper[4743]: E1122 09:01:52.152279 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:02:04 crc kubenswrapper[4743]: I1122 09:02:04.151298 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:02:04 crc kubenswrapper[4743]: E1122 09:02:04.152171 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:02:15 crc kubenswrapper[4743]: I1122 09:02:15.151644 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:02:15 crc kubenswrapper[4743]: E1122 09:02:15.152449 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:02:26 crc kubenswrapper[4743]: I1122 09:02:26.152416 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:02:26 crc kubenswrapper[4743]: E1122 09:02:26.153753 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:02:40 crc kubenswrapper[4743]: I1122 09:02:40.151531 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:02:40 crc kubenswrapper[4743]: E1122 09:02:40.152145 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:02:54 crc kubenswrapper[4743]: I1122 09:02:54.152166 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:02:54 crc kubenswrapper[4743]: E1122 09:02:54.155315 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:03:08 crc kubenswrapper[4743]: I1122 09:03:08.151981 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:03:08 crc kubenswrapper[4743]: E1122 09:03:08.152819 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:03:19 crc kubenswrapper[4743]: I1122 09:03:19.155808 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:03:19 crc kubenswrapper[4743]: E1122 09:03:19.157145 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:03:31 crc kubenswrapper[4743]: I1122 09:03:31.152462 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:03:31 crc kubenswrapper[4743]: E1122 09:03:31.153244 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:03:44 crc kubenswrapper[4743]: I1122 09:03:44.151858 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:03:44 crc kubenswrapper[4743]: E1122 09:03:44.152622 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:03:55 crc kubenswrapper[4743]: I1122 09:03:55.155267 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:03:55 crc kubenswrapper[4743]: E1122 09:03:55.155878 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:04:07 crc kubenswrapper[4743]: I1122 09:04:07.155937 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:04:07 crc kubenswrapper[4743]: E1122 09:04:07.157619 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:04:22 crc kubenswrapper[4743]: I1122 09:04:22.151205 4743 
scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:04:22 crc kubenswrapper[4743]: E1122 09:04:22.151907 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:04:34 crc kubenswrapper[4743]: I1122 09:04:34.151915 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:04:34 crc kubenswrapper[4743]: E1122 09:04:34.152654 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:04:46 crc kubenswrapper[4743]: I1122 09:04:46.151827 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:04:46 crc kubenswrapper[4743]: E1122 09:04:46.152602 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:05:00 crc kubenswrapper[4743]: I1122 09:05:00.151884 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:05:00 crc kubenswrapper[4743]: E1122 09:05:00.152627 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:05:15 crc kubenswrapper[4743]: I1122 09:05:15.151789 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:05:15 crc kubenswrapper[4743]: E1122 09:05:15.152521 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:05:28 crc kubenswrapper[4743]: I1122 09:05:28.151877 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:05:28 crc kubenswrapper[4743]: E1122 09:05:28.152545 4743 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:05:43 crc kubenswrapper[4743]: I1122 09:05:43.151973 4743 scope.go:117] "RemoveContainer" containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:05:43 crc kubenswrapper[4743]: I1122 09:05:43.969634 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"8a60e1ad22fedf2a9d76ed3d5822dc4816ba557217c34494b2f1ff2a5f966e0a"} Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.359833 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d2z2s"] Nov 22 09:06:22 crc kubenswrapper[4743]: E1122 09:06:22.360683 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7efdba6-7144-4590-855a-3b93a8edd588" containerName="collect-profiles" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.360698 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7efdba6-7144-4590-855a-3b93a8edd588" containerName="collect-profiles" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.360862 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7efdba6-7144-4590-855a-3b93a8edd588" containerName="collect-profiles" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.361901 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.382304 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2z2s"] Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.398954 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-utilities\") pod \"redhat-marketplace-d2z2s\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.399003 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-catalog-content\") pod \"redhat-marketplace-d2z2s\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.399060 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgh9w\" (UniqueName: \"kubernetes.io/projected/9582beb2-be0f-4feb-82d9-e38a25859932-kube-api-access-cgh9w\") pod \"redhat-marketplace-d2z2s\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.499709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-utilities\") pod \"redhat-marketplace-d2z2s\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.499745 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-catalog-content\") pod \"redhat-marketplace-d2z2s\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.499802 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgh9w\" (UniqueName: \"kubernetes.io/projected/9582beb2-be0f-4feb-82d9-e38a25859932-kube-api-access-cgh9w\") pod \"redhat-marketplace-d2z2s\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.500497 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-utilities\") pod \"redhat-marketplace-d2z2s\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.500499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-catalog-content\") pod \"redhat-marketplace-d2z2s\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.519500 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-cgh9w\" (UniqueName: \"kubernetes.io/projected/9582beb2-be0f-4feb-82d9-e38a25859932-kube-api-access-cgh9w\") pod \"redhat-marketplace-d2z2s\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.679772 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:22 crc kubenswrapper[4743]: I1122 09:06:22.953452 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2z2s"] Nov 22 09:06:23 crc kubenswrapper[4743]: I1122 09:06:23.235870 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2z2s" event={"ID":"9582beb2-be0f-4feb-82d9-e38a25859932","Type":"ContainerStarted","Data":"18b0ecb27830481ff40cc791f4fb5b25c92b6ec15cd3bb0c46a533c481a533b2"} Nov 22 09:06:24 crc kubenswrapper[4743]: I1122 09:06:24.244477 4743 generic.go:334] "Generic (PLEG): container finished" podID="9582beb2-be0f-4feb-82d9-e38a25859932" containerID="512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303" exitCode=0 Nov 22 09:06:24 crc kubenswrapper[4743]: I1122 09:06:24.244524 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2z2s" event={"ID":"9582beb2-be0f-4feb-82d9-e38a25859932","Type":"ContainerDied","Data":"512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303"} Nov 22 09:06:24 crc kubenswrapper[4743]: I1122 09:06:24.246449 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:06:26 crc kubenswrapper[4743]: I1122 09:06:26.261217 4743 generic.go:334] "Generic (PLEG): container finished" podID="9582beb2-be0f-4feb-82d9-e38a25859932" containerID="c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de" exitCode=0 Nov 22 09:06:26 crc kubenswrapper[4743]: I1122 09:06:26.261293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2z2s" event={"ID":"9582beb2-be0f-4feb-82d9-e38a25859932","Type":"ContainerDied","Data":"c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de"} Nov 22 09:06:27 crc kubenswrapper[4743]: I1122 09:06:27.270895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2z2s" event={"ID":"9582beb2-be0f-4feb-82d9-e38a25859932","Type":"ContainerStarted","Data":"c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544"} Nov 22 09:06:27 crc kubenswrapper[4743]: I1122 09:06:27.290505 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d2z2s" podStartSLOduration=2.848277 podStartE2EDuration="5.290486742s" podCreationTimestamp="2025-11-22 09:06:22 +0000 UTC" firstStartedPulling="2025-11-22 09:06:24.246190708 +0000 UTC m=+2657.952551760" lastFinishedPulling="2025-11-22 09:06:26.68840045 +0000 UTC m=+2660.394761502" observedRunningTime="2025-11-22 09:06:27.289193435 +0000 UTC m=+2660.995554487" watchObservedRunningTime="2025-11-22 09:06:27.290486742 +0000 UTC m=+2660.996847794" Nov 22 09:06:32 crc kubenswrapper[4743]: I1122 09:06:32.680283 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:32 crc kubenswrapper[4743]: I1122 09:06:32.680617 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
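The "Observed pod startup duration" record is internally consistent: podStartSLOduration appears to be the end-to-end startup time minus the image-pull window, and the monotonic m=+ offsets in the record let you verify that directly:

    # Check: podStartSLOduration = podStartE2EDuration - image pull window.
    # All values are copied from the log record above (m=+ monotonic offsets).
    e2e = 5.290486742            # podStartE2EDuration
    first_pull = 2657.952551760  # firstStartedPulling
    last_pull = 2660.394761502   # lastFinishedPulling
    print(round(e2e - (last_pull - first_pull), 6))  # 2.848277 == podStartSLOduration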
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:32 crc kubenswrapper[4743]: I1122 09:06:32.717684 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:33 crc kubenswrapper[4743]: I1122 09:06:33.344751 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:33 crc kubenswrapper[4743]: I1122 09:06:33.392687 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2z2s"] Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.322383 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d2z2s" podUID="9582beb2-be0f-4feb-82d9-e38a25859932" containerName="registry-server" containerID="cri-o://c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544" gracePeriod=2 Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.715568 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.888717 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-utilities\") pod \"9582beb2-be0f-4feb-82d9-e38a25859932\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.888752 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgh9w\" (UniqueName: \"kubernetes.io/projected/9582beb2-be0f-4feb-82d9-e38a25859932-kube-api-access-cgh9w\") pod \"9582beb2-be0f-4feb-82d9-e38a25859932\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.888849 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-catalog-content\") pod \"9582beb2-be0f-4feb-82d9-e38a25859932\" (UID: \"9582beb2-be0f-4feb-82d9-e38a25859932\") " Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.890116 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-utilities" (OuterVolumeSpecName: "utilities") pod "9582beb2-be0f-4feb-82d9-e38a25859932" (UID: "9582beb2-be0f-4feb-82d9-e38a25859932"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.898552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9582beb2-be0f-4feb-82d9-e38a25859932-kube-api-access-cgh9w" (OuterVolumeSpecName: "kube-api-access-cgh9w") pod "9582beb2-be0f-4feb-82d9-e38a25859932" (UID: "9582beb2-be0f-4feb-82d9-e38a25859932"). InnerVolumeSpecName "kube-api-access-cgh9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.911097 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9582beb2-be0f-4feb-82d9-e38a25859932" (UID: "9582beb2-be0f-4feb-82d9-e38a25859932"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.990097 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.990183 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgh9w\" (UniqueName: \"kubernetes.io/projected/9582beb2-be0f-4feb-82d9-e38a25859932-kube-api-access-cgh9w\") on node \"crc\" DevicePath \"\"" Nov 22 09:06:35 crc kubenswrapper[4743]: I1122 09:06:35.990196 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9582beb2-be0f-4feb-82d9-e38a25859932-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.334243 4743 generic.go:334] "Generic (PLEG): container finished" podID="9582beb2-be0f-4feb-82d9-e38a25859932" containerID="c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544" exitCode=0 Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.334290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2z2s" event={"ID":"9582beb2-be0f-4feb-82d9-e38a25859932","Type":"ContainerDied","Data":"c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544"} Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.334317 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d2z2s" event={"ID":"9582beb2-be0f-4feb-82d9-e38a25859932","Type":"ContainerDied","Data":"18b0ecb27830481ff40cc791f4fb5b25c92b6ec15cd3bb0c46a533c481a533b2"} Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.334324 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d2z2s" Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.334332 4743 scope.go:117] "RemoveContainer" containerID="c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544" Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.353427 4743 scope.go:117] "RemoveContainer" containerID="c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de" Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.374914 4743 scope.go:117] "RemoveContainer" containerID="512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303" Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.375807 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2z2s"] Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.380920 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d2z2s"] Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.414697 4743 scope.go:117] "RemoveContainer" containerID="c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544" Nov 22 09:06:36 crc kubenswrapper[4743]: E1122 09:06:36.415235 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544\": container with ID starting with c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544 not found: ID does not exist" containerID="c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544" Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.415285 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544"} err="failed to get container status \"c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544\": rpc error: code = NotFound desc = could not find container \"c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544\": container with ID starting with c93a02c34fa8afe04adaffe8265351fdc5553262100ffe8175ce67c845edf544 not found: ID does not exist" Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.415317 4743 scope.go:117] "RemoveContainer" containerID="c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de" Nov 22 09:06:36 crc kubenswrapper[4743]: E1122 09:06:36.415654 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de\": container with ID starting with c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de not found: ID does not exist" containerID="c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de" Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.415702 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de"} err="failed to get container status \"c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de\": rpc error: code = NotFound desc = could not find container \"c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de\": container with ID starting with c551d92f5b7f0252f2d18f98e23575af94d9e78bbc6809834fc5433968aad0de not found: ID does not exist" Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.415733 4743 scope.go:117] "RemoveContainer" 
containerID="512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303" Nov 22 09:06:36 crc kubenswrapper[4743]: E1122 09:06:36.416091 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303\": container with ID starting with 512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303 not found: ID does not exist" containerID="512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303" Nov 22 09:06:36 crc kubenswrapper[4743]: I1122 09:06:36.416124 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303"} err="failed to get container status \"512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303\": rpc error: code = NotFound desc = could not find container \"512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303\": container with ID starting with 512ad14452a4662274802c685f75f70fe852adeaf6b87d2dfd9c7b5bd3556303 not found: ID does not exist" Nov 22 09:06:37 crc kubenswrapper[4743]: I1122 09:06:37.160774 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9582beb2-be0f-4feb-82d9-e38a25859932" path="/var/lib/kubelet/pods/9582beb2-be0f-4feb-82d9-e38a25859932/volumes" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.244812 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n4542"] Nov 22 09:07:05 crc kubenswrapper[4743]: E1122 09:07:05.245666 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9582beb2-be0f-4feb-82d9-e38a25859932" containerName="registry-server" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.245680 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9582beb2-be0f-4feb-82d9-e38a25859932" containerName="registry-server" Nov 22 09:07:05 crc kubenswrapper[4743]: E1122 09:07:05.245696 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9582beb2-be0f-4feb-82d9-e38a25859932" containerName="extract-utilities" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.245704 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9582beb2-be0f-4feb-82d9-e38a25859932" containerName="extract-utilities" Nov 22 09:07:05 crc kubenswrapper[4743]: E1122 09:07:05.245718 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9582beb2-be0f-4feb-82d9-e38a25859932" containerName="extract-content" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.245728 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9582beb2-be0f-4feb-82d9-e38a25859932" containerName="extract-content" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.245917 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9582beb2-be0f-4feb-82d9-e38a25859932" containerName="registry-server" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.247117 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.258125 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4542"] Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.394668 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-catalog-content\") pod \"community-operators-n4542\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.394797 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-utilities\") pod \"community-operators-n4542\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.394823 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x45r8\" (UniqueName: \"kubernetes.io/projected/93ae91fc-f75e-4851-b653-5624e97f086c-kube-api-access-x45r8\") pod \"community-operators-n4542\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.496039 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x45r8\" (UniqueName: \"kubernetes.io/projected/93ae91fc-f75e-4851-b653-5624e97f086c-kube-api-access-x45r8\") pod \"community-operators-n4542\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.496106 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-catalog-content\") pod \"community-operators-n4542\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.496183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-utilities\") pod \"community-operators-n4542\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.496744 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-utilities\") pod \"community-operators-n4542\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.496785 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-catalog-content\") pod \"community-operators-n4542\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.515741 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-x45r8\" (UniqueName: \"kubernetes.io/projected/93ae91fc-f75e-4851-b653-5624e97f086c-kube-api-access-x45r8\") pod \"community-operators-n4542\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:05 crc kubenswrapper[4743]: I1122 09:07:05.593554 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:06 crc kubenswrapper[4743]: I1122 09:07:06.047686 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4542"] Nov 22 09:07:06 crc kubenswrapper[4743]: I1122 09:07:06.554050 4743 generic.go:334] "Generic (PLEG): container finished" podID="93ae91fc-f75e-4851-b653-5624e97f086c" containerID="e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d" exitCode=0 Nov 22 09:07:06 crc kubenswrapper[4743]: I1122 09:07:06.554174 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4542" event={"ID":"93ae91fc-f75e-4851-b653-5624e97f086c","Type":"ContainerDied","Data":"e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d"} Nov 22 09:07:06 crc kubenswrapper[4743]: I1122 09:07:06.554388 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4542" event={"ID":"93ae91fc-f75e-4851-b653-5624e97f086c","Type":"ContainerStarted","Data":"16e4cf0fba4a75ae1b08230ff0113f3577cd3ba62ae410469e127a00486fe8f1"} Nov 22 09:07:07 crc kubenswrapper[4743]: I1122 09:07:07.570269 4743 generic.go:334] "Generic (PLEG): container finished" podID="93ae91fc-f75e-4851-b653-5624e97f086c" containerID="d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5" exitCode=0 Nov 22 09:07:07 crc kubenswrapper[4743]: I1122 09:07:07.570347 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4542" event={"ID":"93ae91fc-f75e-4851-b653-5624e97f086c","Type":"ContainerDied","Data":"d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5"} Nov 22 09:07:08 crc kubenswrapper[4743]: I1122 09:07:08.581067 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4542" event={"ID":"93ae91fc-f75e-4851-b653-5624e97f086c","Type":"ContainerStarted","Data":"3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b"} Nov 22 09:07:08 crc kubenswrapper[4743]: I1122 09:07:08.600123 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n4542" podStartSLOduration=2.163222076 podStartE2EDuration="3.600107624s" podCreationTimestamp="2025-11-22 09:07:05 +0000 UTC" firstStartedPulling="2025-11-22 09:07:06.556535389 +0000 UTC m=+2700.262896481" lastFinishedPulling="2025-11-22 09:07:07.993420947 +0000 UTC m=+2701.699782029" observedRunningTime="2025-11-22 09:07:08.59580773 +0000 UTC m=+2702.302168782" watchObservedRunningTime="2025-11-22 09:07:08.600107624 +0000 UTC m=+2702.306468676" Nov 22 09:07:15 crc kubenswrapper[4743]: I1122 09:07:15.594211 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:15 crc kubenswrapper[4743]: I1122 09:07:15.594808 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:15 crc kubenswrapper[4743]: I1122 09:07:15.660286 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:15 crc kubenswrapper[4743]: I1122 09:07:15.730336 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:15 crc kubenswrapper[4743]: I1122 09:07:15.896042 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4542"] Nov 22 09:07:17 crc kubenswrapper[4743]: I1122 09:07:17.649898 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n4542" podUID="93ae91fc-f75e-4851-b653-5624e97f086c" containerName="registry-server" containerID="cri-o://3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b" gracePeriod=2 Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.072338 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.186043 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x45r8\" (UniqueName: \"kubernetes.io/projected/93ae91fc-f75e-4851-b653-5624e97f086c-kube-api-access-x45r8\") pod \"93ae91fc-f75e-4851-b653-5624e97f086c\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.186100 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-catalog-content\") pod \"93ae91fc-f75e-4851-b653-5624e97f086c\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.186201 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-utilities\") pod \"93ae91fc-f75e-4851-b653-5624e97f086c\" (UID: \"93ae91fc-f75e-4851-b653-5624e97f086c\") " Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.187125 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-utilities" (OuterVolumeSpecName: "utilities") pod "93ae91fc-f75e-4851-b653-5624e97f086c" (UID: "93ae91fc-f75e-4851-b653-5624e97f086c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.191134 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93ae91fc-f75e-4851-b653-5624e97f086c-kube-api-access-x45r8" (OuterVolumeSpecName: "kube-api-access-x45r8") pod "93ae91fc-f75e-4851-b653-5624e97f086c" (UID: "93ae91fc-f75e-4851-b653-5624e97f086c"). InnerVolumeSpecName "kube-api-access-x45r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.239407 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93ae91fc-f75e-4851-b653-5624e97f086c" (UID: "93ae91fc-f75e-4851-b653-5624e97f086c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.287850 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.287888 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x45r8\" (UniqueName: \"kubernetes.io/projected/93ae91fc-f75e-4851-b653-5624e97f086c-kube-api-access-x45r8\") on node \"crc\" DevicePath \"\"" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.287902 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93ae91fc-f75e-4851-b653-5624e97f086c-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.658649 4743 generic.go:334] "Generic (PLEG): container finished" podID="93ae91fc-f75e-4851-b653-5624e97f086c" containerID="3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b" exitCode=0 Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.658667 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4542" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.659647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4542" event={"ID":"93ae91fc-f75e-4851-b653-5624e97f086c","Type":"ContainerDied","Data":"3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b"} Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.659711 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4542" event={"ID":"93ae91fc-f75e-4851-b653-5624e97f086c","Type":"ContainerDied","Data":"16e4cf0fba4a75ae1b08230ff0113f3577cd3ba62ae410469e127a00486fe8f1"} Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.659738 4743 scope.go:117] "RemoveContainer" containerID="3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.683958 4743 scope.go:117] "RemoveContainer" containerID="d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.691631 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4542"] Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.698917 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n4542"] Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.703642 4743 scope.go:117] "RemoveContainer" containerID="e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.729661 4743 scope.go:117] "RemoveContainer" containerID="3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b" Nov 22 09:07:18 crc kubenswrapper[4743]: E1122 09:07:18.730163 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b\": container with ID starting with 3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b not found: ID does not exist" containerID="3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.730217 
4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b"} err="failed to get container status \"3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b\": rpc error: code = NotFound desc = could not find container \"3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b\": container with ID starting with 3666406f48e91ad3ee978f310d2aca3fc1a895846daabebcc89b432cef48b70b not found: ID does not exist" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.730249 4743 scope.go:117] "RemoveContainer" containerID="d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5" Nov 22 09:07:18 crc kubenswrapper[4743]: E1122 09:07:18.730699 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5\": container with ID starting with d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5 not found: ID does not exist" containerID="d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.730730 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5"} err="failed to get container status \"d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5\": rpc error: code = NotFound desc = could not find container \"d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5\": container with ID starting with d2e405c3b9fe4b5f083dc700db97a1632b91cb301ac641432e865d8c4766ccd5 not found: ID does not exist" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.730750 4743 scope.go:117] "RemoveContainer" containerID="e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d" Nov 22 09:07:18 crc kubenswrapper[4743]: E1122 09:07:18.730973 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d\": container with ID starting with e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d not found: ID does not exist" containerID="e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d" Nov 22 09:07:18 crc kubenswrapper[4743]: I1122 09:07:18.731011 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d"} err="failed to get container status \"e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d\": rpc error: code = NotFound desc = could not find container \"e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d\": container with ID starting with e19f26b28d6b5f0f01d7e0b1b5fae080e4260c72e8ca07e8bb4852137a20e60d not found: ID does not exist" Nov 22 09:07:19 crc kubenswrapper[4743]: I1122 09:07:19.165492 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93ae91fc-f75e-4851-b653-5624e97f086c" path="/var/lib/kubelet/pods/93ae91fc-f75e-4851-b653-5624e97f086c/volumes" Nov 22 09:08:01 crc kubenswrapper[4743]: I1122 09:08:01.241242 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
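The three RemoveContainer calls above fail with NotFound because the containers were already gone (removed along with the pod sandbox) by the time the deletor re-queried the runtime; the kubelet logs the error and continues, so these entries are noise rather than a fault. A minimal sketch of that idempotent-delete pattern, with a hypothetical runtime type and ErrNotFound sentinel (not CRI-O's real API):

```go
// idempotent_delete.go - a second delete of an already-removed container is
// treated as success, mirroring the benign "NotFound" noise in the log above.
package main

import (
	"errors"
	"fmt"
)

var ErrNotFound = errors.New("container not found")

type runtime struct{ containers map[string]bool }

func (r *runtime) Remove(id string) error {
	if !r.containers[id] {
		return fmt.Errorf("could not find container %q: %w", id, ErrNotFound)
	}
	delete(r.containers, id)
	return nil
}

// removeIfPresent swallows NotFound so cleanup retries stay idempotent.
func removeIfPresent(r *runtime, id string) error {
	err := r.Remove(id)
	if errors.Is(err, ErrNotFound) {
		fmt.Println("already gone:", id)
		return nil
	}
	return err
}

func main() {
	r := &runtime{containers: map[string]bool{"3666406f": true}}
	_ = removeIfPresent(r, "3666406f") // first delete succeeds
	_ = removeIfPresent(r, "3666406f") // second delete: NotFound, treated as done
}
```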
Nov 22 09:08:01 crc kubenswrapper[4743]: I1122 09:08:01.241242 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 09:08:01 crc kubenswrapper[4743]: I1122 09:08:01.242001 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:08:31 crc kubenswrapper[4743]: I1122 09:08:31.241280 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 09:08:31 crc kubenswrapper[4743]: I1122 09:08:31.241864 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.164711 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7w7h9"]
Nov 22 09:08:37 crc kubenswrapper[4743]: E1122 09:08:37.165260 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ae91fc-f75e-4851-b653-5624e97f086c" containerName="extract-utilities"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.165273 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ae91fc-f75e-4851-b653-5624e97f086c" containerName="extract-utilities"
Nov 22 09:08:37 crc kubenswrapper[4743]: E1122 09:08:37.165292 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ae91fc-f75e-4851-b653-5624e97f086c" containerName="registry-server"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.165299 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ae91fc-f75e-4851-b653-5624e97f086c" containerName="registry-server"
Nov 22 09:08:37 crc kubenswrapper[4743]: E1122 09:08:37.165308 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93ae91fc-f75e-4851-b653-5624e97f086c" containerName="extract-content"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.165316 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="93ae91fc-f75e-4851-b653-5624e97f086c" containerName="extract-content"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.165468 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="93ae91fc-f75e-4851-b653-5624e97f086c" containerName="registry-server"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.166464 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7w7h9"]
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.166544 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.316113 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-catalog-content\") pod \"certified-operators-7w7h9\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") " pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.316480 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqwzx\" (UniqueName: \"kubernetes.io/projected/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-kube-api-access-zqwzx\") pod \"certified-operators-7w7h9\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") " pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.316526 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-utilities\") pod \"certified-operators-7w7h9\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") " pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.418139 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-catalog-content\") pod \"certified-operators-7w7h9\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") " pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.418183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqwzx\" (UniqueName: \"kubernetes.io/projected/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-kube-api-access-zqwzx\") pod \"certified-operators-7w7h9\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") " pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.418209 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-utilities\") pod \"certified-operators-7w7h9\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") " pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.418829 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-utilities\") pod \"certified-operators-7w7h9\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") " pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.418827 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-catalog-content\") pod \"certified-operators-7w7h9\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") " pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.441674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqwzx\" (UniqueName: \"kubernetes.io/projected/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-kube-api-access-zqwzx\") pod \"certified-operators-7w7h9\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") " pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:37 crc kubenswrapper[4743]: I1122 09:08:37.495950 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:38 crc kubenswrapper[4743]: I1122 09:08:38.003663 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7w7h9"]
Nov 22 09:08:38 crc kubenswrapper[4743]: I1122 09:08:38.287564 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w7h9" event={"ID":"1878a84f-30f6-42b5-9525-f4c3ee51f5d1","Type":"ContainerStarted","Data":"eb676e4ba13051adfe0b0c8634348cc43a3da81fb0e9bc0e812e1757ba71722d"}
Nov 22 09:08:39 crc kubenswrapper[4743]: I1122 09:08:39.297332 4743 generic.go:334] "Generic (PLEG): container finished" podID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerID="f723828d6bb238fc9d6df76d84f5a933026851e552283c66450ddd2027542eee" exitCode=0
Nov 22 09:08:39 crc kubenswrapper[4743]: I1122 09:08:39.297384 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w7h9" event={"ID":"1878a84f-30f6-42b5-9525-f4c3ee51f5d1","Type":"ContainerDied","Data":"f723828d6bb238fc9d6df76d84f5a933026851e552283c66450ddd2027542eee"}
Nov 22 09:08:42 crc kubenswrapper[4743]: I1122 09:08:42.320568 4743 generic.go:334] "Generic (PLEG): container finished" podID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerID="fe3405060f9fc695b2225f8db89f9303d73c88c1ad1a9061db8c27b7aeba98d4" exitCode=0
Nov 22 09:08:42 crc kubenswrapper[4743]: I1122 09:08:42.320703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w7h9" event={"ID":"1878a84f-30f6-42b5-9525-f4c3ee51f5d1","Type":"ContainerDied","Data":"fe3405060f9fc695b2225f8db89f9303d73c88c1ad1a9061db8c27b7aeba98d4"}
Nov 22 09:08:44 crc kubenswrapper[4743]: I1122 09:08:44.335543 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w7h9" event={"ID":"1878a84f-30f6-42b5-9525-f4c3ee51f5d1","Type":"ContainerStarted","Data":"30adcb1258897299915a7492ab1ad2d885ac9371cff6214728f84b40efdd2737"}
Nov 22 09:08:44 crc kubenswrapper[4743]: I1122 09:08:44.353496 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7w7h9" podStartSLOduration=2.918818615 podStartE2EDuration="7.353480616s" podCreationTimestamp="2025-11-22 09:08:37 +0000 UTC" firstStartedPulling="2025-11-22 09:08:39.29979679 +0000 UTC m=+2793.006157842" lastFinishedPulling="2025-11-22 09:08:43.734458791 +0000 UTC m=+2797.440819843" observedRunningTime="2025-11-22 09:08:44.349998606 +0000 UTC m=+2798.056359658" watchObservedRunningTime="2025-11-22 09:08:44.353480616 +0000 UTC m=+2798.059841668"
Nov 22 09:08:47 crc kubenswrapper[4743]: I1122 09:08:47.497003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:47 crc kubenswrapper[4743]: I1122 09:08:47.497487 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:47 crc kubenswrapper[4743]: I1122 09:08:47.559860 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:48 crc kubenswrapper[4743]: I1122 09:08:48.410011 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:48 crc kubenswrapper[4743]: I1122 09:08:48.454853 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7w7h9"]
Nov 22 09:08:50 crc kubenswrapper[4743]: I1122 09:08:50.375975 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7w7h9" podUID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerName="registry-server" containerID="cri-o://30adcb1258897299915a7492ab1ad2d885ac9371cff6214728f84b40efdd2737" gracePeriod=2
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.390245 4743 generic.go:334] "Generic (PLEG): container finished" podID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerID="30adcb1258897299915a7492ab1ad2d885ac9371cff6214728f84b40efdd2737" exitCode=0
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.390314 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w7h9" event={"ID":"1878a84f-30f6-42b5-9525-f4c3ee51f5d1","Type":"ContainerDied","Data":"30adcb1258897299915a7492ab1ad2d885ac9371cff6214728f84b40efdd2737"}
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.390665 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7w7h9" event={"ID":"1878a84f-30f6-42b5-9525-f4c3ee51f5d1","Type":"ContainerDied","Data":"eb676e4ba13051adfe0b0c8634348cc43a3da81fb0e9bc0e812e1757ba71722d"}
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.390708 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb676e4ba13051adfe0b0c8634348cc43a3da81fb0e9bc0e812e1757ba71722d"
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.401300 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.515417 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-utilities\") pod \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") "
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.515554 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqwzx\" (UniqueName: \"kubernetes.io/projected/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-kube-api-access-zqwzx\") pod \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") "
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.515798 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-catalog-content\") pod \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\" (UID: \"1878a84f-30f6-42b5-9525-f4c3ee51f5d1\") "
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.518341 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-utilities" (OuterVolumeSpecName: "utilities") pod "1878a84f-30f6-42b5-9525-f4c3ee51f5d1" (UID: "1878a84f-30f6-42b5-9525-f4c3ee51f5d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.522444 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-kube-api-access-zqwzx" (OuterVolumeSpecName: "kube-api-access-zqwzx") pod "1878a84f-30f6-42b5-9525-f4c3ee51f5d1" (UID: "1878a84f-30f6-42b5-9525-f4c3ee51f5d1"). InnerVolumeSpecName "kube-api-access-zqwzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.570885 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1878a84f-30f6-42b5-9525-f4c3ee51f5d1" (UID: "1878a84f-30f6-42b5-9525-f4c3ee51f5d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.618071 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.618110 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqwzx\" (UniqueName: \"kubernetes.io/projected/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-kube-api-access-zqwzx\") on node \"crc\" DevicePath \"\""
Nov 22 09:08:51 crc kubenswrapper[4743]: I1122 09:08:51.618124 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1878a84f-30f6-42b5-9525-f4c3ee51f5d1-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 09:08:52 crc kubenswrapper[4743]: I1122 09:08:52.398363 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7w7h9"
Nov 22 09:08:52 crc kubenswrapper[4743]: I1122 09:08:52.429005 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7w7h9"]
Nov 22 09:08:52 crc kubenswrapper[4743]: I1122 09:08:52.437527 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7w7h9"]
Nov 22 09:08:53 crc kubenswrapper[4743]: I1122 09:08:53.159721 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" path="/var/lib/kubelet/pods/1878a84f-30f6-42b5-9525-f4c3ee51f5d1/volumes"
Nov 22 09:09:01 crc kubenswrapper[4743]: I1122 09:09:01.241624 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 09:09:01 crc kubenswrapper[4743]: I1122 09:09:01.242186 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:09:01 crc kubenswrapper[4743]: I1122 09:09:01.242232 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p"
Nov 22 09:09:01 crc kubenswrapper[4743]: I1122 09:09:01.242870 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8a60e1ad22fedf2a9d76ed3d5822dc4816ba557217c34494b2f1ff2a5f966e0a"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 09:09:01 crc kubenswrapper[4743]: I1122 09:09:01.242930 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://8a60e1ad22fedf2a9d76ed3d5822dc4816ba557217c34494b2f1ff2a5f966e0a" gracePeriod=600
Nov 22 09:09:02 crc kubenswrapper[4743]: I1122 09:09:02.478756 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="8a60e1ad22fedf2a9d76ed3d5822dc4816ba557217c34494b2f1ff2a5f966e0a" exitCode=0
Nov 22 09:09:02 crc kubenswrapper[4743]: I1122 09:09:02.478837 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"8a60e1ad22fedf2a9d76ed3d5822dc4816ba557217c34494b2f1ff2a5f966e0a"}
Nov 22 09:09:02 crc kubenswrapper[4743]: I1122 09:09:02.479333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb"}
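The 09:09:01 sequence shows the full decision chain: probe failure, liveness marked unhealthy, "failed liveness probe, will be restarted", a kill with the pod's 600s grace period, and a replacement container (92f1bff1...) a second later. The restart only fires once consecutive failures reach the probe's failureThreshold; a sketch of that counter, assuming the Kubernetes default of 3 rather than reading the pod spec:

```go
// failure_threshold.go - consecutive liveness failures trip a restart once
// they reach the threshold; a success resets the counter.
package main

import "fmt"

type prober struct {
	consecutiveFailures int
	failureThreshold    int
}

func (p *prober) observe(healthy bool) (restart bool) {
	if healthy {
		p.consecutiveFailures = 0
		return false
	}
	p.consecutiveFailures++
	return p.consecutiveFailures >= p.failureThreshold
}

func main() {
	p := &prober{failureThreshold: 3}
	for i, ok := range []bool{false, false, false} {
		if p.observe(ok) {
			fmt.Printf("probe #%d: container failed liveness probe, will be restarted\n", i+1)
		}
	}
}
```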
containerID="266fa270874a3c29a195bddb49ccb836207c6bc38bd7ce688e92409fdaaa1467" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.264160 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t2wm2"] Nov 22 09:10:43 crc kubenswrapper[4743]: E1122 09:10:43.265101 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerName="registry-server" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.265119 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerName="registry-server" Nov 22 09:10:43 crc kubenswrapper[4743]: E1122 09:10:43.265191 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerName="extract-utilities" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.265201 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerName="extract-utilities" Nov 22 09:10:43 crc kubenswrapper[4743]: E1122 09:10:43.265212 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerName="extract-content" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.265219 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerName="extract-content" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.265398 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1878a84f-30f6-42b5-9525-f4c3ee51f5d1" containerName="registry-server" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.268181 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2wm2" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.274194 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2wm2"] Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.417237 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jhbv\" (UniqueName: \"kubernetes.io/projected/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-kube-api-access-6jhbv\") pod \"redhat-operators-t2wm2\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") " pod="openshift-marketplace/redhat-operators-t2wm2" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.417319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-utilities\") pod \"redhat-operators-t2wm2\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") " pod="openshift-marketplace/redhat-operators-t2wm2" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.417381 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-catalog-content\") pod \"redhat-operators-t2wm2\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") " pod="openshift-marketplace/redhat-operators-t2wm2" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.518408 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jhbv\" (UniqueName: \"kubernetes.io/projected/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-kube-api-access-6jhbv\") pod \"redhat-operators-t2wm2\" (UID: 
\"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") " pod="openshift-marketplace/redhat-operators-t2wm2" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.518485 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-utilities\") pod \"redhat-operators-t2wm2\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") " pod="openshift-marketplace/redhat-operators-t2wm2" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.518533 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-catalog-content\") pod \"redhat-operators-t2wm2\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") " pod="openshift-marketplace/redhat-operators-t2wm2" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.519170 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-catalog-content\") pod \"redhat-operators-t2wm2\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") " pod="openshift-marketplace/redhat-operators-t2wm2" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.519238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-utilities\") pod \"redhat-operators-t2wm2\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") " pod="openshift-marketplace/redhat-operators-t2wm2" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.551347 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jhbv\" (UniqueName: \"kubernetes.io/projected/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-kube-api-access-6jhbv\") pod \"redhat-operators-t2wm2\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") " pod="openshift-marketplace/redhat-operators-t2wm2" Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.601540 4743 util.go:30] "No sandbox for pod can be found. 
Nov 22 09:10:43 crc kubenswrapper[4743]: I1122 09:10:43.601540 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2wm2"
Nov 22 09:10:44 crc kubenswrapper[4743]: I1122 09:10:44.063242 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2wm2"]
Nov 22 09:10:44 crc kubenswrapper[4743]: I1122 09:10:44.237766 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2wm2" event={"ID":"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4","Type":"ContainerStarted","Data":"64fe5da6d736b9828e0e05d0d7eb077e0bfe32ca73c6d829956f07919cff7706"}
Nov 22 09:10:45 crc kubenswrapper[4743]: I1122 09:10:45.246204 4743 generic.go:334] "Generic (PLEG): container finished" podID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerID="2d17258169dc677f2bd819523c05bd4f5df513a1c303c1d7c9318a7424e9fbf4" exitCode=0
Nov 22 09:10:45 crc kubenswrapper[4743]: I1122 09:10:45.246299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2wm2" event={"ID":"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4","Type":"ContainerDied","Data":"2d17258169dc677f2bd819523c05bd4f5df513a1c303c1d7c9318a7424e9fbf4"}
Nov 22 09:10:46 crc kubenswrapper[4743]: I1122 09:10:46.257878 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2wm2" event={"ID":"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4","Type":"ContainerStarted","Data":"fefe95ccbf254e9585c9cb409a741b46c2c4d5e859f952a4a6cb0269f94828a4"}
Nov 22 09:10:47 crc kubenswrapper[4743]: I1122 09:10:47.267675 4743 generic.go:334] "Generic (PLEG): container finished" podID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerID="fefe95ccbf254e9585c9cb409a741b46c2c4d5e859f952a4a6cb0269f94828a4" exitCode=0
Nov 22 09:10:47 crc kubenswrapper[4743]: I1122 09:10:47.267776 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2wm2" event={"ID":"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4","Type":"ContainerDied","Data":"fefe95ccbf254e9585c9cb409a741b46c2c4d5e859f952a4a6cb0269f94828a4"}
Nov 22 09:10:48 crc kubenswrapper[4743]: I1122 09:10:48.279653 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2wm2" event={"ID":"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4","Type":"ContainerStarted","Data":"9b210974753794a6c0319fe1a3007133e3367dd0a3cfb023681ae0bb3f9ef933"}
Nov 22 09:10:48 crc kubenswrapper[4743]: I1122 09:10:48.308108 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t2wm2" podStartSLOduration=2.830508913 podStartE2EDuration="5.308077118s" podCreationTimestamp="2025-11-22 09:10:43 +0000 UTC" firstStartedPulling="2025-11-22 09:10:45.248177345 +0000 UTC m=+2918.954538427" lastFinishedPulling="2025-11-22 09:10:47.72574558 +0000 UTC m=+2921.432106632" observedRunningTime="2025-11-22 09:10:48.300652694 +0000 UTC m=+2922.007013766" watchObservedRunningTime="2025-11-22 09:10:48.308077118 +0000 UTC m=+2922.014438190"
Nov 22 09:10:53 crc kubenswrapper[4743]: I1122 09:10:53.601817 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t2wm2"
Nov 22 09:10:53 crc kubenswrapper[4743]: I1122 09:10:53.602283 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t2wm2"
Nov 22 09:10:53 crc kubenswrapper[4743]: I1122 09:10:53.642139 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t2wm2"
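The startup-duration entry above is internally consistent: podStartSLOduration is podStartE2EDuration (watchObservedRunningTime minus podCreationTimestamp) minus the image-pull window, with the pull window taken from the monotonic (m=+...) readings. Checking the arithmetic with the values logged for redhat-operators-t2wm2:

```go
// slo_arith.go - a worked check of the "Observed pod startup duration" entry;
// all constants are copied from the log line above.
package main

import "fmt"

func main() {
	e2e := 5.308077118                      // watchObservedRunningTime - podCreationTimestamp (s)
	pull := 2921.432106632 - 2918.954538427 // lastFinishedPulling - firstStartedPulling, monotonic (s)
	fmt.Printf("podStartSLOduration = %.9f\n", e2e-pull) // prints 2.830508913, matching the log
}
```

The same identity holds for certified-operators-7w7h9 earlier (7.353480616 - 4.434662001 = 2.918818615), so the SLO figure is simply startup time with image pulling excluded.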
Nov 22 09:10:54 crc kubenswrapper[4743]: I1122 09:10:54.363191 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t2wm2"
Nov 22 09:10:54 crc kubenswrapper[4743]: I1122 09:10:54.402053 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2wm2"]
Nov 22 09:10:56 crc kubenswrapper[4743]: I1122 09:10:56.331534 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t2wm2" podUID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerName="registry-server" containerID="cri-o://9b210974753794a6c0319fe1a3007133e3367dd0a3cfb023681ae0bb3f9ef933" gracePeriod=2
Nov 22 09:10:57 crc kubenswrapper[4743]: I1122 09:10:57.340236 4743 generic.go:334] "Generic (PLEG): container finished" podID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerID="9b210974753794a6c0319fe1a3007133e3367dd0a3cfb023681ae0bb3f9ef933" exitCode=0
Nov 22 09:10:57 crc kubenswrapper[4743]: I1122 09:10:57.340273 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2wm2" event={"ID":"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4","Type":"ContainerDied","Data":"9b210974753794a6c0319fe1a3007133e3367dd0a3cfb023681ae0bb3f9ef933"}
Nov 22 09:10:57 crc kubenswrapper[4743]: I1122 09:10:57.766160 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2wm2"
Nov 22 09:10:57 crc kubenswrapper[4743]: I1122 09:10:57.834374 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-utilities\") pod \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") "
Nov 22 09:10:57 crc kubenswrapper[4743]: I1122 09:10:57.834555 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jhbv\" (UniqueName: \"kubernetes.io/projected/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-kube-api-access-6jhbv\") pod \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") "
Nov 22 09:10:57 crc kubenswrapper[4743]: I1122 09:10:57.834678 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-catalog-content\") pod \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\" (UID: \"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4\") "
Nov 22 09:10:57 crc kubenswrapper[4743]: I1122 09:10:57.835508 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-utilities" (OuterVolumeSpecName: "utilities") pod "909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" (UID: "909d28f4-6ee9-4b7a-b821-f1d5f49a67a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:10:57 crc kubenswrapper[4743]: I1122 09:10:57.840179 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-kube-api-access-6jhbv" (OuterVolumeSpecName: "kube-api-access-6jhbv") pod "909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" (UID: "909d28f4-6ee9-4b7a-b821-f1d5f49a67a4"). InnerVolumeSpecName "kube-api-access-6jhbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:10:57 crc kubenswrapper[4743]: I1122 09:10:57.936723 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 09:10:57 crc kubenswrapper[4743]: I1122 09:10:57.936786 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jhbv\" (UniqueName: \"kubernetes.io/projected/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-kube-api-access-6jhbv\") on node \"crc\" DevicePath \"\""
Nov 22 09:10:58 crc kubenswrapper[4743]: I1122 09:10:58.349973 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2wm2" event={"ID":"909d28f4-6ee9-4b7a-b821-f1d5f49a67a4","Type":"ContainerDied","Data":"64fe5da6d736b9828e0e05d0d7eb077e0bfe32ca73c6d829956f07919cff7706"}
Nov 22 09:10:58 crc kubenswrapper[4743]: I1122 09:10:58.350028 4743 scope.go:117] "RemoveContainer" containerID="9b210974753794a6c0319fe1a3007133e3367dd0a3cfb023681ae0bb3f9ef933"
Nov 22 09:10:58 crc kubenswrapper[4743]: I1122 09:10:58.350043 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2wm2"
Nov 22 09:10:58 crc kubenswrapper[4743]: I1122 09:10:58.366275 4743 scope.go:117] "RemoveContainer" containerID="fefe95ccbf254e9585c9cb409a741b46c2c4d5e859f952a4a6cb0269f94828a4"
Nov 22 09:10:58 crc kubenswrapper[4743]: I1122 09:10:58.385013 4743 scope.go:117] "RemoveContainer" containerID="2d17258169dc677f2bd819523c05bd4f5df513a1c303c1d7c9318a7424e9fbf4"
Nov 22 09:10:59 crc kubenswrapper[4743]: I1122 09:10:59.127359 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" (UID: "909d28f4-6ee9-4b7a-b821-f1d5f49a67a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:10:59 crc kubenswrapper[4743]: I1122 09:10:59.151998 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:10:59 crc kubenswrapper[4743]: I1122 09:10:59.270712 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2wm2"] Nov 22 09:10:59 crc kubenswrapper[4743]: I1122 09:10:59.275154 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t2wm2"] Nov 22 09:11:01 crc kubenswrapper[4743]: I1122 09:11:01.160260 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" path="/var/lib/kubelet/pods/909d28f4-6ee9-4b7a-b821-f1d5f49a67a4/volumes" Nov 22 09:11:31 crc kubenswrapper[4743]: I1122 09:11:31.241666 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:11:31 crc kubenswrapper[4743]: I1122 09:11:31.242214 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:12:01 crc kubenswrapper[4743]: I1122 09:12:01.240778 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:12:01 crc kubenswrapper[4743]: I1122 09:12:01.241231 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:12:31 crc kubenswrapper[4743]: I1122 09:12:31.241639 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:12:31 crc kubenswrapper[4743]: I1122 09:12:31.242267 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:12:31 crc kubenswrapper[4743]: I1122 09:12:31.242313 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 09:12:31 crc kubenswrapper[4743]: I1122 09:12:31.242885 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:12:31 crc kubenswrapper[4743]: I1122 09:12:31.242942 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" gracePeriod=600 Nov 22 09:12:31 crc kubenswrapper[4743]: E1122 09:12:31.386954 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:12:32 crc kubenswrapper[4743]: I1122 09:12:32.071644 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" exitCode=0 Nov 22 09:12:32 crc kubenswrapper[4743]: I1122 09:12:32.071689 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb"} Nov 22 09:12:32 crc kubenswrapper[4743]: I1122 09:12:32.071727 4743 scope.go:117] "RemoveContainer" containerID="8a60e1ad22fedf2a9d76ed3d5822dc4816ba557217c34494b2f1ff2a5f966e0a" Nov 22 09:12:32 crc kubenswrapper[4743]: I1122 09:12:32.072158 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:12:32 crc kubenswrapper[4743]: E1122 09:12:32.072363 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:12:46 crc kubenswrapper[4743]: I1122 09:12:46.151364 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:12:46 crc kubenswrapper[4743]: E1122 09:12:46.152133 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:13:01 crc kubenswrapper[4743]: I1122 09:13:01.152031 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:13:01 crc kubenswrapper[4743]: E1122 09:13:01.152738 4743 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:13:15 crc kubenswrapper[4743]: I1122 09:13:15.152186 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:13:15 crc kubenswrapper[4743]: E1122 09:13:15.152969 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:13:29 crc kubenswrapper[4743]: I1122 09:13:29.152200 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:13:29 crc kubenswrapper[4743]: E1122 09:13:29.152864 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:13:44 crc kubenswrapper[4743]: I1122 09:13:44.152611 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:13:44 crc kubenswrapper[4743]: E1122 09:13:44.153386 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:13:58 crc kubenswrapper[4743]: I1122 09:13:58.151345 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:13:58 crc kubenswrapper[4743]: E1122 09:13:58.152206 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:14:09 crc kubenswrapper[4743]: I1122 09:14:09.151643 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:14:09 crc kubenswrapper[4743]: E1122 09:14:09.152456 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Nov 22 09:14:09 crc kubenswrapper[4743]: E1122 09:14:09.152456 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:14:22 crc kubenswrapper[4743]: I1122 09:14:22.151863 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb"
Nov 22 09:14:22 crc kubenswrapper[4743]: E1122 09:14:22.152511 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:14:35 crc kubenswrapper[4743]: I1122 09:14:35.151447 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb"
Nov 22 09:14:35 crc kubenswrapper[4743]: E1122 09:14:35.152353 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:14:49 crc kubenswrapper[4743]: I1122 09:14:49.150937 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb"
Nov 22 09:14:49 crc kubenswrapper[4743]: E1122 09:14:49.151669 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.168122 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"]
Nov 22 09:15:00 crc kubenswrapper[4743]: E1122 09:15:00.168775 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerName="extract-utilities"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.168791 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerName="extract-utilities"
Nov 22 09:15:00 crc kubenswrapper[4743]: E1122 09:15:00.168812 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerName="registry-server"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.168820 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerName="registry-server"
Nov 22 09:15:00 crc kubenswrapper[4743]: E1122 09:15:00.168852 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerName="extract-content"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.168863 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerName="extract-content"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.169023 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="909d28f4-6ee9-4b7a-b821-f1d5f49a67a4" containerName="registry-server"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.169773 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.171976 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.173164 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.181457 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"]
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.181820 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa98797-b1ed-4fbd-9168-3cc290092457-config-volume\") pod \"collect-profiles-29396715-ntpmc\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.182082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa98797-b1ed-4fbd-9168-3cc290092457-secret-volume\") pod \"collect-profiles-29396715-ntpmc\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.182142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z5w2\" (UniqueName: \"kubernetes.io/projected/1aa98797-b1ed-4fbd-9168-3cc290092457-kube-api-access-7z5w2\") pod \"collect-profiles-29396715-ntpmc\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.283374 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa98797-b1ed-4fbd-9168-3cc290092457-secret-volume\") pod \"collect-profiles-29396715-ntpmc\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.283689 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z5w2\" (UniqueName: \"kubernetes.io/projected/1aa98797-b1ed-4fbd-9168-3cc290092457-kube-api-access-7z5w2\") pod \"collect-profiles-29396715-ntpmc\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.283741 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa98797-b1ed-4fbd-9168-3cc290092457-config-volume\") pod \"collect-profiles-29396715-ntpmc\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.284767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa98797-b1ed-4fbd-9168-3cc290092457-config-volume\") pod \"collect-profiles-29396715-ntpmc\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.296385 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa98797-b1ed-4fbd-9168-3cc290092457-secret-volume\") pod \"collect-profiles-29396715-ntpmc\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.304681 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z5w2\" (UniqueName: \"kubernetes.io/projected/1aa98797-b1ed-4fbd-9168-3cc290092457-kube-api-access-7z5w2\") pod \"collect-profiles-29396715-ntpmc\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.499515 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:00 crc kubenswrapper[4743]: I1122 09:15:00.960866 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"]
Nov 22 09:15:01 crc kubenswrapper[4743]: I1122 09:15:01.705660 4743 generic.go:334] "Generic (PLEG): container finished" podID="1aa98797-b1ed-4fbd-9168-3cc290092457" containerID="baa90a54d8dc526a42ac9b3ab529b3f056512058a59e155799a246a1c012e83d" exitCode=0
Nov 22 09:15:01 crc kubenswrapper[4743]: I1122 09:15:01.705743 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc" event={"ID":"1aa98797-b1ed-4fbd-9168-3cc290092457","Type":"ContainerDied","Data":"baa90a54d8dc526a42ac9b3ab529b3f056512058a59e155799a246a1c012e83d"}
Nov 22 09:15:01 crc kubenswrapper[4743]: I1122 09:15:01.708787 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc" event={"ID":"1aa98797-b1ed-4fbd-9168-3cc290092457","Type":"ContainerStarted","Data":"3c91292540a8b44dd52ba583bb0019adca9f885c4e603a57d6e7dac147754ae5"}
Nov 22 09:15:02 crc kubenswrapper[4743]: I1122 09:15:02.152903 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb"
Nov 22 09:15:02 crc kubenswrapper[4743]: E1122 09:15:02.153199 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.030542 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"
Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.227305 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa98797-b1ed-4fbd-9168-3cc290092457-secret-volume\") pod \"1aa98797-b1ed-4fbd-9168-3cc290092457\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") "
Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.227410 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa98797-b1ed-4fbd-9168-3cc290092457-config-volume\") pod \"1aa98797-b1ed-4fbd-9168-3cc290092457\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") "
Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.227549 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z5w2\" (UniqueName: \"kubernetes.io/projected/1aa98797-b1ed-4fbd-9168-3cc290092457-kube-api-access-7z5w2\") pod \"1aa98797-b1ed-4fbd-9168-3cc290092457\" (UID: \"1aa98797-b1ed-4fbd-9168-3cc290092457\") "
Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.228116 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa98797-b1ed-4fbd-9168-3cc290092457-config-volume" (OuterVolumeSpecName: "config-volume") pod "1aa98797-b1ed-4fbd-9168-3cc290092457" (UID: "1aa98797-b1ed-4fbd-9168-3cc290092457"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.232595 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa98797-b1ed-4fbd-9168-3cc290092457-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1aa98797-b1ed-4fbd-9168-3cc290092457" (UID: "1aa98797-b1ed-4fbd-9168-3cc290092457"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.232620 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa98797-b1ed-4fbd-9168-3cc290092457-kube-api-access-7z5w2" (OuterVolumeSpecName: "kube-api-access-7z5w2") pod "1aa98797-b1ed-4fbd-9168-3cc290092457" (UID: "1aa98797-b1ed-4fbd-9168-3cc290092457"). InnerVolumeSpecName "kube-api-access-7z5w2".
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.328920 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z5w2\" (UniqueName: \"kubernetes.io/projected/1aa98797-b1ed-4fbd-9168-3cc290092457-kube-api-access-7z5w2\") on node \"crc\" DevicePath \"\"" Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.329416 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1aa98797-b1ed-4fbd-9168-3cc290092457-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.329594 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1aa98797-b1ed-4fbd-9168-3cc290092457-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.719741 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc" event={"ID":"1aa98797-b1ed-4fbd-9168-3cc290092457","Type":"ContainerDied","Data":"3c91292540a8b44dd52ba583bb0019adca9f885c4e603a57d6e7dac147754ae5"} Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.719780 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c91292540a8b44dd52ba583bb0019adca9f885c4e603a57d6e7dac147754ae5" Nov 22 09:15:03 crc kubenswrapper[4743]: I1122 09:15:03.720053 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc" Nov 22 09:15:04 crc kubenswrapper[4743]: I1122 09:15:04.099094 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn"] Nov 22 09:15:04 crc kubenswrapper[4743]: I1122 09:15:04.104086 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396670-j4fcn"] Nov 22 09:15:05 crc kubenswrapper[4743]: I1122 09:15:05.163547 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d19b2aa7-4ff4-470f-a036-2202acdf6490" path="/var/lib/kubelet/pods/d19b2aa7-4ff4-470f-a036-2202acdf6490/volumes" Nov 22 09:15:16 crc kubenswrapper[4743]: I1122 09:15:16.152671 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:15:16 crc kubenswrapper[4743]: E1122 09:15:16.154388 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:15:30 crc kubenswrapper[4743]: I1122 09:15:30.151012 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:15:30 crc kubenswrapper[4743]: E1122 09:15:30.151659 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:15:30 crc kubenswrapper[4743]: I1122 09:15:30.995395 4743 scope.go:117] "RemoveContainer" containerID="f723828d6bb238fc9d6df76d84f5a933026851e552283c66450ddd2027542eee" Nov 22 09:15:31 crc kubenswrapper[4743]: I1122 09:15:31.021024 4743 scope.go:117] "RemoveContainer" containerID="30adcb1258897299915a7492ab1ad2d885ac9371cff6214728f84b40efdd2737" Nov 22 09:15:31 crc kubenswrapper[4743]: I1122 09:15:31.042339 4743 scope.go:117] "RemoveContainer" containerID="8b539228e577d032f81498c065bc69bdcd50279e13b9d83ee584daec8475b55a" Nov 22 09:15:31 crc kubenswrapper[4743]: I1122 09:15:31.067331 4743 scope.go:117] "RemoveContainer" containerID="fe3405060f9fc695b2225f8db89f9303d73c88c1ad1a9061db8c27b7aeba98d4" Nov 22 09:15:44 crc kubenswrapper[4743]: I1122 09:15:44.151936 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:15:44 crc kubenswrapper[4743]: E1122 09:15:44.153702 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:15:58 crc kubenswrapper[4743]: I1122 09:15:58.151400 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:15:58 crc kubenswrapper[4743]: E1122 09:15:58.151941 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:16:13 crc kubenswrapper[4743]: I1122 09:16:13.152256 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:16:13 crc kubenswrapper[4743]: E1122 09:16:13.153081 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:16:24 crc kubenswrapper[4743]: I1122 09:16:24.778589 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pmf"] Nov 22 09:16:24 crc kubenswrapper[4743]: E1122 09:16:24.780786 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa98797-b1ed-4fbd-9168-3cc290092457" containerName="collect-profiles" Nov 22 09:16:24 crc kubenswrapper[4743]: I1122 09:16:24.780808 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa98797-b1ed-4fbd-9168-3cc290092457" containerName="collect-profiles" Nov 22 09:16:24 crc kubenswrapper[4743]: I1122 09:16:24.780995 4743 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1aa98797-b1ed-4fbd-9168-3cc290092457" containerName="collect-profiles" Nov 22 09:16:24 crc kubenswrapper[4743]: I1122 09:16:24.782434 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:24 crc kubenswrapper[4743]: I1122 09:16:24.793074 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pmf"] Nov 22 09:16:24 crc kubenswrapper[4743]: I1122 09:16:24.900319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn5fg\" (UniqueName: \"kubernetes.io/projected/9bcebc40-cc52-411d-9962-c4e264473937-kube-api-access-gn5fg\") pod \"redhat-marketplace-x2pmf\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:24 crc kubenswrapper[4743]: I1122 09:16:24.900407 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-utilities\") pod \"redhat-marketplace-x2pmf\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:24 crc kubenswrapper[4743]: I1122 09:16:24.900525 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-catalog-content\") pod \"redhat-marketplace-x2pmf\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:25 crc kubenswrapper[4743]: I1122 09:16:25.002001 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn5fg\" (UniqueName: \"kubernetes.io/projected/9bcebc40-cc52-411d-9962-c4e264473937-kube-api-access-gn5fg\") pod \"redhat-marketplace-x2pmf\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:25 crc kubenswrapper[4743]: I1122 09:16:25.002065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-utilities\") pod \"redhat-marketplace-x2pmf\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:25 crc kubenswrapper[4743]: I1122 09:16:25.002087 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-catalog-content\") pod \"redhat-marketplace-x2pmf\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:25 crc kubenswrapper[4743]: I1122 09:16:25.002700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-catalog-content\") pod \"redhat-marketplace-x2pmf\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:25 crc kubenswrapper[4743]: I1122 09:16:25.002839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-utilities\") pod \"redhat-marketplace-x2pmf\" (UID: 
\"9bcebc40-cc52-411d-9962-c4e264473937\") " pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:25 crc kubenswrapper[4743]: I1122 09:16:25.027350 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn5fg\" (UniqueName: \"kubernetes.io/projected/9bcebc40-cc52-411d-9962-c4e264473937-kube-api-access-gn5fg\") pod \"redhat-marketplace-x2pmf\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:25 crc kubenswrapper[4743]: I1122 09:16:25.108540 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:25 crc kubenswrapper[4743]: I1122 09:16:25.559550 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pmf"] Nov 22 09:16:26 crc kubenswrapper[4743]: I1122 09:16:26.330700 4743 generic.go:334] "Generic (PLEG): container finished" podID="9bcebc40-cc52-411d-9962-c4e264473937" containerID="e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20" exitCode=0 Nov 22 09:16:26 crc kubenswrapper[4743]: I1122 09:16:26.330747 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pmf" event={"ID":"9bcebc40-cc52-411d-9962-c4e264473937","Type":"ContainerDied","Data":"e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20"} Nov 22 09:16:26 crc kubenswrapper[4743]: I1122 09:16:26.333969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pmf" event={"ID":"9bcebc40-cc52-411d-9962-c4e264473937","Type":"ContainerStarted","Data":"2763ca018a802fb7960eed2bbadca4da130307948692c3ad55e337a4a3103404"} Nov 22 09:16:26 crc kubenswrapper[4743]: I1122 09:16:26.333138 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:16:27 crc kubenswrapper[4743]: I1122 09:16:27.159569 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:16:27 crc kubenswrapper[4743]: E1122 09:16:27.160257 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:16:27 crc kubenswrapper[4743]: I1122 09:16:27.344803 4743 generic.go:334] "Generic (PLEG): container finished" podID="9bcebc40-cc52-411d-9962-c4e264473937" containerID="a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538" exitCode=0 Nov 22 09:16:27 crc kubenswrapper[4743]: I1122 09:16:27.344849 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pmf" event={"ID":"9bcebc40-cc52-411d-9962-c4e264473937","Type":"ContainerDied","Data":"a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538"} Nov 22 09:16:28 crc kubenswrapper[4743]: I1122 09:16:28.357706 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pmf" event={"ID":"9bcebc40-cc52-411d-9962-c4e264473937","Type":"ContainerStarted","Data":"9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529"} Nov 22 09:16:28 crc kubenswrapper[4743]: I1122 
09:16:28.387623 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x2pmf" podStartSLOduration=2.537626206 podStartE2EDuration="4.387603137s" podCreationTimestamp="2025-11-22 09:16:24 +0000 UTC" firstStartedPulling="2025-11-22 09:16:26.332803198 +0000 UTC m=+3260.039164250" lastFinishedPulling="2025-11-22 09:16:28.182780129 +0000 UTC m=+3261.889141181" observedRunningTime="2025-11-22 09:16:28.380244675 +0000 UTC m=+3262.086605717" watchObservedRunningTime="2025-11-22 09:16:28.387603137 +0000 UTC m=+3262.093964219" Nov 22 09:16:35 crc kubenswrapper[4743]: I1122 09:16:35.109818 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:35 crc kubenswrapper[4743]: I1122 09:16:35.110356 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:35 crc kubenswrapper[4743]: I1122 09:16:35.148413 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:35 crc kubenswrapper[4743]: I1122 09:16:35.443245 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:35 crc kubenswrapper[4743]: I1122 09:16:35.483716 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pmf"] Nov 22 09:16:37 crc kubenswrapper[4743]: I1122 09:16:37.414228 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x2pmf" podUID="9bcebc40-cc52-411d-9962-c4e264473937" containerName="registry-server" containerID="cri-o://9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529" gracePeriod=2 Nov 22 09:16:37 crc kubenswrapper[4743]: I1122 09:16:37.812905 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:37 crc kubenswrapper[4743]: I1122 09:16:37.997861 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-catalog-content\") pod \"9bcebc40-cc52-411d-9962-c4e264473937\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " Nov 22 09:16:37 crc kubenswrapper[4743]: I1122 09:16:37.997965 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn5fg\" (UniqueName: \"kubernetes.io/projected/9bcebc40-cc52-411d-9962-c4e264473937-kube-api-access-gn5fg\") pod \"9bcebc40-cc52-411d-9962-c4e264473937\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " Nov 22 09:16:37 crc kubenswrapper[4743]: I1122 09:16:37.997996 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-utilities\") pod \"9bcebc40-cc52-411d-9962-c4e264473937\" (UID: \"9bcebc40-cc52-411d-9962-c4e264473937\") " Nov 22 09:16:37 crc kubenswrapper[4743]: I1122 09:16:37.999072 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-utilities" (OuterVolumeSpecName: "utilities") pod "9bcebc40-cc52-411d-9962-c4e264473937" (UID: "9bcebc40-cc52-411d-9962-c4e264473937"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.003995 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bcebc40-cc52-411d-9962-c4e264473937-kube-api-access-gn5fg" (OuterVolumeSpecName: "kube-api-access-gn5fg") pod "9bcebc40-cc52-411d-9962-c4e264473937" (UID: "9bcebc40-cc52-411d-9962-c4e264473937"). InnerVolumeSpecName "kube-api-access-gn5fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.018495 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bcebc40-cc52-411d-9962-c4e264473937" (UID: "9bcebc40-cc52-411d-9962-c4e264473937"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.099443 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.099495 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn5fg\" (UniqueName: \"kubernetes.io/projected/9bcebc40-cc52-411d-9962-c4e264473937-kube-api-access-gn5fg\") on node \"crc\" DevicePath \"\"" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.099515 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bcebc40-cc52-411d-9962-c4e264473937-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.422096 4743 generic.go:334] "Generic (PLEG): container finished" podID="9bcebc40-cc52-411d-9962-c4e264473937" containerID="9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529" exitCode=0 Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.422145 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x2pmf" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.422146 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pmf" event={"ID":"9bcebc40-cc52-411d-9962-c4e264473937","Type":"ContainerDied","Data":"9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529"} Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.422494 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x2pmf" event={"ID":"9bcebc40-cc52-411d-9962-c4e264473937","Type":"ContainerDied","Data":"2763ca018a802fb7960eed2bbadca4da130307948692c3ad55e337a4a3103404"} Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.422522 4743 scope.go:117] "RemoveContainer" containerID="9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.448939 4743 scope.go:117] "RemoveContainer" containerID="a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.459565 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pmf"] Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.465092 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x2pmf"] Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.471408 4743 scope.go:117] "RemoveContainer" containerID="e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.492556 4743 scope.go:117] "RemoveContainer" containerID="9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529" Nov 22 09:16:38 crc kubenswrapper[4743]: E1122 09:16:38.493097 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529\": container with ID starting with 9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529 not found: ID does not exist" containerID="9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.493197 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529"} err="failed to get container status \"9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529\": rpc error: code = NotFound desc = could not find container \"9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529\": container with ID starting with 9973597c973cbd3d1c8ba48fa3aa803997a8566767c3a82edd256708e9267529 not found: ID does not exist" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.493226 4743 scope.go:117] "RemoveContainer" containerID="a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538" Nov 22 09:16:38 crc kubenswrapper[4743]: E1122 09:16:38.493608 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538\": container with ID starting with a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538 not found: ID does not exist" containerID="a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.493648 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538"} err="failed to get container status \"a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538\": rpc error: code = NotFound desc = could not find container \"a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538\": container with ID starting with a86c7074afcbb6d726c3655da3559974ffe4e1f8fd9c8aac221d69194d1d3538 not found: ID does not exist" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.493674 4743 scope.go:117] "RemoveContainer" containerID="e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20" Nov 22 09:16:38 crc kubenswrapper[4743]: E1122 09:16:38.493952 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20\": container with ID starting with e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20 not found: ID does not exist" containerID="e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20" Nov 22 09:16:38 crc kubenswrapper[4743]: I1122 09:16:38.493995 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20"} err="failed to get container status \"e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20\": rpc error: code = NotFound desc = could not find container \"e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20\": container with ID starting with e8917a23e9009c9e4c9c6af2d0c1794315f6ade5fe814b900076f4e314385f20 not found: ID does not exist" Nov 22 09:16:39 crc kubenswrapper[4743]: I1122 09:16:39.161147 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bcebc40-cc52-411d-9962-c4e264473937" path="/var/lib/kubelet/pods/9bcebc40-cc52-411d-9962-c4e264473937/volumes" Nov 22 09:16:42 crc kubenswrapper[4743]: I1122 09:16:42.151873 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:16:42 crc kubenswrapper[4743]: E1122 09:16:42.152406 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:16:56 crc kubenswrapper[4743]: I1122 09:16:56.152162 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:16:56 crc kubenswrapper[4743]: E1122 09:16:56.153101 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:17:11 crc kubenswrapper[4743]: I1122 09:17:11.151893 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:17:11 crc 
kubenswrapper[4743]: E1122 09:17:11.152514 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:17:25 crc kubenswrapper[4743]: I1122 09:17:25.153773 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:17:25 crc kubenswrapper[4743]: E1122 09:17:25.156473 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:17:38 crc kubenswrapper[4743]: I1122 09:17:38.151092 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb" Nov 22 09:17:39 crc kubenswrapper[4743]: I1122 09:17:39.132923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"6b7794d1b1cbdcf57d318d1ebcfcef98bffe04b58b134a2714f9f1c535edc337"} Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.678182 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k5bqv"] Nov 22 09:17:59 crc kubenswrapper[4743]: E1122 09:17:59.679054 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcebc40-cc52-411d-9962-c4e264473937" containerName="extract-content" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.679070 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcebc40-cc52-411d-9962-c4e264473937" containerName="extract-content" Nov 22 09:17:59 crc kubenswrapper[4743]: E1122 09:17:59.679096 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcebc40-cc52-411d-9962-c4e264473937" containerName="extract-utilities" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.679105 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcebc40-cc52-411d-9962-c4e264473937" containerName="extract-utilities" Nov 22 09:17:59 crc kubenswrapper[4743]: E1122 09:17:59.679131 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bcebc40-cc52-411d-9962-c4e264473937" containerName="registry-server" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.679141 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bcebc40-cc52-411d-9962-c4e264473937" containerName="registry-server" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.679301 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bcebc40-cc52-411d-9962-c4e264473937" containerName="registry-server" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.680557 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.685369 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k5bqv"] Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.714989 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-utilities\") pod \"community-operators-k5bqv\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.715063 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4hkm\" (UniqueName: \"kubernetes.io/projected/8b88f9c8-7354-4672-be42-3c80d0b0386a-kube-api-access-g4hkm\") pod \"community-operators-k5bqv\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.715120 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-catalog-content\") pod \"community-operators-k5bqv\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.815707 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-catalog-content\") pod \"community-operators-k5bqv\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.816023 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-utilities\") pod \"community-operators-k5bqv\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.816133 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4hkm\" (UniqueName: \"kubernetes.io/projected/8b88f9c8-7354-4672-be42-3c80d0b0386a-kube-api-access-g4hkm\") pod \"community-operators-k5bqv\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.816304 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-catalog-content\") pod \"community-operators-k5bqv\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.816316 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-utilities\") pod \"community-operators-k5bqv\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:17:59 crc kubenswrapper[4743]: I1122 09:17:59.839536 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g4hkm\" (UniqueName: \"kubernetes.io/projected/8b88f9c8-7354-4672-be42-3c80d0b0386a-kube-api-access-g4hkm\") pod \"community-operators-k5bqv\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:18:00 crc kubenswrapper[4743]: I1122 09:18:00.015808 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:18:00 crc kubenswrapper[4743]: I1122 09:18:00.481888 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k5bqv"] Nov 22 09:18:01 crc kubenswrapper[4743]: I1122 09:18:01.276081 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b88f9c8-7354-4672-be42-3c80d0b0386a" containerID="0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9" exitCode=0 Nov 22 09:18:01 crc kubenswrapper[4743]: I1122 09:18:01.276161 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5bqv" event={"ID":"8b88f9c8-7354-4672-be42-3c80d0b0386a","Type":"ContainerDied","Data":"0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9"} Nov 22 09:18:01 crc kubenswrapper[4743]: I1122 09:18:01.276459 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5bqv" event={"ID":"8b88f9c8-7354-4672-be42-3c80d0b0386a","Type":"ContainerStarted","Data":"905ce70c79c33e01fdd14de53ca085c6699cee204f448d6a470f4522945449ab"} Nov 22 09:18:02 crc kubenswrapper[4743]: I1122 09:18:02.286718 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b88f9c8-7354-4672-be42-3c80d0b0386a" containerID="cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460" exitCode=0 Nov 22 09:18:02 crc kubenswrapper[4743]: I1122 09:18:02.286940 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5bqv" event={"ID":"8b88f9c8-7354-4672-be42-3c80d0b0386a","Type":"ContainerDied","Data":"cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460"} Nov 22 09:18:03 crc kubenswrapper[4743]: I1122 09:18:03.299432 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5bqv" event={"ID":"8b88f9c8-7354-4672-be42-3c80d0b0386a","Type":"ContainerStarted","Data":"62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b"} Nov 22 09:18:03 crc kubenswrapper[4743]: I1122 09:18:03.320166 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k5bqv" podStartSLOduration=2.913136895 podStartE2EDuration="4.320147045s" podCreationTimestamp="2025-11-22 09:17:59 +0000 UTC" firstStartedPulling="2025-11-22 09:18:01.277798126 +0000 UTC m=+3354.984159178" lastFinishedPulling="2025-11-22 09:18:02.684808276 +0000 UTC m=+3356.391169328" observedRunningTime="2025-11-22 09:18:03.318258301 +0000 UTC m=+3357.024619363" watchObservedRunningTime="2025-11-22 09:18:03.320147045 +0000 UTC m=+3357.026508097" Nov 22 09:18:10 crc kubenswrapper[4743]: I1122 09:18:10.016149 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:18:10 crc kubenswrapper[4743]: I1122 09:18:10.016722 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:18:10 crc kubenswrapper[4743]: I1122 09:18:10.068627 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:18:10 crc kubenswrapper[4743]: I1122 09:18:10.391803 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:18:10 crc kubenswrapper[4743]: I1122 09:18:10.443087 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k5bqv"] Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.359849 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k5bqv" podUID="8b88f9c8-7354-4672-be42-3c80d0b0386a" containerName="registry-server" containerID="cri-o://62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b" gracePeriod=2 Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.737468 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.892722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-utilities\") pod \"8b88f9c8-7354-4672-be42-3c80d0b0386a\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.892856 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4hkm\" (UniqueName: \"kubernetes.io/projected/8b88f9c8-7354-4672-be42-3c80d0b0386a-kube-api-access-g4hkm\") pod \"8b88f9c8-7354-4672-be42-3c80d0b0386a\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.892894 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-catalog-content\") pod \"8b88f9c8-7354-4672-be42-3c80d0b0386a\" (UID: \"8b88f9c8-7354-4672-be42-3c80d0b0386a\") " Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.893705 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-utilities" (OuterVolumeSpecName: "utilities") pod "8b88f9c8-7354-4672-be42-3c80d0b0386a" (UID: "8b88f9c8-7354-4672-be42-3c80d0b0386a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.898486 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b88f9c8-7354-4672-be42-3c80d0b0386a-kube-api-access-g4hkm" (OuterVolumeSpecName: "kube-api-access-g4hkm") pod "8b88f9c8-7354-4672-be42-3c80d0b0386a" (UID: "8b88f9c8-7354-4672-be42-3c80d0b0386a"). InnerVolumeSpecName "kube-api-access-g4hkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.951412 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b88f9c8-7354-4672-be42-3c80d0b0386a" (UID: "8b88f9c8-7354-4672-be42-3c80d0b0386a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.994097 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4hkm\" (UniqueName: \"kubernetes.io/projected/8b88f9c8-7354-4672-be42-3c80d0b0386a-kube-api-access-g4hkm\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.994131 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:12 crc kubenswrapper[4743]: I1122 09:18:12.994140 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b88f9c8-7354-4672-be42-3c80d0b0386a-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.366917 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b88f9c8-7354-4672-be42-3c80d0b0386a" containerID="62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b" exitCode=0 Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.366962 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5bqv" Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.366988 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5bqv" event={"ID":"8b88f9c8-7354-4672-be42-3c80d0b0386a","Type":"ContainerDied","Data":"62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b"} Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.367746 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5bqv" event={"ID":"8b88f9c8-7354-4672-be42-3c80d0b0386a","Type":"ContainerDied","Data":"905ce70c79c33e01fdd14de53ca085c6699cee204f448d6a470f4522945449ab"} Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.367765 4743 scope.go:117] "RemoveContainer" containerID="62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b" Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.392620 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k5bqv"] Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.397457 4743 scope.go:117] "RemoveContainer" containerID="cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460" Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.397600 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k5bqv"] Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.415201 4743 scope.go:117] "RemoveContainer" containerID="0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9" Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.435138 4743 scope.go:117] "RemoveContainer" containerID="62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b" Nov 22 09:18:13 crc kubenswrapper[4743]: E1122 09:18:13.435565 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b\": container with ID starting with 62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b not found: ID does not exist" containerID="62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b" Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.435615 
4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b"} err="failed to get container status \"62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b\": rpc error: code = NotFound desc = could not find container \"62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b\": container with ID starting with 62be8a76a5d1f30b42cbb35e6dcdc9bf7d8a998dff4255f73be281a18028058b not found: ID does not exist"
Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.435644 4743 scope.go:117] "RemoveContainer" containerID="cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460"
Nov 22 09:18:13 crc kubenswrapper[4743]: E1122 09:18:13.435871 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460\": container with ID starting with cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460 not found: ID does not exist" containerID="cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460"
Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.435900 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460"} err="failed to get container status \"cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460\": rpc error: code = NotFound desc = could not find container \"cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460\": container with ID starting with cea2e3bcd2b257c04dfb941c26de04ec0a809c041c5754c42b7a0f749c645460 not found: ID does not exist"
Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.435917 4743 scope.go:117] "RemoveContainer" containerID="0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9"
Nov 22 09:18:13 crc kubenswrapper[4743]: E1122 09:18:13.436488 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9\": container with ID starting with 0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9 not found: ID does not exist" containerID="0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9"
Nov 22 09:18:13 crc kubenswrapper[4743]: I1122 09:18:13.436514 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9"} err="failed to get container status \"0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9\": rpc error: code = NotFound desc = could not find container \"0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9\": container with ID starting with 0721cff23a0c8eea20fc184e4444ac1a4c1ff620f65976fa8b7089a1ac3020c9 not found: ID does not exist"
Nov 22 09:18:15 crc kubenswrapper[4743]: I1122 09:18:15.160809 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b88f9c8-7354-4672-be42-3c80d0b0386a" path="/var/lib/kubelet/pods/8b88f9c8-7354-4672-be42-3c80d0b0386a/volumes"
Nov 22 09:20:01 crc kubenswrapper[4743]: I1122 09:20:01.241377 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 09:20:01 crc kubenswrapper[4743]: I1122 09:20:01.242336 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:20:31 crc kubenswrapper[4743]: I1122 09:20:31.241620 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 09:20:31 crc kubenswrapper[4743]: I1122 09:20:31.242226 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:21:01 crc kubenswrapper[4743]: I1122 09:21:01.241072 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 09:21:01 crc kubenswrapper[4743]: I1122 09:21:01.241750 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:21:01 crc kubenswrapper[4743]: I1122 09:21:01.241878 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p"
Nov 22 09:21:01 crc kubenswrapper[4743]: I1122 09:21:01.242370 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b7794d1b1cbdcf57d318d1ebcfcef98bffe04b58b134a2714f9f1c535edc337"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 09:21:01 crc kubenswrapper[4743]: I1122 09:21:01.242425 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://6b7794d1b1cbdcf57d318d1ebcfcef98bffe04b58b134a2714f9f1c535edc337" gracePeriod=600
Nov 22 09:21:01 crc kubenswrapper[4743]: I1122 09:21:01.655937 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="6b7794d1b1cbdcf57d318d1ebcfcef98bffe04b58b134a2714f9f1c535edc337" exitCode=0
Nov 22 09:21:01 crc kubenswrapper[4743]: I1122 09:21:01.655994 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"6b7794d1b1cbdcf57d318d1ebcfcef98bffe04b58b134a2714f9f1c535edc337"}
Nov 22 09:21:01 crc kubenswrapper[4743]: I1122 09:21:01.656340 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e"}
Nov 22 09:21:01 crc kubenswrapper[4743]: I1122 09:21:01.656363 4743 scope.go:117] "RemoveContainer" containerID="92f1bff18d7ea8f61576b8547493abaf479118a1e78dc7f3a391adb5aa2a32cb"
Need to start a new one" pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:27 crc kubenswrapper[4743]: I1122 09:21:27.806526 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sjnxb"] Nov 22 09:21:27 crc kubenswrapper[4743]: I1122 09:21:27.959281 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-catalog-content\") pod \"redhat-operators-sjnxb\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:27 crc kubenswrapper[4743]: I1122 09:21:27.959457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-utilities\") pod \"redhat-operators-sjnxb\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:27 crc kubenswrapper[4743]: I1122 09:21:27.959491 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hctbq\" (UniqueName: \"kubernetes.io/projected/e5a8359f-f4bf-4636-9bee-8e135c4664e9-kube-api-access-hctbq\") pod \"redhat-operators-sjnxb\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.060389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-utilities\") pod \"redhat-operators-sjnxb\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.060440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hctbq\" (UniqueName: \"kubernetes.io/projected/e5a8359f-f4bf-4636-9bee-8e135c4664e9-kube-api-access-hctbq\") pod \"redhat-operators-sjnxb\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.060473 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-catalog-content\") pod \"redhat-operators-sjnxb\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.061255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-utilities\") pod \"redhat-operators-sjnxb\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.061315 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-catalog-content\") pod \"redhat-operators-sjnxb\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.082389 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hctbq\" (UniqueName: \"kubernetes.io/projected/e5a8359f-f4bf-4636-9bee-8e135c4664e9-kube-api-access-hctbq\") pod \"redhat-operators-sjnxb\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.126387 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.651164 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sjnxb"] Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.885872 4743 generic.go:334] "Generic (PLEG): container finished" podID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerID="f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9" exitCode=0 Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.886061 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjnxb" event={"ID":"e5a8359f-f4bf-4636-9bee-8e135c4664e9","Type":"ContainerDied","Data":"f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9"} Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.887160 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjnxb" event={"ID":"e5a8359f-f4bf-4636-9bee-8e135c4664e9","Type":"ContainerStarted","Data":"7efb8b96d472e62023eb7da66f03d31e01361301f8467efee7bc875282199bf4"} Nov 22 09:21:28 crc kubenswrapper[4743]: I1122 09:21:28.888108 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:21:29 crc kubenswrapper[4743]: I1122 09:21:29.901359 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjnxb" event={"ID":"e5a8359f-f4bf-4636-9bee-8e135c4664e9","Type":"ContainerStarted","Data":"2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f"} Nov 22 09:21:30 crc kubenswrapper[4743]: I1122 09:21:30.927452 4743 generic.go:334] "Generic (PLEG): container finished" podID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerID="2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f" exitCode=0 Nov 22 09:21:30 crc kubenswrapper[4743]: I1122 09:21:30.927661 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjnxb" event={"ID":"e5a8359f-f4bf-4636-9bee-8e135c4664e9","Type":"ContainerDied","Data":"2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f"} Nov 22 09:21:31 crc kubenswrapper[4743]: I1122 09:21:31.937725 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjnxb" event={"ID":"e5a8359f-f4bf-4636-9bee-8e135c4664e9","Type":"ContainerStarted","Data":"3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4"} Nov 22 09:21:31 crc kubenswrapper[4743]: I1122 09:21:31.961793 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sjnxb" podStartSLOduration=2.480417843 podStartE2EDuration="4.961775758s" podCreationTimestamp="2025-11-22 09:21:27 +0000 UTC" firstStartedPulling="2025-11-22 09:21:28.887874374 +0000 UTC m=+3562.594235426" lastFinishedPulling="2025-11-22 09:21:31.369232299 +0000 UTC m=+3565.075593341" observedRunningTime="2025-11-22 09:21:31.957486684 +0000 UTC m=+3565.663847736" watchObservedRunningTime="2025-11-22 09:21:31.961775758 +0000 UTC m=+3565.668136810" Nov 22 09:21:38 crc 
kubenswrapper[4743]: I1122 09:21:38.127472 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:38 crc kubenswrapper[4743]: I1122 09:21:38.128266 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:38 crc kubenswrapper[4743]: I1122 09:21:38.193461 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:39 crc kubenswrapper[4743]: I1122 09:21:39.065162 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:39 crc kubenswrapper[4743]: I1122 09:21:39.121988 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sjnxb"] Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.015644 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sjnxb" podUID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerName="registry-server" containerID="cri-o://3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4" gracePeriod=2 Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.416788 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.484471 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hctbq\" (UniqueName: \"kubernetes.io/projected/e5a8359f-f4bf-4636-9bee-8e135c4664e9-kube-api-access-hctbq\") pod \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.484537 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-utilities\") pod \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.484563 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-catalog-content\") pod \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\" (UID: \"e5a8359f-f4bf-4636-9bee-8e135c4664e9\") " Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.485912 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-utilities" (OuterVolumeSpecName: "utilities") pod "e5a8359f-f4bf-4636-9bee-8e135c4664e9" (UID: "e5a8359f-f4bf-4636-9bee-8e135c4664e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.496853 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a8359f-f4bf-4636-9bee-8e135c4664e9-kube-api-access-hctbq" (OuterVolumeSpecName: "kube-api-access-hctbq") pod "e5a8359f-f4bf-4636-9bee-8e135c4664e9" (UID: "e5a8359f-f4bf-4636-9bee-8e135c4664e9"). InnerVolumeSpecName "kube-api-access-hctbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.586542 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hctbq\" (UniqueName: \"kubernetes.io/projected/e5a8359f-f4bf-4636-9bee-8e135c4664e9-kube-api-access-hctbq\") on node \"crc\" DevicePath \"\"" Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.586588 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.607186 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5a8359f-f4bf-4636-9bee-8e135c4664e9" (UID: "e5a8359f-f4bf-4636-9bee-8e135c4664e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:21:41 crc kubenswrapper[4743]: I1122 09:21:41.688170 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5a8359f-f4bf-4636-9bee-8e135c4664e9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.025749 4743 generic.go:334] "Generic (PLEG): container finished" podID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerID="3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4" exitCode=0 Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.025808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjnxb" event={"ID":"e5a8359f-f4bf-4636-9bee-8e135c4664e9","Type":"ContainerDied","Data":"3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4"} Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.025846 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sjnxb" event={"ID":"e5a8359f-f4bf-4636-9bee-8e135c4664e9","Type":"ContainerDied","Data":"7efb8b96d472e62023eb7da66f03d31e01361301f8467efee7bc875282199bf4"} Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.025874 4743 scope.go:117] "RemoveContainer" containerID="3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4" Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.026101 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sjnxb" Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.063140 4743 scope.go:117] "RemoveContainer" containerID="2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f" Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.075047 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sjnxb"] Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.081905 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sjnxb"] Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.101128 4743 scope.go:117] "RemoveContainer" containerID="f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9" Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.122786 4743 scope.go:117] "RemoveContainer" containerID="3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4" Nov 22 09:21:42 crc kubenswrapper[4743]: E1122 09:21:42.123259 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4\": container with ID starting with 3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4 not found: ID does not exist" containerID="3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4" Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.123317 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4"} err="failed to get container status \"3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4\": rpc error: code = NotFound desc = could not find container \"3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4\": container with ID starting with 3193b593358b1a67f951002c45060533be29d7f463cd15cb4b2d2d114f17e1b4 not found: ID does not exist" Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.123361 4743 scope.go:117] "RemoveContainer" containerID="2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f" Nov 22 09:21:42 crc kubenswrapper[4743]: E1122 09:21:42.123874 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f\": container with ID starting with 2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f not found: ID does not exist" containerID="2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f" Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.123901 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f"} err="failed to get container status \"2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f\": rpc error: code = NotFound desc = could not find container \"2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f\": container with ID starting with 2214a90667c108fa34263facc2cbac25b51be6943169ea1098a08eb13b01972f not found: ID does not exist" Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.123917 4743 scope.go:117] "RemoveContainer" containerID="f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9" Nov 22 09:21:42 crc kubenswrapper[4743]: E1122 09:21:42.124239 4743 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9\": container with ID starting with f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9 not found: ID does not exist" containerID="f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9" Nov 22 09:21:42 crc kubenswrapper[4743]: I1122 09:21:42.124281 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9"} err="failed to get container status \"f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9\": rpc error: code = NotFound desc = could not find container \"f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9\": container with ID starting with f8ad8086b71c0b039b5db3ecb450ce07ec2523a1d122209e70b87e79ee2884c9 not found: ID does not exist" Nov 22 09:21:43 crc kubenswrapper[4743]: I1122 09:21:43.167635 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" path="/var/lib/kubelet/pods/e5a8359f-f4bf-4636-9bee-8e135c4664e9/volumes" Nov 22 09:23:01 crc kubenswrapper[4743]: I1122 09:23:01.240901 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:23:01 crc kubenswrapper[4743]: I1122 09:23:01.241437 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:23:31 crc kubenswrapper[4743]: I1122 09:23:31.241934 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:23:31 crc kubenswrapper[4743]: I1122 09:23:31.242530 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:24:01 crc kubenswrapper[4743]: I1122 09:24:01.241628 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:24:01 crc kubenswrapper[4743]: I1122 09:24:01.242317 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:24:01 crc kubenswrapper[4743]: I1122 09:24:01.242374 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" 
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 09:24:01 crc kubenswrapper[4743]: I1122 09:24:01.243106 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:24:01 crc kubenswrapper[4743]: I1122 09:24:01.243174 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" gracePeriod=600 Nov 22 09:24:01 crc kubenswrapper[4743]: E1122 09:24:01.401878 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:24:01 crc kubenswrapper[4743]: I1122 09:24:01.446802 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" exitCode=0 Nov 22 09:24:01 crc kubenswrapper[4743]: I1122 09:24:01.446873 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e"} Nov 22 09:24:01 crc kubenswrapper[4743]: I1122 09:24:01.446921 4743 scope.go:117] "RemoveContainer" containerID="6b7794d1b1cbdcf57d318d1ebcfcef98bffe04b58b134a2714f9f1c535edc337" Nov 22 09:24:01 crc kubenswrapper[4743]: I1122 09:24:01.447822 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:24:01 crc kubenswrapper[4743]: E1122 09:24:01.448101 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:24:16 crc kubenswrapper[4743]: I1122 09:24:16.151329 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:24:16 crc kubenswrapper[4743]: E1122 09:24:16.152362 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:24:30 crc 
kubenswrapper[4743]: I1122 09:24:30.152240 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:24:30 crc kubenswrapper[4743]: E1122 09:24:30.153986 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:24:43 crc kubenswrapper[4743]: I1122 09:24:43.152926 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:24:43 crc kubenswrapper[4743]: E1122 09:24:43.153713 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:24:54 crc kubenswrapper[4743]: I1122 09:24:54.151525 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:24:54 crc kubenswrapper[4743]: E1122 09:24:54.152471 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:25:07 crc kubenswrapper[4743]: I1122 09:25:07.155424 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:25:07 crc kubenswrapper[4743]: E1122 09:25:07.156227 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:25:21 crc kubenswrapper[4743]: I1122 09:25:21.152619 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:25:21 crc kubenswrapper[4743]: E1122 09:25:21.153633 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:25:32 crc kubenswrapper[4743]: I1122 09:25:32.152788 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:25:32 crc 
kubenswrapper[4743]: E1122 09:25:32.154228 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:25:44 crc kubenswrapper[4743]: I1122 09:25:44.152179 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:25:44 crc kubenswrapper[4743]: E1122 09:25:44.152947 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.260011 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-787ks"] Nov 22 09:25:56 crc kubenswrapper[4743]: E1122 09:25:56.261017 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerName="extract-content" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.261134 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerName="extract-content" Nov 22 09:25:56 crc kubenswrapper[4743]: E1122 09:25:56.261151 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerName="registry-server" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.261157 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerName="registry-server" Nov 22 09:25:56 crc kubenswrapper[4743]: E1122 09:25:56.261189 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerName="extract-utilities" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.261197 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerName="extract-utilities" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.261442 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a8359f-f4bf-4636-9bee-8e135c4664e9" containerName="registry-server" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.262854 4743 util.go:30] "No sandbox for pod can be found. 
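This container was already restarted for a failed liveness probe at 09:21:01 and ran for only three minutes before the 09:24:01 kill, so its restart backoff plausibly never reset and the kubelet is now holding it at the ceiling. The "back-off 5m0s" in the entries above is that ceiling: by default the kubelet's crash-loop delay starts at 10s and doubles on each restart up to a 5m cap. An illustrative model of that policy (a sketch, not kubelet code):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        delay, maxDelay := 10*time.Second, 5*time.Minute
        for i := 1; i <= 8; i++ {
            fmt.Printf("restart %d: wait %v\n", i, delay) // 10s, 20s, 40s, 1m20s, 2m40s, then 5m0s steady state
            delay *= 2
            if delay > maxDelay {
                delay = maxDelay
            }
        }
    }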
Need to start a new one" pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.274295 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-787ks"] Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.376660 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-utilities\") pod \"certified-operators-787ks\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.376727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8kd\" (UniqueName: \"kubernetes.io/projected/ff168ae2-53f5-437a-aaa3-2c9051051232-kube-api-access-fc8kd\") pod \"certified-operators-787ks\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.376770 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-catalog-content\") pod \"certified-operators-787ks\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.478467 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-utilities\") pod \"certified-operators-787ks\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.478538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8kd\" (UniqueName: \"kubernetes.io/projected/ff168ae2-53f5-437a-aaa3-2c9051051232-kube-api-access-fc8kd\") pod \"certified-operators-787ks\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.478565 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-catalog-content\") pod \"certified-operators-787ks\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.479052 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-catalog-content\") pod \"certified-operators-787ks\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.479187 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-utilities\") pod \"certified-operators-787ks\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.497664 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fc8kd\" (UniqueName: \"kubernetes.io/projected/ff168ae2-53f5-437a-aaa3-2c9051051232-kube-api-access-fc8kd\") pod \"certified-operators-787ks\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:56 crc kubenswrapper[4743]: I1122 09:25:56.593199 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:25:57 crc kubenswrapper[4743]: I1122 09:25:57.058709 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-787ks"] Nov 22 09:25:57 crc kubenswrapper[4743]: I1122 09:25:57.423154 4743 generic.go:334] "Generic (PLEG): container finished" podID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerID="6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be" exitCode=0 Nov 22 09:25:57 crc kubenswrapper[4743]: I1122 09:25:57.423207 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-787ks" event={"ID":"ff168ae2-53f5-437a-aaa3-2c9051051232","Type":"ContainerDied","Data":"6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be"} Nov 22 09:25:57 crc kubenswrapper[4743]: I1122 09:25:57.423245 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-787ks" event={"ID":"ff168ae2-53f5-437a-aaa3-2c9051051232","Type":"ContainerStarted","Data":"fbd53f836c3a9455853e08754cf3475c582a2b3c41b4aee87fcf600f1e485a4f"} Nov 22 09:25:59 crc kubenswrapper[4743]: I1122 09:25:59.152276 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:25:59 crc kubenswrapper[4743]: E1122 09:25:59.153029 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:25:59 crc kubenswrapper[4743]: I1122 09:25:59.439407 4743 generic.go:334] "Generic (PLEG): container finished" podID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerID="00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685" exitCode=0 Nov 22 09:25:59 crc kubenswrapper[4743]: I1122 09:25:59.439446 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-787ks" event={"ID":"ff168ae2-53f5-437a-aaa3-2c9051051232","Type":"ContainerDied","Data":"00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685"} Nov 22 09:26:00 crc kubenswrapper[4743]: I1122 09:26:00.451394 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-787ks" event={"ID":"ff168ae2-53f5-437a-aaa3-2c9051051232","Type":"ContainerStarted","Data":"b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f"} Nov 22 09:26:00 crc kubenswrapper[4743]: I1122 09:26:00.480507 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-787ks" podStartSLOduration=1.875303532 podStartE2EDuration="4.480484185s" podCreationTimestamp="2025-11-22 09:25:56 +0000 UTC" firstStartedPulling="2025-11-22 09:25:57.42488274 +0000 UTC m=+3831.131243792" 
lastFinishedPulling="2025-11-22 09:26:00.030063403 +0000 UTC m=+3833.736424445" observedRunningTime="2025-11-22 09:26:00.473970318 +0000 UTC m=+3834.180331390" watchObservedRunningTime="2025-11-22 09:26:00.480484185 +0000 UTC m=+3834.186845237" Nov 22 09:26:06 crc kubenswrapper[4743]: I1122 09:26:06.594138 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:26:06 crc kubenswrapper[4743]: I1122 09:26:06.594765 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:26:06 crc kubenswrapper[4743]: I1122 09:26:06.637271 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:26:07 crc kubenswrapper[4743]: I1122 09:26:07.559528 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:26:09 crc kubenswrapper[4743]: I1122 09:26:09.043939 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-787ks"] Nov 22 09:26:09 crc kubenswrapper[4743]: I1122 09:26:09.527892 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-787ks" podUID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerName="registry-server" containerID="cri-o://b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f" gracePeriod=2 Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.453714 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.537592 4743 generic.go:334] "Generic (PLEG): container finished" podID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerID="b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f" exitCode=0 Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.537651 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-787ks" event={"ID":"ff168ae2-53f5-437a-aaa3-2c9051051232","Type":"ContainerDied","Data":"b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f"} Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.537665 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-787ks" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.537683 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-787ks" event={"ID":"ff168ae2-53f5-437a-aaa3-2c9051051232","Type":"ContainerDied","Data":"fbd53f836c3a9455853e08754cf3475c582a2b3c41b4aee87fcf600f1e485a4f"} Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.537706 4743 scope.go:117] "RemoveContainer" containerID="b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.557837 4743 scope.go:117] "RemoveContainer" containerID="00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.576788 4743 scope.go:117] "RemoveContainer" containerID="6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.589407 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8kd\" (UniqueName: \"kubernetes.io/projected/ff168ae2-53f5-437a-aaa3-2c9051051232-kube-api-access-fc8kd\") pod \"ff168ae2-53f5-437a-aaa3-2c9051051232\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.589832 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-utilities\") pod \"ff168ae2-53f5-437a-aaa3-2c9051051232\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.589955 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-catalog-content\") pod \"ff168ae2-53f5-437a-aaa3-2c9051051232\" (UID: \"ff168ae2-53f5-437a-aaa3-2c9051051232\") " Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.591657 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-utilities" (OuterVolumeSpecName: "utilities") pod "ff168ae2-53f5-437a-aaa3-2c9051051232" (UID: "ff168ae2-53f5-437a-aaa3-2c9051051232"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.599030 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff168ae2-53f5-437a-aaa3-2c9051051232-kube-api-access-fc8kd" (OuterVolumeSpecName: "kube-api-access-fc8kd") pod "ff168ae2-53f5-437a-aaa3-2c9051051232" (UID: "ff168ae2-53f5-437a-aaa3-2c9051051232"). InnerVolumeSpecName "kube-api-access-fc8kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.603655 4743 scope.go:117] "RemoveContainer" containerID="b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f" Nov 22 09:26:10 crc kubenswrapper[4743]: E1122 09:26:10.604335 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f\": container with ID starting with b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f not found: ID does not exist" containerID="b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.604378 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f"} err="failed to get container status \"b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f\": rpc error: code = NotFound desc = could not find container \"b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f\": container with ID starting with b85a63cb20bd726ac42289b388b19fec2e2ae6610b2315d8d1020786b02e987f not found: ID does not exist" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.604405 4743 scope.go:117] "RemoveContainer" containerID="00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685" Nov 22 09:26:10 crc kubenswrapper[4743]: E1122 09:26:10.604676 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685\": container with ID starting with 00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685 not found: ID does not exist" containerID="00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.604719 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685"} err="failed to get container status \"00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685\": rpc error: code = NotFound desc = could not find container \"00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685\": container with ID starting with 00a9a8ae05a2c50e4f50c345e638b274876fc5496626e459903f9ac4ae4bb685 not found: ID does not exist" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.604745 4743 scope.go:117] "RemoveContainer" containerID="6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be" Nov 22 09:26:10 crc kubenswrapper[4743]: E1122 09:26:10.605382 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be\": container with ID starting with 6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be not found: ID does not exist" containerID="6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.605415 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be"} err="failed to get container status \"6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be\": rpc error: code = NotFound desc = could not 
find container \"6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be\": container with ID starting with 6c1f52748097af0de50ec24a8c66b97ebf0fcc10e0cc4fa3466b8794a40046be not found: ID does not exist" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.642858 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff168ae2-53f5-437a-aaa3-2c9051051232" (UID: "ff168ae2-53f5-437a-aaa3-2c9051051232"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.692767 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.692810 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff168ae2-53f5-437a-aaa3-2c9051051232-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.692823 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8kd\" (UniqueName: \"kubernetes.io/projected/ff168ae2-53f5-437a-aaa3-2c9051051232-kube-api-access-fc8kd\") on node \"crc\" DevicePath \"\"" Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.876443 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-787ks"] Nov 22 09:26:10 crc kubenswrapper[4743]: I1122 09:26:10.881117 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-787ks"] Nov 22 09:26:11 crc kubenswrapper[4743]: I1122 09:26:11.165194 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff168ae2-53f5-437a-aaa3-2c9051051232" path="/var/lib/kubelet/pods/ff168ae2-53f5-437a-aaa3-2c9051051232/volumes" Nov 22 09:26:13 crc kubenswrapper[4743]: I1122 09:26:13.151435 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:26:13 crc kubenswrapper[4743]: E1122 09:26:13.151691 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:26:25 crc kubenswrapper[4743]: I1122 09:26:25.152946 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:26:25 crc kubenswrapper[4743]: E1122 09:26:25.154051 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:26:37 crc kubenswrapper[4743]: I1122 09:26:37.156408 4743 scope.go:117] "RemoveContainer" 
containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:26:37 crc kubenswrapper[4743]: E1122 09:26:37.157200 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:26:48 crc kubenswrapper[4743]: I1122 09:26:48.151305 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:26:48 crc kubenswrapper[4743]: E1122 09:26:48.152199 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.207301 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sj9f6"] Nov 22 09:26:51 crc kubenswrapper[4743]: E1122 09:26:51.208138 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerName="extract-utilities" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.208159 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerName="extract-utilities" Nov 22 09:26:51 crc kubenswrapper[4743]: E1122 09:26:51.208196 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerName="registry-server" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.208205 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerName="registry-server" Nov 22 09:26:51 crc kubenswrapper[4743]: E1122 09:26:51.208234 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerName="extract-content" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.208242 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerName="extract-content" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.208455 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff168ae2-53f5-437a-aaa3-2c9051051232" containerName="registry-server" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.210168 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.222315 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sj9f6"] Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.306410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-catalog-content\") pod \"redhat-marketplace-sj9f6\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.306601 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nnl4\" (UniqueName: \"kubernetes.io/projected/a0208b7c-63b2-4885-9371-8c5d94469255-kube-api-access-8nnl4\") pod \"redhat-marketplace-sj9f6\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.306652 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-utilities\") pod \"redhat-marketplace-sj9f6\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.408419 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nnl4\" (UniqueName: \"kubernetes.io/projected/a0208b7c-63b2-4885-9371-8c5d94469255-kube-api-access-8nnl4\") pod \"redhat-marketplace-sj9f6\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.408495 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-utilities\") pod \"redhat-marketplace-sj9f6\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.408517 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-catalog-content\") pod \"redhat-marketplace-sj9f6\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.408980 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-catalog-content\") pod \"redhat-marketplace-sj9f6\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.409045 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-utilities\") pod \"redhat-marketplace-sj9f6\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.435946 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8nnl4\" (UniqueName: \"kubernetes.io/projected/a0208b7c-63b2-4885-9371-8c5d94469255-kube-api-access-8nnl4\") pod \"redhat-marketplace-sj9f6\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.536999 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.742171 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sj9f6"] Nov 22 09:26:51 crc kubenswrapper[4743]: I1122 09:26:51.875093 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sj9f6" event={"ID":"a0208b7c-63b2-4885-9371-8c5d94469255","Type":"ContainerStarted","Data":"6fe8e7661f51dd798605b45edff03d223bcc1252b8319568e9753c87dfc880c9"} Nov 22 09:26:52 crc kubenswrapper[4743]: I1122 09:26:52.888511 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0208b7c-63b2-4885-9371-8c5d94469255" containerID="3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432" exitCode=0 Nov 22 09:26:52 crc kubenswrapper[4743]: I1122 09:26:52.888669 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sj9f6" event={"ID":"a0208b7c-63b2-4885-9371-8c5d94469255","Type":"ContainerDied","Data":"3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432"} Nov 22 09:26:52 crc kubenswrapper[4743]: I1122 09:26:52.892286 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:26:54 crc kubenswrapper[4743]: I1122 09:26:54.907648 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0208b7c-63b2-4885-9371-8c5d94469255" containerID="cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a" exitCode=0 Nov 22 09:26:54 crc kubenswrapper[4743]: I1122 09:26:54.907711 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sj9f6" event={"ID":"a0208b7c-63b2-4885-9371-8c5d94469255","Type":"ContainerDied","Data":"cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a"} Nov 22 09:26:55 crc kubenswrapper[4743]: I1122 09:26:55.918676 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sj9f6" event={"ID":"a0208b7c-63b2-4885-9371-8c5d94469255","Type":"ContainerStarted","Data":"891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745"} Nov 22 09:26:55 crc kubenswrapper[4743]: I1122 09:26:55.943442 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sj9f6" podStartSLOduration=2.514982738 podStartE2EDuration="4.943418778s" podCreationTimestamp="2025-11-22 09:26:51 +0000 UTC" firstStartedPulling="2025-11-22 09:26:52.891697645 +0000 UTC m=+3886.598058737" lastFinishedPulling="2025-11-22 09:26:55.320133725 +0000 UTC m=+3889.026494777" observedRunningTime="2025-11-22 09:26:55.939154895 +0000 UTC m=+3889.645515967" watchObservedRunningTime="2025-11-22 09:26:55.943418778 +0000 UTC m=+3889.649779830" Nov 22 09:27:01 crc kubenswrapper[4743]: I1122 09:27:01.156632 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:27:01 crc kubenswrapper[4743]: E1122 09:27:01.157458 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:27:01 crc kubenswrapper[4743]: I1122 09:27:01.556911 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:27:01 crc kubenswrapper[4743]: I1122 09:27:01.556978 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:27:01 crc kubenswrapper[4743]: I1122 09:27:01.596597 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:27:01 crc kubenswrapper[4743]: I1122 09:27:01.998686 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:27:02 crc kubenswrapper[4743]: I1122 09:27:02.039910 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sj9f6"] Nov 22 09:27:03 crc kubenswrapper[4743]: I1122 09:27:03.972798 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sj9f6" podUID="a0208b7c-63b2-4885-9371-8c5d94469255" containerName="registry-server" containerID="cri-o://891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745" gracePeriod=2 Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.329010 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.509095 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-utilities\") pod \"a0208b7c-63b2-4885-9371-8c5d94469255\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.509224 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nnl4\" (UniqueName: \"kubernetes.io/projected/a0208b7c-63b2-4885-9371-8c5d94469255-kube-api-access-8nnl4\") pod \"a0208b7c-63b2-4885-9371-8c5d94469255\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.509408 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-catalog-content\") pod \"a0208b7c-63b2-4885-9371-8c5d94469255\" (UID: \"a0208b7c-63b2-4885-9371-8c5d94469255\") " Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.510070 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-utilities" (OuterVolumeSpecName: "utilities") pod "a0208b7c-63b2-4885-9371-8c5d94469255" (UID: "a0208b7c-63b2-4885-9371-8c5d94469255"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.514104 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0208b7c-63b2-4885-9371-8c5d94469255-kube-api-access-8nnl4" (OuterVolumeSpecName: "kube-api-access-8nnl4") pod "a0208b7c-63b2-4885-9371-8c5d94469255" (UID: "a0208b7c-63b2-4885-9371-8c5d94469255"). InnerVolumeSpecName "kube-api-access-8nnl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.527724 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0208b7c-63b2-4885-9371-8c5d94469255" (UID: "a0208b7c-63b2-4885-9371-8c5d94469255"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.611285 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.611329 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0208b7c-63b2-4885-9371-8c5d94469255-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.611340 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nnl4\" (UniqueName: \"kubernetes.io/projected/a0208b7c-63b2-4885-9371-8c5d94469255-kube-api-access-8nnl4\") on node \"crc\" DevicePath \"\"" Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.983731 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0208b7c-63b2-4885-9371-8c5d94469255" containerID="891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745" exitCode=0 Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.983807 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sj9f6" Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.983805 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sj9f6" event={"ID":"a0208b7c-63b2-4885-9371-8c5d94469255","Type":"ContainerDied","Data":"891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745"} Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.983958 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sj9f6" event={"ID":"a0208b7c-63b2-4885-9371-8c5d94469255","Type":"ContainerDied","Data":"6fe8e7661f51dd798605b45edff03d223bcc1252b8319568e9753c87dfc880c9"} Nov 22 09:27:04 crc kubenswrapper[4743]: I1122 09:27:04.983998 4743 scope.go:117] "RemoveContainer" containerID="891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745" Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.011564 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sj9f6"] Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.019098 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sj9f6"] Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.026931 4743 scope.go:117] "RemoveContainer" containerID="cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a" Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.058048 4743 scope.go:117] "RemoveContainer" containerID="3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432" Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.074676 4743 scope.go:117] "RemoveContainer" containerID="891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745" Nov 22 09:27:05 crc kubenswrapper[4743]: E1122 09:27:05.075085 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745\": container with ID starting with 891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745 not found: ID does not exist" containerID="891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745" Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.075167 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745"} err="failed to get container status \"891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745\": rpc error: code = NotFound desc = could not find container \"891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745\": container with ID starting with 891129562ab2f00cc78b5e5006ac6597aa96f6062b86019b8c74e840c2b13745 not found: ID does not exist" Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.075242 4743 scope.go:117] "RemoveContainer" containerID="cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a" Nov 22 09:27:05 crc kubenswrapper[4743]: E1122 09:27:05.075466 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a\": container with ID starting with cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a not found: ID does not exist" containerID="cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a" Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.075540 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a"} err="failed to get container status \"cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a\": rpc error: code = NotFound desc = could not find container \"cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a\": container with ID starting with cc3da525a022486193ca2f0f332c4175071277cbcdc16226df2b25a07736981a not found: ID does not exist" Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.075626 4743 scope.go:117] "RemoveContainer" containerID="3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432" Nov 22 09:27:05 crc kubenswrapper[4743]: E1122 09:27:05.075884 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432\": container with ID starting with 3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432 not found: ID does not exist" containerID="3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432" Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.075914 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432"} err="failed to get container status \"3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432\": rpc error: code = NotFound desc = could not find container \"3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432\": container with ID starting with 3046288bcdea329a25838785b44e9ed048f9f0c0712884861519fda4f05e8432 not found: ID does not exist" Nov 22 09:27:05 crc kubenswrapper[4743]: I1122 09:27:05.158781 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0208b7c-63b2-4885-9371-8c5d94469255" path="/var/lib/kubelet/pods/a0208b7c-63b2-4885-9371-8c5d94469255/volumes" Nov 22 09:27:16 crc kubenswrapper[4743]: I1122 09:27:16.151619 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:27:16 crc kubenswrapper[4743]: E1122 09:27:16.152447 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:27:30 crc kubenswrapper[4743]: I1122 09:27:30.151825 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:27:30 crc kubenswrapper[4743]: E1122 09:27:30.152762 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:27:45 crc kubenswrapper[4743]: I1122 09:27:45.152135 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:27:45 crc 
kubenswrapper[4743]: E1122 09:27:45.153386 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:27:56 crc kubenswrapper[4743]: I1122 09:27:56.152111 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:27:56 crc kubenswrapper[4743]: E1122 09:27:56.152991 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.565653 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kp8kb"] Nov 22 09:28:10 crc kubenswrapper[4743]: E1122 09:28:10.567946 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0208b7c-63b2-4885-9371-8c5d94469255" containerName="extract-content" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.568079 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0208b7c-63b2-4885-9371-8c5d94469255" containerName="extract-content" Nov 22 09:28:10 crc kubenswrapper[4743]: E1122 09:28:10.568181 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0208b7c-63b2-4885-9371-8c5d94469255" containerName="registry-server" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.568506 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0208b7c-63b2-4885-9371-8c5d94469255" containerName="registry-server" Nov 22 09:28:10 crc kubenswrapper[4743]: E1122 09:28:10.568603 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0208b7c-63b2-4885-9371-8c5d94469255" containerName="extract-utilities" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.568714 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0208b7c-63b2-4885-9371-8c5d94469255" containerName="extract-utilities" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.568953 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0208b7c-63b2-4885-9371-8c5d94469255" containerName="registry-server" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.572681 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.573487 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kp8kb"] Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.664540 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-utilities\") pod \"community-operators-kp8kb\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.665558 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2q7n\" (UniqueName: \"kubernetes.io/projected/dedb1872-20d8-4912-826a-7e74ac373290-kube-api-access-v2q7n\") pod \"community-operators-kp8kb\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.665725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-catalog-content\") pod \"community-operators-kp8kb\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.766865 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-utilities\") pod \"community-operators-kp8kb\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.767194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2q7n\" (UniqueName: \"kubernetes.io/projected/dedb1872-20d8-4912-826a-7e74ac373290-kube-api-access-v2q7n\") pod \"community-operators-kp8kb\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.767303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-catalog-content\") pod \"community-operators-kp8kb\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.767743 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-catalog-content\") pod \"community-operators-kp8kb\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.767951 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-utilities\") pod \"community-operators-kp8kb\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.786765 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v2q7n\" (UniqueName: \"kubernetes.io/projected/dedb1872-20d8-4912-826a-7e74ac373290-kube-api-access-v2q7n\") pod \"community-operators-kp8kb\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:10 crc kubenswrapper[4743]: I1122 09:28:10.897980 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:11 crc kubenswrapper[4743]: I1122 09:28:11.129290 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kp8kb"] Nov 22 09:28:11 crc kubenswrapper[4743]: I1122 09:28:11.152232 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:28:11 crc kubenswrapper[4743]: E1122 09:28:11.152488 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:28:11 crc kubenswrapper[4743]: I1122 09:28:11.497008 4743 generic.go:334] "Generic (PLEG): container finished" podID="dedb1872-20d8-4912-826a-7e74ac373290" containerID="13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7" exitCode=0 Nov 22 09:28:11 crc kubenswrapper[4743]: I1122 09:28:11.497060 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp8kb" event={"ID":"dedb1872-20d8-4912-826a-7e74ac373290","Type":"ContainerDied","Data":"13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7"} Nov 22 09:28:11 crc kubenswrapper[4743]: I1122 09:28:11.497092 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp8kb" event={"ID":"dedb1872-20d8-4912-826a-7e74ac373290","Type":"ContainerStarted","Data":"5094eb5b01e0fa0f794f2e9373d9a96f2321087ccc034e4231bd612d3e4ce429"} Nov 22 09:28:12 crc kubenswrapper[4743]: I1122 09:28:12.505025 4743 generic.go:334] "Generic (PLEG): container finished" podID="dedb1872-20d8-4912-826a-7e74ac373290" containerID="c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e" exitCode=0 Nov 22 09:28:12 crc kubenswrapper[4743]: I1122 09:28:12.505071 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp8kb" event={"ID":"dedb1872-20d8-4912-826a-7e74ac373290","Type":"ContainerDied","Data":"c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e"} Nov 22 09:28:13 crc kubenswrapper[4743]: I1122 09:28:13.516591 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp8kb" event={"ID":"dedb1872-20d8-4912-826a-7e74ac373290","Type":"ContainerStarted","Data":"54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f"} Nov 22 09:28:13 crc kubenswrapper[4743]: I1122 09:28:13.538014 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kp8kb" podStartSLOduration=2.106071821 podStartE2EDuration="3.537994787s" podCreationTimestamp="2025-11-22 09:28:10 +0000 UTC" firstStartedPulling="2025-11-22 09:28:11.498680257 +0000 UTC m=+3965.205041309" 
lastFinishedPulling="2025-11-22 09:28:12.930603213 +0000 UTC m=+3966.636964275" observedRunningTime="2025-11-22 09:28:13.532290943 +0000 UTC m=+3967.238651995" watchObservedRunningTime="2025-11-22 09:28:13.537994787 +0000 UTC m=+3967.244355839" Nov 22 09:28:20 crc kubenswrapper[4743]: I1122 09:28:20.898282 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:20 crc kubenswrapper[4743]: I1122 09:28:20.898924 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:20 crc kubenswrapper[4743]: I1122 09:28:20.944202 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:21 crc kubenswrapper[4743]: I1122 09:28:21.622950 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:21 crc kubenswrapper[4743]: I1122 09:28:21.671027 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kp8kb"] Nov 22 09:28:22 crc kubenswrapper[4743]: I1122 09:28:22.151142 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:28:22 crc kubenswrapper[4743]: E1122 09:28:22.151461 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:28:23 crc kubenswrapper[4743]: I1122 09:28:23.601690 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kp8kb" podUID="dedb1872-20d8-4912-826a-7e74ac373290" containerName="registry-server" containerID="cri-o://54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f" gracePeriod=2 Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.389205 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.574437 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-catalog-content\") pod \"dedb1872-20d8-4912-826a-7e74ac373290\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.574597 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2q7n\" (UniqueName: \"kubernetes.io/projected/dedb1872-20d8-4912-826a-7e74ac373290-kube-api-access-v2q7n\") pod \"dedb1872-20d8-4912-826a-7e74ac373290\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.574680 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-utilities\") pod \"dedb1872-20d8-4912-826a-7e74ac373290\" (UID: \"dedb1872-20d8-4912-826a-7e74ac373290\") " Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.575879 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-utilities" (OuterVolumeSpecName: "utilities") pod "dedb1872-20d8-4912-826a-7e74ac373290" (UID: "dedb1872-20d8-4912-826a-7e74ac373290"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.582562 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dedb1872-20d8-4912-826a-7e74ac373290-kube-api-access-v2q7n" (OuterVolumeSpecName: "kube-api-access-v2q7n") pod "dedb1872-20d8-4912-826a-7e74ac373290" (UID: "dedb1872-20d8-4912-826a-7e74ac373290"). InnerVolumeSpecName "kube-api-access-v2q7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.612331 4743 generic.go:334] "Generic (PLEG): container finished" podID="dedb1872-20d8-4912-826a-7e74ac373290" containerID="54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f" exitCode=0 Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.612358 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kp8kb" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.612410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp8kb" event={"ID":"dedb1872-20d8-4912-826a-7e74ac373290","Type":"ContainerDied","Data":"54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f"} Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.612485 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kp8kb" event={"ID":"dedb1872-20d8-4912-826a-7e74ac373290","Type":"ContainerDied","Data":"5094eb5b01e0fa0f794f2e9373d9a96f2321087ccc034e4231bd612d3e4ce429"} Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.612518 4743 scope.go:117] "RemoveContainer" containerID="54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.628257 4743 scope.go:117] "RemoveContainer" containerID="c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.632069 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dedb1872-20d8-4912-826a-7e74ac373290" (UID: "dedb1872-20d8-4912-826a-7e74ac373290"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.646198 4743 scope.go:117] "RemoveContainer" containerID="13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.677076 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.677115 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2q7n\" (UniqueName: \"kubernetes.io/projected/dedb1872-20d8-4912-826a-7e74ac373290-kube-api-access-v2q7n\") on node \"crc\" DevicePath \"\"" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.677124 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dedb1872-20d8-4912-826a-7e74ac373290-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.686745 4743 scope.go:117] "RemoveContainer" containerID="54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f" Nov 22 09:28:24 crc kubenswrapper[4743]: E1122 09:28:24.687388 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f\": container with ID starting with 54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f not found: ID does not exist" containerID="54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.687421 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f"} err="failed to get container status \"54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f\": rpc error: code = NotFound desc = could not find container 
\"54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f\": container with ID starting with 54283fbd3a65ed1f330de27ef78c0b61bbb94c9cb23e82ca8a380b2c9839850f not found: ID does not exist" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.687442 4743 scope.go:117] "RemoveContainer" containerID="c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e" Nov 22 09:28:24 crc kubenswrapper[4743]: E1122 09:28:24.687908 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e\": container with ID starting with c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e not found: ID does not exist" containerID="c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.688045 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e"} err="failed to get container status \"c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e\": rpc error: code = NotFound desc = could not find container \"c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e\": container with ID starting with c36899cd63bf9bbfc0cf3e23d66022f8b8f49fef3a4ba9e8603d242a74b87c2e not found: ID does not exist" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.688062 4743 scope.go:117] "RemoveContainer" containerID="13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7" Nov 22 09:28:24 crc kubenswrapper[4743]: E1122 09:28:24.688332 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7\": container with ID starting with 13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7 not found: ID does not exist" containerID="13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.688358 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7"} err="failed to get container status \"13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7\": rpc error: code = NotFound desc = could not find container \"13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7\": container with ID starting with 13b5488a08e3b943b169bedb93110d68fab57c43fa2f1df554ba92ca463c10c7 not found: ID does not exist" Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.966671 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kp8kb"] Nov 22 09:28:24 crc kubenswrapper[4743]: I1122 09:28:24.978648 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kp8kb"] Nov 22 09:28:25 crc kubenswrapper[4743]: I1122 09:28:25.163169 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dedb1872-20d8-4912-826a-7e74ac373290" path="/var/lib/kubelet/pods/dedb1872-20d8-4912-826a-7e74ac373290/volumes" Nov 22 09:28:33 crc kubenswrapper[4743]: I1122 09:28:33.151401 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:28:33 crc kubenswrapper[4743]: E1122 09:28:33.151960 4743 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:28:45 crc kubenswrapper[4743]: I1122 09:28:45.152100 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:28:45 crc kubenswrapper[4743]: E1122 09:28:45.153332 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:28:58 crc kubenswrapper[4743]: I1122 09:28:58.152097 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:28:58 crc kubenswrapper[4743]: E1122 09:28:58.153143 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:29:12 crc kubenswrapper[4743]: I1122 09:29:12.151754 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:29:12 crc kubenswrapper[4743]: I1122 09:29:12.984967 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"a2541c84f477d94b6b95264fa987196c77e3c9a933370a0e137c2817afdd5f2f"} Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.152356 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2"] Nov 22 09:30:00 crc kubenswrapper[4743]: E1122 09:30:00.154740 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedb1872-20d8-4912-826a-7e74ac373290" containerName="extract-content" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.154872 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedb1872-20d8-4912-826a-7e74ac373290" containerName="extract-content" Nov 22 09:30:00 crc kubenswrapper[4743]: E1122 09:30:00.154971 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedb1872-20d8-4912-826a-7e74ac373290" containerName="registry-server" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.155043 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedb1872-20d8-4912-826a-7e74ac373290" containerName="registry-server" Nov 22 09:30:00 crc kubenswrapper[4743]: E1122 09:30:00.155145 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedb1872-20d8-4912-826a-7e74ac373290" containerName="extract-utilities" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.155218 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dedb1872-20d8-4912-826a-7e74ac373290" containerName="extract-utilities" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.155476 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedb1872-20d8-4912-826a-7e74ac373290" containerName="registry-server" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.156234 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.161057 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.161120 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.162591 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2"] Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.311061 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-secret-volume\") pod \"collect-profiles-29396730-spmx2\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.311125 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq9tg\" (UniqueName: \"kubernetes.io/projected/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-kube-api-access-rq9tg\") pod \"collect-profiles-29396730-spmx2\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.311173 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-config-volume\") pod \"collect-profiles-29396730-spmx2\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.412123 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-secret-volume\") pod \"collect-profiles-29396730-spmx2\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.412174 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq9tg\" (UniqueName: \"kubernetes.io/projected/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-kube-api-access-rq9tg\") pod \"collect-profiles-29396730-spmx2\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.412222 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-config-volume\") pod 
\"collect-profiles-29396730-spmx2\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.413302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-config-volume\") pod \"collect-profiles-29396730-spmx2\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.420343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-secret-volume\") pod \"collect-profiles-29396730-spmx2\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.428714 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq9tg\" (UniqueName: \"kubernetes.io/projected/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-kube-api-access-rq9tg\") pod \"collect-profiles-29396730-spmx2\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.485233 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:00 crc kubenswrapper[4743]: I1122 09:30:00.928813 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2"] Nov 22 09:30:01 crc kubenswrapper[4743]: I1122 09:30:01.539051 4743 generic.go:334] "Generic (PLEG): container finished" podID="473686fe-38ae-4f9a-bfc0-c7946ebb17bc" containerID="b0b9f55113c4c4681d970884f32060a4d74a46af79a18307032740fd5d878788" exitCode=0 Nov 22 09:30:01 crc kubenswrapper[4743]: I1122 09:30:01.539119 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" event={"ID":"473686fe-38ae-4f9a-bfc0-c7946ebb17bc","Type":"ContainerDied","Data":"b0b9f55113c4c4681d970884f32060a4d74a46af79a18307032740fd5d878788"} Nov 22 09:30:01 crc kubenswrapper[4743]: I1122 09:30:01.539165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" event={"ID":"473686fe-38ae-4f9a-bfc0-c7946ebb17bc","Type":"ContainerStarted","Data":"9e28f80624ee897fb45362bca2d4ec5051fff9a216f59e2f6460544305a1e6df"} Nov 22 09:30:02 crc kubenswrapper[4743]: I1122 09:30:02.828297 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:02 crc kubenswrapper[4743]: I1122 09:30:02.949919 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-config-volume\") pod \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " Nov 22 09:30:02 crc kubenswrapper[4743]: I1122 09:30:02.950135 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-secret-volume\") pod \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " Nov 22 09:30:02 crc kubenswrapper[4743]: I1122 09:30:02.950171 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rq9tg\" (UniqueName: \"kubernetes.io/projected/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-kube-api-access-rq9tg\") pod \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\" (UID: \"473686fe-38ae-4f9a-bfc0-c7946ebb17bc\") " Nov 22 09:30:02 crc kubenswrapper[4743]: I1122 09:30:02.952280 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-config-volume" (OuterVolumeSpecName: "config-volume") pod "473686fe-38ae-4f9a-bfc0-c7946ebb17bc" (UID: "473686fe-38ae-4f9a-bfc0-c7946ebb17bc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:30:02 crc kubenswrapper[4743]: I1122 09:30:02.956073 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "473686fe-38ae-4f9a-bfc0-c7946ebb17bc" (UID: "473686fe-38ae-4f9a-bfc0-c7946ebb17bc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:30:02 crc kubenswrapper[4743]: I1122 09:30:02.957069 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-kube-api-access-rq9tg" (OuterVolumeSpecName: "kube-api-access-rq9tg") pod "473686fe-38ae-4f9a-bfc0-c7946ebb17bc" (UID: "473686fe-38ae-4f9a-bfc0-c7946ebb17bc"). InnerVolumeSpecName "kube-api-access-rq9tg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:30:03 crc kubenswrapper[4743]: I1122 09:30:03.051796 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:03 crc kubenswrapper[4743]: I1122 09:30:03.051848 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:03 crc kubenswrapper[4743]: I1122 09:30:03.051867 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rq9tg\" (UniqueName: \"kubernetes.io/projected/473686fe-38ae-4f9a-bfc0-c7946ebb17bc-kube-api-access-rq9tg\") on node \"crc\" DevicePath \"\"" Nov 22 09:30:03 crc kubenswrapper[4743]: I1122 09:30:03.558106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" event={"ID":"473686fe-38ae-4f9a-bfc0-c7946ebb17bc","Type":"ContainerDied","Data":"9e28f80624ee897fb45362bca2d4ec5051fff9a216f59e2f6460544305a1e6df"} Nov 22 09:30:03 crc kubenswrapper[4743]: I1122 09:30:03.558399 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e28f80624ee897fb45362bca2d4ec5051fff9a216f59e2f6460544305a1e6df" Nov 22 09:30:03 crc kubenswrapper[4743]: I1122 09:30:03.558175 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2" Nov 22 09:30:03 crc kubenswrapper[4743]: I1122 09:30:03.895764 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"] Nov 22 09:30:03 crc kubenswrapper[4743]: I1122 09:30:03.900354 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396685-s586z"] Nov 22 09:30:05 crc kubenswrapper[4743]: I1122 09:30:05.160029 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd54a052-ecbb-4344-9f1e-323a7fbf034b" path="/var/lib/kubelet/pods/bd54a052-ecbb-4344-9f1e-323a7fbf034b/volumes" Nov 22 09:30:31 crc kubenswrapper[4743]: I1122 09:30:31.402920 4743 scope.go:117] "RemoveContainer" containerID="7659ae12fad0147c8b735d5bd5d19689a44c9c8031ab0169048089c6125a5fe6" Nov 22 09:31:31 crc kubenswrapper[4743]: I1122 09:31:31.241661 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:31:31 crc kubenswrapper[4743]: I1122 09:31:31.242266 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:32:01 crc kubenswrapper[4743]: I1122 09:32:01.241720 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Nov 22 09:32:01 crc kubenswrapper[4743]: I1122 09:32:01.242275 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:32:31 crc kubenswrapper[4743]: I1122 09:32:31.241795 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:32:31 crc kubenswrapper[4743]: I1122 09:32:31.242337 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:32:31 crc kubenswrapper[4743]: I1122 09:32:31.242385 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 09:32:31 crc kubenswrapper[4743]: I1122 09:32:31.243081 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a2541c84f477d94b6b95264fa987196c77e3c9a933370a0e137c2817afdd5f2f"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:32:31 crc kubenswrapper[4743]: I1122 09:32:31.243137 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://a2541c84f477d94b6b95264fa987196c77e3c9a933370a0e137c2817afdd5f2f" gracePeriod=600 Nov 22 09:32:31 crc kubenswrapper[4743]: I1122 09:32:31.751688 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="a2541c84f477d94b6b95264fa987196c77e3c9a933370a0e137c2817afdd5f2f" exitCode=0 Nov 22 09:32:31 crc kubenswrapper[4743]: I1122 09:32:31.751759 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"a2541c84f477d94b6b95264fa987196c77e3c9a933370a0e137c2817afdd5f2f"} Nov 22 09:32:31 crc kubenswrapper[4743]: I1122 09:32:31.752221 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"} Nov 22 09:32:31 crc kubenswrapper[4743]: I1122 09:32:31.752260 4743 scope.go:117] "RemoveContainer" containerID="b9bcb11dea0dbf15383bcdc8e52b03bceb6c4c51bfca60b5df1911bd14f6971e" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.722897 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6d9km"] Nov 22 09:32:51 crc kubenswrapper[4743]: E1122 09:32:51.724231 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="473686fe-38ae-4f9a-bfc0-c7946ebb17bc" containerName="collect-profiles" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.724454 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="473686fe-38ae-4f9a-bfc0-c7946ebb17bc" containerName="collect-profiles" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.724908 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="473686fe-38ae-4f9a-bfc0-c7946ebb17bc" containerName="collect-profiles" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.727414 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.732235 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6d9km"] Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.798675 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4hhs\" (UniqueName: \"kubernetes.io/projected/c70b425f-0629-4aad-812d-41537b548a5f-kube-api-access-g4hhs\") pod \"redhat-operators-6d9km\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.798729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-catalog-content\") pod \"redhat-operators-6d9km\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.798785 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-utilities\") pod \"redhat-operators-6d9km\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.899659 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4hhs\" (UniqueName: \"kubernetes.io/projected/c70b425f-0629-4aad-812d-41537b548a5f-kube-api-access-g4hhs\") pod \"redhat-operators-6d9km\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.899712 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-catalog-content\") pod \"redhat-operators-6d9km\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.899749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-utilities\") pod \"redhat-operators-6d9km\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.900289 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-utilities\") pod \"redhat-operators-6d9km\" (UID: 
\"c70b425f-0629-4aad-812d-41537b548a5f\") " pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.900354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-catalog-content\") pod \"redhat-operators-6d9km\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:51 crc kubenswrapper[4743]: I1122 09:32:51.921426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4hhs\" (UniqueName: \"kubernetes.io/projected/c70b425f-0629-4aad-812d-41537b548a5f-kube-api-access-g4hhs\") pod \"redhat-operators-6d9km\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.054355 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.481602 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6d9km"] Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.910438 4743 generic.go:334] "Generic (PLEG): container finished" podID="c70b425f-0629-4aad-812d-41537b548a5f" containerID="38e13c3e23fd8eb87054aab29b315e2b53df861e9c09bee8b4484f02d0e0727b" exitCode=0 Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.910799 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerDied","Data":"38e13c3e23fd8eb87054aab29b315e2b53df861e9c09bee8b4484f02d0e0727b"} Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.910825 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerStarted","Data":"b257b0948b6441e742788b23b517bfbbd216985a29b5bfce7005c379c349b6d0"} Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.912544 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:32:53 crc kubenswrapper[4743]: I1122 09:32:53.918893 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerStarted","Data":"fcd7853a1e628150ac73017d75ff4fbc547a99d9b6c35a3e4a629cfabfa6d98d"} Nov 22 09:32:54 crc kubenswrapper[4743]: I1122 09:32:54.929836 4743 generic.go:334] "Generic (PLEG): container finished" podID="c70b425f-0629-4aad-812d-41537b548a5f" containerID="fcd7853a1e628150ac73017d75ff4fbc547a99d9b6c35a3e4a629cfabfa6d98d" exitCode=0 Nov 22 09:32:54 crc kubenswrapper[4743]: I1122 09:32:54.929866 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerDied","Data":"fcd7853a1e628150ac73017d75ff4fbc547a99d9b6c35a3e4a629cfabfa6d98d"} Nov 22 09:32:55 crc kubenswrapper[4743]: I1122 09:32:55.938248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerStarted","Data":"3a1aeaf7c8bab22a7d34740514587ff210b11d63af622a252e47cc977b9604c7"} Nov 22 09:32:55 crc kubenswrapper[4743]: I1122 
Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.054355 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6d9km"
Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.481602 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6d9km"]
Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.910438 4743 generic.go:334] "Generic (PLEG): container finished" podID="c70b425f-0629-4aad-812d-41537b548a5f" containerID="38e13c3e23fd8eb87054aab29b315e2b53df861e9c09bee8b4484f02d0e0727b" exitCode=0
Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.910799 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerDied","Data":"38e13c3e23fd8eb87054aab29b315e2b53df861e9c09bee8b4484f02d0e0727b"}
Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.910825 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerStarted","Data":"b257b0948b6441e742788b23b517bfbbd216985a29b5bfce7005c379c349b6d0"}
Nov 22 09:32:52 crc kubenswrapper[4743]: I1122 09:32:52.912544 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 09:32:53 crc kubenswrapper[4743]: I1122 09:32:53.918893 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerStarted","Data":"fcd7853a1e628150ac73017d75ff4fbc547a99d9b6c35a3e4a629cfabfa6d98d"}
Nov 22 09:32:54 crc kubenswrapper[4743]: I1122 09:32:54.929836 4743 generic.go:334] "Generic (PLEG): container finished" podID="c70b425f-0629-4aad-812d-41537b548a5f" containerID="fcd7853a1e628150ac73017d75ff4fbc547a99d9b6c35a3e4a629cfabfa6d98d" exitCode=0
Nov 22 09:32:54 crc kubenswrapper[4743]: I1122 09:32:54.929866 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerDied","Data":"fcd7853a1e628150ac73017d75ff4fbc547a99d9b6c35a3e4a629cfabfa6d98d"}
Nov 22 09:32:55 crc kubenswrapper[4743]: I1122 09:32:55.938248 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerStarted","Data":"3a1aeaf7c8bab22a7d34740514587ff210b11d63af622a252e47cc977b9604c7"}
Nov 22 09:32:55 crc kubenswrapper[4743]: I1122 09:32:55.957712 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6d9km" podStartSLOduration=2.471632326 podStartE2EDuration="4.957673654s" podCreationTimestamp="2025-11-22 09:32:51 +0000 UTC" firstStartedPulling="2025-11-22 09:32:52.912305174 +0000 UTC m=+4246.618666226" lastFinishedPulling="2025-11-22 09:32:55.398346502 +0000 UTC m=+4249.104707554" observedRunningTime="2025-11-22 09:32:55.957263472 +0000 UTC m=+4249.663624524" watchObservedRunningTime="2025-11-22 09:32:55.957673654 +0000 UTC m=+4249.664034706"
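The pod_startup_latency_tracker entry above reports two durations for redhat-operators-6d9km: podStartE2EDuration (creation to observed running, 4.957673654s) and podStartSLOduration, which is the E2E figure with the image-pull window (firstStartedPulling to lastFinishedPulling) subtracted. The logged values check out: 4.957673654s minus the 2.486041328s pull window is exactly 2.471632326s. A small verification using the timestamps copied from the entry:

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Layout matches the "2025-11-22 09:32:51 +0000 UTC" form in the log.
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-11-22 09:32:51 +0000 UTC")
	firstPull := mustParse("2025-11-22 09:32:52.912305174 +0000 UTC")
	lastPull := mustParse("2025-11-22 09:32:55.398346502 +0000 UTC")
	// watchObservedRunningTime from the entry above.
	running := mustParse("2025-11-22 09:32:55.957673654 +0000 UTC")

	e2e := running.Sub(created)          // 4.957673654s
	slo := e2e - lastPull.Sub(firstPull) // pull window excluded: 2.471632326s
	fmt.Println(e2e, slo)
}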
Need to start a new one" pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:33:06 crc kubenswrapper[4743]: I1122 09:33:06.404341 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4hhs\" (UniqueName: \"kubernetes.io/projected/c70b425f-0629-4aad-812d-41537b548a5f-kube-api-access-g4hhs\") pod \"c70b425f-0629-4aad-812d-41537b548a5f\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " Nov 22 09:33:06 crc kubenswrapper[4743]: I1122 09:33:06.404396 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-utilities\") pod \"c70b425f-0629-4aad-812d-41537b548a5f\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " Nov 22 09:33:06 crc kubenswrapper[4743]: I1122 09:33:06.404448 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-catalog-content\") pod \"c70b425f-0629-4aad-812d-41537b548a5f\" (UID: \"c70b425f-0629-4aad-812d-41537b548a5f\") " Nov 22 09:33:06 crc kubenswrapper[4743]: I1122 09:33:06.405564 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-utilities" (OuterVolumeSpecName: "utilities") pod "c70b425f-0629-4aad-812d-41537b548a5f" (UID: "c70b425f-0629-4aad-812d-41537b548a5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:33:06 crc kubenswrapper[4743]: I1122 09:33:06.414893 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70b425f-0629-4aad-812d-41537b548a5f-kube-api-access-g4hhs" (OuterVolumeSpecName: "kube-api-access-g4hhs") pod "c70b425f-0629-4aad-812d-41537b548a5f" (UID: "c70b425f-0629-4aad-812d-41537b548a5f"). InnerVolumeSpecName "kube-api-access-g4hhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:33:06 crc kubenswrapper[4743]: I1122 09:33:06.494690 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c70b425f-0629-4aad-812d-41537b548a5f" (UID: "c70b425f-0629-4aad-812d-41537b548a5f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:33:06 crc kubenswrapper[4743]: I1122 09:33:06.506024 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:06 crc kubenswrapper[4743]: I1122 09:33:06.506053 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c70b425f-0629-4aad-812d-41537b548a5f-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:06 crc kubenswrapper[4743]: I1122 09:33:06.506065 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4hhs\" (UniqueName: \"kubernetes.io/projected/c70b425f-0629-4aad-812d-41537b548a5f-kube-api-access-g4hhs\") on node \"crc\" DevicePath \"\"" Nov 22 09:33:07 crc kubenswrapper[4743]: I1122 09:33:07.042417 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6d9km" event={"ID":"c70b425f-0629-4aad-812d-41537b548a5f","Type":"ContainerDied","Data":"b257b0948b6441e742788b23b517bfbbd216985a29b5bfce7005c379c349b6d0"} Nov 22 09:33:07 crc kubenswrapper[4743]: I1122 09:33:07.042471 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6d9km" Nov 22 09:33:07 crc kubenswrapper[4743]: I1122 09:33:07.042479 4743 scope.go:117] "RemoveContainer" containerID="3a1aeaf7c8bab22a7d34740514587ff210b11d63af622a252e47cc977b9604c7" Nov 22 09:33:07 crc kubenswrapper[4743]: I1122 09:33:07.077958 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6d9km"] Nov 22 09:33:07 crc kubenswrapper[4743]: I1122 09:33:07.084097 4743 scope.go:117] "RemoveContainer" containerID="fcd7853a1e628150ac73017d75ff4fbc547a99d9b6c35a3e4a629cfabfa6d98d" Nov 22 09:33:07 crc kubenswrapper[4743]: I1122 09:33:07.089086 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6d9km"] Nov 22 09:33:07 crc kubenswrapper[4743]: I1122 09:33:07.103122 4743 scope.go:117] "RemoveContainer" containerID="38e13c3e23fd8eb87054aab29b315e2b53df861e9c09bee8b4484f02d0e0727b" Nov 22 09:33:07 crc kubenswrapper[4743]: I1122 09:33:07.161177 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c70b425f-0629-4aad-812d-41537b548a5f" path="/var/lib/kubelet/pods/c70b425f-0629-4aad-812d-41537b548a5f/volumes" Nov 22 09:34:31 crc kubenswrapper[4743]: I1122 09:34:31.241155 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:34:31 crc kubenswrapper[4743]: I1122 09:34:31.241746 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:35:01 crc kubenswrapper[4743]: I1122 09:35:01.241261 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
Nov 22 09:35:01 crc kubenswrapper[4743]: I1122 09:35:01.241889 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:35:31 crc kubenswrapper[4743]: I1122 09:35:31.241350 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 09:35:31 crc kubenswrapper[4743]: I1122 09:35:31.242721 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 09:35:31 crc kubenswrapper[4743]: I1122 09:35:31.242822 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p"
Nov 22 09:35:31 crc kubenswrapper[4743]: I1122 09:35:31.243909 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 09:35:31 crc kubenswrapper[4743]: I1122 09:35:31.244011 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" gracePeriod=600
Nov 22 09:35:31 crc kubenswrapper[4743]: E1122 09:35:31.372766 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:35:32 crc kubenswrapper[4743]: I1122 09:35:32.234366 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" exitCode=0
Nov 22 09:35:32 crc kubenswrapper[4743]: I1122 09:35:32.234494 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"}
Nov 22 09:35:32 crc kubenswrapper[4743]: I1122 09:35:32.235084 4743 scope.go:117] "RemoveContainer" containerID="a2541c84f477d94b6b95264fa987196c77e3c9a933370a0e137c2817afdd5f2f"
Nov 22 09:35:32 crc kubenswrapper[4743]: I1122 09:35:32.236260 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:35:32 crc kubenswrapper[4743]: E1122 09:35:32.238984 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:35:47 crc kubenswrapper[4743]: I1122 09:35:47.160082 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:35:47 crc kubenswrapper[4743]: E1122 09:35:47.161205 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:36:02 crc kubenswrapper[4743]: I1122 09:36:02.152566 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:36:02 crc kubenswrapper[4743]: E1122 09:36:02.153702 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:36:16 crc kubenswrapper[4743]: I1122 09:36:16.151408 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:36:16 crc kubenswrapper[4743]: E1122 09:36:16.152306 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:36:31 crc kubenswrapper[4743]: I1122 09:36:31.151342 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:36:31 crc kubenswrapper[4743]: E1122 09:36:31.152659 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:36:46 crc kubenswrapper[4743]: I1122 09:36:46.152163 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:36:46 crc kubenswrapper[4743]: E1122 09:36:46.153064 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:36:57 crc kubenswrapper[4743]: I1122 09:36:57.155911 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:36:57 crc kubenswrapper[4743]: E1122 09:36:57.156679 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:37:08 crc kubenswrapper[4743]: I1122 09:37:08.151653 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:37:08 crc kubenswrapper[4743]: E1122 09:37:08.152411 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
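The long run of "back-off 5m0s restarting failed container" errors above is the kubelet's restart backoff at its cap: each sync attempt for machine-config-daemon-xk98p is rejected until the 5m window expires, which is why the RemoveContainer/Error pairs repeat every 10 to 15 seconds without the container actually starting. Kubelet doubles the restart delay per consecutive failure up to a 5m ceiling; a sketch of that policy, where the 10s initial delay is the commonly cited default and stated here as an assumption:

package main

import (
	"fmt"
	"time"
)

// restartDelay doubles an initial delay per consecutive failure, capped at
// 5m0s, the exact figure in the CrashLoopBackOff messages above.
func restartDelay(failures int) time.Duration {
	const (
		initial  = 10 * time.Second // assumed default initial backoff
		maxDelay = 5 * time.Minute  // "back-off 5m0s" once fully escalated
	)
	d := initial
	for i := 0; i < failures; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for f := 0; f <= 6; f++ {
		fmt.Printf("failures=%d delay=%v\n", f, restartDelay(f))
	}
}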
removing state" podUID="c70b425f-0629-4aad-812d-41537b548a5f" containerName="registry-server" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.681848 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.686226 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.686252 4743 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-4ltb4" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.686999 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.687291 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.689569 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qsb69"] Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.723112 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/38ca632b-b8af-4cad-9138-bdded3a9c7b8-crc-storage\") pod \"crc-storage-crc-qsb69\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.723171 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/38ca632b-b8af-4cad-9138-bdded3a9c7b8-node-mnt\") pod \"crc-storage-crc-qsb69\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.723193 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwg84\" (UniqueName: \"kubernetes.io/projected/38ca632b-b8af-4cad-9138-bdded3a9c7b8-kube-api-access-wwg84\") pod \"crc-storage-crc-qsb69\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.824248 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/38ca632b-b8af-4cad-9138-bdded3a9c7b8-node-mnt\") pod \"crc-storage-crc-qsb69\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.824330 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwg84\" (UniqueName: \"kubernetes.io/projected/38ca632b-b8af-4cad-9138-bdded3a9c7b8-kube-api-access-wwg84\") pod \"crc-storage-crc-qsb69\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.824434 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/38ca632b-b8af-4cad-9138-bdded3a9c7b8-crc-storage\") pod \"crc-storage-crc-qsb69\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.824454 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-mnt\" (UniqueName: \"kubernetes.io/host-path/38ca632b-b8af-4cad-9138-bdded3a9c7b8-node-mnt\") pod \"crc-storage-crc-qsb69\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.825083 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/38ca632b-b8af-4cad-9138-bdded3a9c7b8-crc-storage\") pod \"crc-storage-crc-qsb69\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:10 crc kubenswrapper[4743]: I1122 09:37:10.848171 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwg84\" (UniqueName: \"kubernetes.io/projected/38ca632b-b8af-4cad-9138-bdded3a9c7b8-kube-api-access-wwg84\") pod \"crc-storage-crc-qsb69\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:11 crc kubenswrapper[4743]: I1122 09:37:11.020883 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:11 crc kubenswrapper[4743]: I1122 09:37:11.172715 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b57e70a1-7869-42b1-bdaf-2b7199c5e463" path="/var/lib/kubelet/pods/b57e70a1-7869-42b1-bdaf-2b7199c5e463/volumes" Nov 22 09:37:11 crc kubenswrapper[4743]: I1122 09:37:11.528903 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-qsb69"] Nov 22 09:37:12 crc kubenswrapper[4743]: I1122 09:37:12.145587 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qsb69" event={"ID":"38ca632b-b8af-4cad-9138-bdded3a9c7b8","Type":"ContainerStarted","Data":"bdb405594bf0e15754b08d3ec29f3d578df3f35d7a5ad57c8dcf279243995400"} Nov 22 09:37:13 crc kubenswrapper[4743]: I1122 09:37:13.158928 4743 generic.go:334] "Generic (PLEG): container finished" podID="38ca632b-b8af-4cad-9138-bdded3a9c7b8" containerID="4fa8b8a76399829b269c653da64f2d72e76f48f14658f0bc2f878402abcc19f6" exitCode=0 Nov 22 09:37:13 crc kubenswrapper[4743]: I1122 09:37:13.167450 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qsb69" event={"ID":"38ca632b-b8af-4cad-9138-bdded3a9c7b8","Type":"ContainerDied","Data":"4fa8b8a76399829b269c653da64f2d72e76f48f14658f0bc2f878402abcc19f6"} Nov 22 09:37:14 crc kubenswrapper[4743]: I1122 09:37:14.465605 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:14 crc kubenswrapper[4743]: I1122 09:37:14.577729 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/38ca632b-b8af-4cad-9138-bdded3a9c7b8-crc-storage\") pod \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " Nov 22 09:37:14 crc kubenswrapper[4743]: I1122 09:37:14.577822 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwg84\" (UniqueName: \"kubernetes.io/projected/38ca632b-b8af-4cad-9138-bdded3a9c7b8-kube-api-access-wwg84\") pod \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " Nov 22 09:37:14 crc kubenswrapper[4743]: I1122 09:37:14.577864 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/38ca632b-b8af-4cad-9138-bdded3a9c7b8-node-mnt\") pod \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\" (UID: \"38ca632b-b8af-4cad-9138-bdded3a9c7b8\") " Nov 22 09:37:14 crc kubenswrapper[4743]: I1122 09:37:14.578053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38ca632b-b8af-4cad-9138-bdded3a9c7b8-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "38ca632b-b8af-4cad-9138-bdded3a9c7b8" (UID: "38ca632b-b8af-4cad-9138-bdded3a9c7b8"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:37:14 crc kubenswrapper[4743]: I1122 09:37:14.578242 4743 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/38ca632b-b8af-4cad-9138-bdded3a9c7b8-node-mnt\") on node \"crc\" DevicePath \"\"" Nov 22 09:37:14 crc kubenswrapper[4743]: I1122 09:37:14.583791 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ca632b-b8af-4cad-9138-bdded3a9c7b8-kube-api-access-wwg84" (OuterVolumeSpecName: "kube-api-access-wwg84") pod "38ca632b-b8af-4cad-9138-bdded3a9c7b8" (UID: "38ca632b-b8af-4cad-9138-bdded3a9c7b8"). InnerVolumeSpecName "kube-api-access-wwg84". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:37:14 crc kubenswrapper[4743]: I1122 09:37:14.601121 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38ca632b-b8af-4cad-9138-bdded3a9c7b8-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "38ca632b-b8af-4cad-9138-bdded3a9c7b8" (UID: "38ca632b-b8af-4cad-9138-bdded3a9c7b8"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:37:14 crc kubenswrapper[4743]: I1122 09:37:14.679936 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwg84\" (UniqueName: \"kubernetes.io/projected/38ca632b-b8af-4cad-9138-bdded3a9c7b8-kube-api-access-wwg84\") on node \"crc\" DevicePath \"\"" Nov 22 09:37:14 crc kubenswrapper[4743]: I1122 09:37:14.679976 4743 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/38ca632b-b8af-4cad-9138-bdded3a9c7b8-crc-storage\") on node \"crc\" DevicePath \"\"" Nov 22 09:37:15 crc kubenswrapper[4743]: I1122 09:37:15.178615 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-qsb69" event={"ID":"38ca632b-b8af-4cad-9138-bdded3a9c7b8","Type":"ContainerDied","Data":"bdb405594bf0e15754b08d3ec29f3d578df3f35d7a5ad57c8dcf279243995400"} Nov 22 09:37:15 crc kubenswrapper[4743]: I1122 09:37:15.178687 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdb405594bf0e15754b08d3ec29f3d578df3f35d7a5ad57c8dcf279243995400" Nov 22 09:37:15 crc kubenswrapper[4743]: I1122 09:37:15.179318 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-qsb69" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.209422 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-qsb69"] Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.217095 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-qsb69"] Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.393880 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-k2vtw"] Nov 22 09:37:17 crc kubenswrapper[4743]: E1122 09:37:17.394171 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ca632b-b8af-4cad-9138-bdded3a9c7b8" containerName="storage" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.394186 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ca632b-b8af-4cad-9138-bdded3a9c7b8" containerName="storage" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.394313 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ca632b-b8af-4cad-9138-bdded3a9c7b8" containerName="storage" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.394771 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.396390 4743 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-4ltb4" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.397208 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.397229 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.397312 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.399163 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-k2vtw"] Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.531842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-node-mnt\") pod \"crc-storage-crc-k2vtw\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") " pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.532038 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-crc-storage\") pod \"crc-storage-crc-k2vtw\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") " pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.532144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d2qw\" (UniqueName: \"kubernetes.io/projected/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-kube-api-access-4d2qw\") pod \"crc-storage-crc-k2vtw\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") " pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.633909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-node-mnt\") pod \"crc-storage-crc-k2vtw\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") " pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.634062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-crc-storage\") pod \"crc-storage-crc-k2vtw\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") " pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.634112 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d2qw\" (UniqueName: \"kubernetes.io/projected/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-kube-api-access-4d2qw\") pod \"crc-storage-crc-k2vtw\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") " pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.634191 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-node-mnt\") pod \"crc-storage-crc-k2vtw\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") " 
pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.635107 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-crc-storage\") pod \"crc-storage-crc-k2vtw\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") " pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.660203 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d2qw\" (UniqueName: \"kubernetes.io/projected/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-kube-api-access-4d2qw\") pod \"crc-storage-crc-k2vtw\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") " pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:17 crc kubenswrapper[4743]: I1122 09:37:17.708228 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k2vtw" Nov 22 09:37:18 crc kubenswrapper[4743]: I1122 09:37:18.170085 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-k2vtw"] Nov 22 09:37:18 crc kubenswrapper[4743]: W1122 09:37:18.181180 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7193d9e7_a263_4bf5_b0c8_2afc8fc43704.slice/crio-e05b696beefcb1c7ffa24716a35a2fe29ddcb3c5e7c282e673b96f6d1f80cdbe WatchSource:0}: Error finding container e05b696beefcb1c7ffa24716a35a2fe29ddcb3c5e7c282e673b96f6d1f80cdbe: Status 404 returned error can't find the container with id e05b696beefcb1c7ffa24716a35a2fe29ddcb3c5e7c282e673b96f6d1f80cdbe Nov 22 09:37:18 crc kubenswrapper[4743]: I1122 09:37:18.226457 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k2vtw" event={"ID":"7193d9e7-a263-4bf5-b0c8-2afc8fc43704","Type":"ContainerStarted","Data":"e05b696beefcb1c7ffa24716a35a2fe29ddcb3c5e7c282e673b96f6d1f80cdbe"} Nov 22 09:37:19 crc kubenswrapper[4743]: I1122 09:37:19.150994 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:37:19 crc kubenswrapper[4743]: E1122 09:37:19.152529 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:37:19 crc kubenswrapper[4743]: I1122 09:37:19.161822 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ca632b-b8af-4cad-9138-bdded3a9c7b8" path="/var/lib/kubelet/pods/38ca632b-b8af-4cad-9138-bdded3a9c7b8/volumes" Nov 22 09:37:19 crc kubenswrapper[4743]: I1122 09:37:19.238822 4743 generic.go:334] "Generic (PLEG): container finished" podID="7193d9e7-a263-4bf5-b0c8-2afc8fc43704" containerID="7dc0dec14cc1125d6aea002ec2065b9d0e0c0222fd2e9f023710ad832e993dc8" exitCode=0 Nov 22 09:37:19 crc kubenswrapper[4743]: I1122 09:37:19.238875 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k2vtw" event={"ID":"7193d9e7-a263-4bf5-b0c8-2afc8fc43704","Type":"ContainerDied","Data":"7dc0dec14cc1125d6aea002ec2065b9d0e0c0222fd2e9f023710ad832e993dc8"} Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.581211 4743 util.go:48] "No ready 
Nov 22 09:37:18 crc kubenswrapper[4743]: I1122 09:37:18.226457 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k2vtw" event={"ID":"7193d9e7-a263-4bf5-b0c8-2afc8fc43704","Type":"ContainerStarted","Data":"e05b696beefcb1c7ffa24716a35a2fe29ddcb3c5e7c282e673b96f6d1f80cdbe"}
Nov 22 09:37:19 crc kubenswrapper[4743]: I1122 09:37:19.150994 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:37:19 crc kubenswrapper[4743]: E1122 09:37:19.152529 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:37:19 crc kubenswrapper[4743]: I1122 09:37:19.161822 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ca632b-b8af-4cad-9138-bdded3a9c7b8" path="/var/lib/kubelet/pods/38ca632b-b8af-4cad-9138-bdded3a9c7b8/volumes"
Nov 22 09:37:19 crc kubenswrapper[4743]: I1122 09:37:19.238822 4743 generic.go:334] "Generic (PLEG): container finished" podID="7193d9e7-a263-4bf5-b0c8-2afc8fc43704" containerID="7dc0dec14cc1125d6aea002ec2065b9d0e0c0222fd2e9f023710ad832e993dc8" exitCode=0
Nov 22 09:37:19 crc kubenswrapper[4743]: I1122 09:37:19.238875 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k2vtw" event={"ID":"7193d9e7-a263-4bf5-b0c8-2afc8fc43704","Type":"ContainerDied","Data":"7dc0dec14cc1125d6aea002ec2065b9d0e0c0222fd2e9f023710ad832e993dc8"}
Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.581211 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k2vtw"
Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.681527 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-crc-storage\") pod \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") "
Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.681632 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-node-mnt\") pod \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") "
Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.681722 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7193d9e7-a263-4bf5-b0c8-2afc8fc43704" (UID: "7193d9e7-a263-4bf5-b0c8-2afc8fc43704"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.681750 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d2qw\" (UniqueName: \"kubernetes.io/projected/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-kube-api-access-4d2qw\") pod \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\" (UID: \"7193d9e7-a263-4bf5-b0c8-2afc8fc43704\") "
Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.681971 4743 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-node-mnt\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.690921 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-kube-api-access-4d2qw" (OuterVolumeSpecName: "kube-api-access-4d2qw") pod "7193d9e7-a263-4bf5-b0c8-2afc8fc43704" (UID: "7193d9e7-a263-4bf5-b0c8-2afc8fc43704"). InnerVolumeSpecName "kube-api-access-4d2qw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.702060 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7193d9e7-a263-4bf5-b0c8-2afc8fc43704" (UID: "7193d9e7-a263-4bf5-b0c8-2afc8fc43704"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.783312 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d2qw\" (UniqueName: \"kubernetes.io/projected/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-kube-api-access-4d2qw\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:20 crc kubenswrapper[4743]: I1122 09:37:20.783360 4743 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7193d9e7-a263-4bf5-b0c8-2afc8fc43704-crc-storage\") on node \"crc\" DevicePath \"\""
Nov 22 09:37:21 crc kubenswrapper[4743]: I1122 09:37:21.261425 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-k2vtw" event={"ID":"7193d9e7-a263-4bf5-b0c8-2afc8fc43704","Type":"ContainerDied","Data":"e05b696beefcb1c7ffa24716a35a2fe29ddcb3c5e7c282e673b96f6d1f80cdbe"}
Nov 22 09:37:21 crc kubenswrapper[4743]: I1122 09:37:21.261492 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e05b696beefcb1c7ffa24716a35a2fe29ddcb3c5e7c282e673b96f6d1f80cdbe"
Nov 22 09:37:21 crc kubenswrapper[4743]: I1122 09:37:21.261559 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-k2vtw"
Nov 22 09:37:30 crc kubenswrapper[4743]: I1122 09:37:30.151830 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:37:30 crc kubenswrapper[4743]: E1122 09:37:30.152740 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:37:31 crc kubenswrapper[4743]: I1122 09:37:31.528093 4743 scope.go:117] "RemoveContainer" containerID="50f7684c7231686be26e388276494b4a329a83b9d59dc01f91ca9a1d238c0115"
Nov 22 09:37:42 crc kubenswrapper[4743]: I1122 09:37:42.151752 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:37:42 crc kubenswrapper[4743]: E1122 09:37:42.153108 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:37:57 crc kubenswrapper[4743]: I1122 09:37:57.156543 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae"
Nov 22 09:37:57 crc kubenswrapper[4743]: E1122 09:37:57.157342 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.227895 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qnr"]
Nov 22 09:38:03 crc kubenswrapper[4743]: E1122 09:38:03.229134 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7193d9e7-a263-4bf5-b0c8-2afc8fc43704" containerName="storage"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.229153 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7193d9e7-a263-4bf5-b0c8-2afc8fc43704" containerName="storage"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.229388 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7193d9e7-a263-4bf5-b0c8-2afc8fc43704" containerName="storage"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.231556 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.241144 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qnr"]
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.345093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-catalog-content\") pod \"redhat-marketplace-r5qnr\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") " pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.345248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-utilities\") pod \"redhat-marketplace-r5qnr\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") " pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.345348 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cwt7\" (UniqueName: \"kubernetes.io/projected/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-kube-api-access-9cwt7\") pod \"redhat-marketplace-r5qnr\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") " pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.446968 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-utilities\") pod \"redhat-marketplace-r5qnr\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") " pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.447099 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cwt7\" (UniqueName: \"kubernetes.io/projected/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-kube-api-access-9cwt7\") pod \"redhat-marketplace-r5qnr\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") " pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.447129 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-catalog-content\") pod \"redhat-marketplace-r5qnr\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") " pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.447464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-utilities\") pod \"redhat-marketplace-r5qnr\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") " pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.447778 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-catalog-content\") pod \"redhat-marketplace-r5qnr\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") " pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.479676 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cwt7\" (UniqueName: \"kubernetes.io/projected/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-kube-api-access-9cwt7\") pod \"redhat-marketplace-r5qnr\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") " pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:03 crc kubenswrapper[4743]: I1122 09:38:03.575362 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:04 crc kubenswrapper[4743]: I1122 09:38:04.076705 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qnr"]
Nov 22 09:38:04 crc kubenswrapper[4743]: I1122 09:38:04.657903 4743 generic.go:334] "Generic (PLEG): container finished" podID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerID="ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524" exitCode=0
Nov 22 09:38:04 crc kubenswrapper[4743]: I1122 09:38:04.657961 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qnr" event={"ID":"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0","Type":"ContainerDied","Data":"ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524"}
Nov 22 09:38:04 crc kubenswrapper[4743]: I1122 09:38:04.658000 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qnr" event={"ID":"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0","Type":"ContainerStarted","Data":"8bb46b5eb3b5a661379f40edc82f0cb68e1bc445da68ab75170e1c63eed8e1f7"}
Nov 22 09:38:04 crc kubenswrapper[4743]: I1122 09:38:04.661171 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 09:38:05 crc kubenswrapper[4743]: I1122 09:38:05.672994 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qnr" event={"ID":"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0","Type":"ContainerStarted","Data":"cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d"}
Nov 22 09:38:06 crc kubenswrapper[4743]: I1122 09:38:06.684827 4743 generic.go:334] "Generic (PLEG): container finished" podID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerID="cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d" exitCode=0
Nov 22 09:38:06 crc kubenswrapper[4743]: I1122 09:38:06.684913 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qnr" event={"ID":"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0","Type":"ContainerDied","Data":"cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d"}
Nov 22 09:38:07 crc kubenswrapper[4743]: I1122 09:38:07.701625 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qnr" event={"ID":"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0","Type":"ContainerStarted","Data":"d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3"}
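"Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider" a few entries up fires when an image pull finds the kubelet's cached registry credentials stale and reloads them before pulling. A TTL cache in that spirit, with a stand-in loader; this is a schematic, not kubelet's credentialprovider package:

package main

import (
	"fmt"
	"time"
)

// cachingProvider re-runs load() only when the cached value is older than
// ttl, logging a refresh the way provider.go does in the entries above.
type cachingProvider struct {
	ttl    time.Duration
	loaded time.Time
	config string
	load   func() string
}

func (p *cachingProvider) get() string {
	if time.Since(p.loaded) > p.ttl {
		fmt.Println("Refreshing cache for provider")
		p.config = p.load()
		p.loaded = time.Now()
	}
	return p.config
}

func main() {
	p := &cachingProvider{ttl: 5 * time.Minute, load: func() string { return "docker config keyring" }}
	_ = p.get() // first use always refreshes (zero loaded time is long past)
	_ = p.get() // cached: no refresh within the TTL
}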
pod="openshift-marketplace/redhat-marketplace-r5qnr" event={"ID":"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0","Type":"ContainerStarted","Data":"d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3"} Nov 22 09:38:11 crc kubenswrapper[4743]: I1122 09:38:11.152378 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:38:11 crc kubenswrapper[4743]: E1122 09:38:11.153104 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:38:13 crc kubenswrapper[4743]: I1122 09:38:13.576619 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r5qnr" Nov 22 09:38:13 crc kubenswrapper[4743]: I1122 09:38:13.577137 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r5qnr" Nov 22 09:38:13 crc kubenswrapper[4743]: I1122 09:38:13.635816 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r5qnr" Nov 22 09:38:13 crc kubenswrapper[4743]: I1122 09:38:13.662100 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r5qnr" podStartSLOduration=8.217950011 podStartE2EDuration="10.66207628s" podCreationTimestamp="2025-11-22 09:38:03 +0000 UTC" firstStartedPulling="2025-11-22 09:38:04.66083713 +0000 UTC m=+4558.367198182" lastFinishedPulling="2025-11-22 09:38:07.104963359 +0000 UTC m=+4560.811324451" observedRunningTime="2025-11-22 09:38:07.735074268 +0000 UTC m=+4561.441435330" watchObservedRunningTime="2025-11-22 09:38:13.66207628 +0000 UTC m=+4567.368437382" Nov 22 09:38:13 crc kubenswrapper[4743]: I1122 09:38:13.807952 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r5qnr" Nov 22 09:38:13 crc kubenswrapper[4743]: I1122 09:38:13.876056 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qnr"] Nov 22 09:38:15 crc kubenswrapper[4743]: I1122 09:38:15.776701 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r5qnr" podUID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerName="registry-server" containerID="cri-o://d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3" gracePeriod=2 Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.252111 4743 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 09:38:13 crc kubenswrapper[4743]: I1122 09:38:13.807952 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:13 crc kubenswrapper[4743]: I1122 09:38:13.876056 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qnr"]
Nov 22 09:38:15 crc kubenswrapper[4743]: I1122 09:38:15.776701 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r5qnr" podUID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerName="registry-server" containerID="cri-o://d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3" gracePeriod=2
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.252111 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.254210 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cwt7\" (UniqueName: \"kubernetes.io/projected/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-kube-api-access-9cwt7\") pod \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") "
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.254332 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-utilities\") pod \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") "
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.254418 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-catalog-content\") pod \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\" (UID: \"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0\") "
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.255239 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-utilities" (OuterVolumeSpecName: "utilities") pod "af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" (UID: "af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.269736 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-kube-api-access-9cwt7" (OuterVolumeSpecName: "kube-api-access-9cwt7") pod "af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" (UID: "af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0"). InnerVolumeSpecName "kube-api-access-9cwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.296701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" (UID: "af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.356880 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.357149 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cwt7\" (UniqueName: \"kubernetes.io/projected/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-kube-api-access-9cwt7\") on node \"crc\" DevicePath \"\""
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.357367 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.788044 4743 generic.go:334] "Generic (PLEG): container finished" podID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerID="d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3" exitCode=0
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.788135 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r5qnr"
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.788155 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qnr" event={"ID":"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0","Type":"ContainerDied","Data":"d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3"}
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.788384 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r5qnr" event={"ID":"af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0","Type":"ContainerDied","Data":"8bb46b5eb3b5a661379f40edc82f0cb68e1bc445da68ab75170e1c63eed8e1f7"}
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.788414 4743 scope.go:117] "RemoveContainer" containerID="d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3"
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.843629 4743 scope.go:117] "RemoveContainer" containerID="cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d"
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.879343 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qnr"]
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.892956 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r5qnr"]
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.893068 4743 scope.go:117] "RemoveContainer" containerID="ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524"
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.925550 4743 scope.go:117] "RemoveContainer" containerID="d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3"
Nov 22 09:38:16 crc kubenswrapper[4743]: E1122 09:38:16.926405 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3\": container with ID starting with d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3 not found: ID does not exist" containerID="d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3"
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.926477 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3"} err="failed to get container status \"d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3\": rpc error: code = NotFound desc = could not find container \"d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3\": container with ID starting with d53038fd324d20459a883eae99990ee8e41bd08f8fd6b44a31009476538b18b3 not found: ID does not exist"
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.926529 4743 scope.go:117] "RemoveContainer" containerID="cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d"
Nov 22 09:38:16 crc kubenswrapper[4743]: E1122 09:38:16.927377 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d\": container with ID starting with cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d not found: ID does not exist" containerID="cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d"
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.927566 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d"} err="failed to get container status \"cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d\": rpc error: code = NotFound desc = could not find container \"cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d\": container with ID starting with cd90df14443d74d1b7a457d4ff04ce449f52ff729bb40afc8af25101adf00a0d not found: ID does not exist"
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.927739 4743 scope.go:117] "RemoveContainer" containerID="ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524"
Nov 22 09:38:16 crc kubenswrapper[4743]: E1122 09:38:16.928381 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524\": container with ID starting with ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524 not found: ID does not exist" containerID="ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524"
Nov 22 09:38:16 crc kubenswrapper[4743]: I1122 09:38:16.928536 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524"} err="failed to get container status \"ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524\": rpc error: code = NotFound desc = could not find container \"ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524\": container with ID starting with ef8753212be0e4a8a8b29b506205c52569aa5f440b52a003b0f8ea49c5859524 not found: ID does not exist"
Nov 22 09:38:17 crc kubenswrapper[4743]: I1122 09:38:17.165237 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" path="/var/lib/kubelet/pods/af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0/volumes"
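[Annotation] The "ContainerStatus from runtime service failed ... NotFound" errors above are a benign race: the kubelet re-issues RemoveContainer for IDs it still has in scope after CRI-O has already deleted them, and treats NotFound as already-gone. Clients of the API server handle the same race the same way; a minimal sketch of an idempotent delete, assuming the `kubernetes` Python client (not part of the log):

```python
# Sketch: treat NotFound on delete as success, mirroring how the kubelet
# treats the NotFound errors above as "already removed".
from kubernetes import client, config
from kubernetes.client.rest import ApiException

config.load_kube_config()
v1 = client.CoreV1Api()

def delete_pod_idempotent(name: str, namespace: str) -> None:
    try:
        v1.delete_namespaced_pod(name=name, namespace=namespace)
    except ApiException as e:
        if e.status != 404:  # NotFound: someone else got there first
            raise
```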
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:38:36 crc kubenswrapper[4743]: I1122 09:38:36.151772 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:38:36 crc kubenswrapper[4743]: E1122 09:38:36.152437 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:38:50 crc kubenswrapper[4743]: I1122 09:38:50.151755 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:38:50 crc kubenswrapper[4743]: E1122 09:38:50.152607 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:39:04 crc kubenswrapper[4743]: I1122 09:39:04.151852 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:39:04 crc kubenswrapper[4743]: E1122 09:39:04.152987 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.617806 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cl8z6"] Nov 22 09:39:14 crc kubenswrapper[4743]: E1122 09:39:14.618593 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerName="extract-content" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.618605 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerName="extract-content" Nov 22 09:39:14 crc kubenswrapper[4743]: E1122 09:39:14.618613 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerName="registry-server" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.618619 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerName="registry-server" Nov 22 09:39:14 crc kubenswrapper[4743]: E1122 09:39:14.618628 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" 
containerName="extract-utilities" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.618634 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerName="extract-utilities" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.618768 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="af03f82c-6cd4-4fc9-afcc-f7da89f4fdc0" containerName="registry-server" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.619730 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.686719 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cl8z6"] Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.708412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-catalog-content\") pod \"community-operators-cl8z6\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.708518 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lndc4\" (UniqueName: \"kubernetes.io/projected/588a1581-2cfe-4ce0-b28e-dcb1e471a151-kube-api-access-lndc4\") pod \"community-operators-cl8z6\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.708643 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-utilities\") pod \"community-operators-cl8z6\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.810459 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lndc4\" (UniqueName: \"kubernetes.io/projected/588a1581-2cfe-4ce0-b28e-dcb1e471a151-kube-api-access-lndc4\") pod \"community-operators-cl8z6\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.810552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-utilities\") pod \"community-operators-cl8z6\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.810655 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-catalog-content\") pod \"community-operators-cl8z6\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.811236 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-catalog-content\") pod \"community-operators-cl8z6\" (UID: 
\"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.811346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-utilities\") pod \"community-operators-cl8z6\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.834598 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lndc4\" (UniqueName: \"kubernetes.io/projected/588a1581-2cfe-4ce0-b28e-dcb1e471a151-kube-api-access-lndc4\") pod \"community-operators-cl8z6\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:14 crc kubenswrapper[4743]: I1122 09:39:14.937929 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:15 crc kubenswrapper[4743]: I1122 09:39:15.483359 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cl8z6"] Nov 22 09:39:15 crc kubenswrapper[4743]: W1122 09:39:15.491465 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod588a1581_2cfe_4ce0_b28e_dcb1e471a151.slice/crio-9a63e1088c23065c72615d91f28943171980b6b0abcb5c065b0533e4e0c004ce WatchSource:0}: Error finding container 9a63e1088c23065c72615d91f28943171980b6b0abcb5c065b0533e4e0c004ce: Status 404 returned error can't find the container with id 9a63e1088c23065c72615d91f28943171980b6b0abcb5c065b0533e4e0c004ce Nov 22 09:39:16 crc kubenswrapper[4743]: I1122 09:39:16.353457 4743 generic.go:334] "Generic (PLEG): container finished" podID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" containerID="cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb" exitCode=0 Nov 22 09:39:16 crc kubenswrapper[4743]: I1122 09:39:16.353512 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl8z6" event={"ID":"588a1581-2cfe-4ce0-b28e-dcb1e471a151","Type":"ContainerDied","Data":"cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb"} Nov 22 09:39:16 crc kubenswrapper[4743]: I1122 09:39:16.353542 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl8z6" event={"ID":"588a1581-2cfe-4ce0-b28e-dcb1e471a151","Type":"ContainerStarted","Data":"9a63e1088c23065c72615d91f28943171980b6b0abcb5c065b0533e4e0c004ce"} Nov 22 09:39:17 crc kubenswrapper[4743]: I1122 09:39:17.155474 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:39:17 crc kubenswrapper[4743]: E1122 09:39:17.155996 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:39:17 crc kubenswrapper[4743]: I1122 09:39:17.362470 4743 generic.go:334] "Generic (PLEG): container finished" podID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" 
containerID="ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b" exitCode=0 Nov 22 09:39:17 crc kubenswrapper[4743]: I1122 09:39:17.362535 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl8z6" event={"ID":"588a1581-2cfe-4ce0-b28e-dcb1e471a151","Type":"ContainerDied","Data":"ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b"} Nov 22 09:39:18 crc kubenswrapper[4743]: I1122 09:39:18.390213 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl8z6" event={"ID":"588a1581-2cfe-4ce0-b28e-dcb1e471a151","Type":"ContainerStarted","Data":"bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f"} Nov 22 09:39:18 crc kubenswrapper[4743]: I1122 09:39:18.428691 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cl8z6" podStartSLOduration=2.9705055270000003 podStartE2EDuration="4.428550634s" podCreationTimestamp="2025-11-22 09:39:14 +0000 UTC" firstStartedPulling="2025-11-22 09:39:16.358336162 +0000 UTC m=+4630.064697244" lastFinishedPulling="2025-11-22 09:39:17.816381299 +0000 UTC m=+4631.522742351" observedRunningTime="2025-11-22 09:39:18.424716803 +0000 UTC m=+4632.131077905" watchObservedRunningTime="2025-11-22 09:39:18.428550634 +0000 UTC m=+4632.134911716" Nov 22 09:39:24 crc kubenswrapper[4743]: I1122 09:39:24.938716 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:24 crc kubenswrapper[4743]: I1122 09:39:24.939295 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:25 crc kubenswrapper[4743]: I1122 09:39:25.378780 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:25 crc kubenswrapper[4743]: I1122 09:39:25.532432 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:25 crc kubenswrapper[4743]: I1122 09:39:25.627484 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cl8z6"] Nov 22 09:39:27 crc kubenswrapper[4743]: I1122 09:39:27.477042 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cl8z6" podUID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" containerName="registry-server" containerID="cri-o://bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f" gracePeriod=2 Nov 22 09:39:27 crc kubenswrapper[4743]: I1122 09:39:27.934036 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.030933 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lndc4\" (UniqueName: \"kubernetes.io/projected/588a1581-2cfe-4ce0-b28e-dcb1e471a151-kube-api-access-lndc4\") pod \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.031011 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-utilities\") pod \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.031037 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-catalog-content\") pod \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\" (UID: \"588a1581-2cfe-4ce0-b28e-dcb1e471a151\") " Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.032332 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-utilities" (OuterVolumeSpecName: "utilities") pod "588a1581-2cfe-4ce0-b28e-dcb1e471a151" (UID: "588a1581-2cfe-4ce0-b28e-dcb1e471a151"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.037905 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588a1581-2cfe-4ce0-b28e-dcb1e471a151-kube-api-access-lndc4" (OuterVolumeSpecName: "kube-api-access-lndc4") pod "588a1581-2cfe-4ce0-b28e-dcb1e471a151" (UID: "588a1581-2cfe-4ce0-b28e-dcb1e471a151"). InnerVolumeSpecName "kube-api-access-lndc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.082140 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "588a1581-2cfe-4ce0-b28e-dcb1e471a151" (UID: "588a1581-2cfe-4ce0-b28e-dcb1e471a151"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.132463 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lndc4\" (UniqueName: \"kubernetes.io/projected/588a1581-2cfe-4ce0-b28e-dcb1e471a151-kube-api-access-lndc4\") on node \"crc\" DevicePath \"\"" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.132495 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.132506 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/588a1581-2cfe-4ce0-b28e-dcb1e471a151-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.489233 4743 generic.go:334] "Generic (PLEG): container finished" podID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" containerID="bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f" exitCode=0 Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.489291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl8z6" event={"ID":"588a1581-2cfe-4ce0-b28e-dcb1e471a151","Type":"ContainerDied","Data":"bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f"} Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.489503 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cl8z6" event={"ID":"588a1581-2cfe-4ce0-b28e-dcb1e471a151","Type":"ContainerDied","Data":"9a63e1088c23065c72615d91f28943171980b6b0abcb5c065b0533e4e0c004ce"} Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.489526 4743 scope.go:117] "RemoveContainer" containerID="bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.489315 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cl8z6" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.519245 4743 scope.go:117] "RemoveContainer" containerID="ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.527645 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cl8z6"] Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.536035 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cl8z6"] Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.541008 4743 scope.go:117] "RemoveContainer" containerID="cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.564017 4743 scope.go:117] "RemoveContainer" containerID="bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f" Nov 22 09:39:28 crc kubenswrapper[4743]: E1122 09:39:28.564358 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f\": container with ID starting with bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f not found: ID does not exist" containerID="bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.564405 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f"} err="failed to get container status \"bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f\": rpc error: code = NotFound desc = could not find container \"bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f\": container with ID starting with bd203ba3637e0a739bc3f704b5bfa93750327188f6844e54fa7d8e6e86846b1f not found: ID does not exist" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.564434 4743 scope.go:117] "RemoveContainer" containerID="ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b" Nov 22 09:39:28 crc kubenswrapper[4743]: E1122 09:39:28.564824 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b\": container with ID starting with ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b not found: ID does not exist" containerID="ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.564847 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b"} err="failed to get container status \"ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b\": rpc error: code = NotFound desc = could not find container \"ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b\": container with ID starting with ec484a89044d041a4a7ff70a43ad102b89b0c4eebea5d28df67c7a6abc38246b not found: ID does not exist" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.564884 4743 scope.go:117] "RemoveContainer" containerID="cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb" Nov 22 09:39:28 crc kubenswrapper[4743]: E1122 09:39:28.565094 4743 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb\": container with ID starting with cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb not found: ID does not exist" containerID="cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb" Nov 22 09:39:28 crc kubenswrapper[4743]: I1122 09:39:28.565120 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb"} err="failed to get container status \"cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb\": rpc error: code = NotFound desc = could not find container \"cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb\": container with ID starting with cd1bf480fedbc2ccdc3e7f97b0591fb157024d193570495136d101bf5f783bdb not found: ID does not exist" Nov 22 09:39:29 crc kubenswrapper[4743]: I1122 09:39:29.165220 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" path="/var/lib/kubelet/pods/588a1581-2cfe-4ce0-b28e-dcb1e471a151/volumes" Nov 22 09:39:30 crc kubenswrapper[4743]: I1122 09:39:30.151279 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:39:30 crc kubenswrapper[4743]: E1122 09:39:30.152714 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:39:45 crc kubenswrapper[4743]: I1122 09:39:45.152126 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:39:45 crc kubenswrapper[4743]: E1122 09:39:45.153278 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:40:00 crc kubenswrapper[4743]: I1122 09:40:00.151446 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:40:00 crc kubenswrapper[4743]: E1122 09:40:00.152297 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:40:14 crc kubenswrapper[4743]: I1122 09:40:14.152465 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:40:14 crc kubenswrapper[4743]: E1122 09:40:14.153988 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:40:27 crc kubenswrapper[4743]: I1122 09:40:27.159496 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:40:27 crc kubenswrapper[4743]: E1122 09:40:27.160435 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.159142 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.463543 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-2slv2"] Nov 22 09:40:39 crc kubenswrapper[4743]: E1122 09:40:39.464307 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" containerName="extract-utilities" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.464328 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" containerName="extract-utilities" Nov 22 09:40:39 crc kubenswrapper[4743]: E1122 09:40:39.464363 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" containerName="registry-server" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.464372 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" containerName="registry-server" Nov 22 09:40:39 crc kubenswrapper[4743]: E1122 09:40:39.464389 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" containerName="extract-content" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.464396 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" containerName="extract-content" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.464566 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="588a1581-2cfe-4ce0-b28e-dcb1e471a151" containerName="registry-server" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.465517 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.469899 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.470168 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.470829 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.471010 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-j68sd" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.475641 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-2slv2"] Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.477392 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.617614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4jmh\" (UniqueName: \"kubernetes.io/projected/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-kube-api-access-b4jmh\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.617699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.617735 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-config\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.673051 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x7jsl"] Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.674178 4743 util.go:30] "No sandbox for pod can be found. 
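[Annotation] The "Caches populated" lines above show the kubelet starting per-object reflectors for each ConfigMap and Secret the new pod references, so volume setup reads from a local list/watch cache instead of polling the API server. The same list+watch pattern is available to ordinary clients; a minimal sketch with the `kubernetes` Python client watching ConfigMaps in the same namespace (illustrative, not part of the log):

```python
# Sketch: the list+watch pattern behind the reflector "Caches populated"
# lines, using the kubernetes Python client against the same namespace.
from kubernetes import client, config, watch

config.load_kube_config()
v1 = client.CoreV1Api()

w = watch.Watch()
for event in w.stream(v1.list_namespaced_config_map,
                      namespace="openstack", timeout_seconds=5):
    print(event["type"], event["object"].metadata.name)
```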
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.617614 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4jmh\" (UniqueName: \"kubernetes.io/projected/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-kube-api-access-b4jmh\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.617699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.617735 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-config\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.673051 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x7jsl"]
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.674178 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.691256 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x7jsl"]
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.720432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.720475 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-config\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.720524 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4jmh\" (UniqueName: \"kubernetes.io/projected/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-kube-api-access-b4jmh\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.722005 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.722493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-config\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.756627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4jmh\" (UniqueName: \"kubernetes.io/projected/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-kube-api-access-b4jmh\") pod \"dnsmasq-dns-5d7b5456f5-2slv2\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.790087 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.821491 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-x7jsl\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") " pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.821632 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-config\") pod \"dnsmasq-dns-98ddfc8f-x7jsl\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") " pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.821687 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x676z\" (UniqueName: \"kubernetes.io/projected/d6676dcf-2992-4a50-a37a-feab61d327e4-kube-api-access-x676z\") pod \"dnsmasq-dns-98ddfc8f-x7jsl\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") " pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.923240 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-config\") pod \"dnsmasq-dns-98ddfc8f-x7jsl\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") " pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.923618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x676z\" (UniqueName: \"kubernetes.io/projected/d6676dcf-2992-4a50-a37a-feab61d327e4-kube-api-access-x676z\") pod \"dnsmasq-dns-98ddfc8f-x7jsl\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") " pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.923667 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-x7jsl\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") " pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.924198 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-config\") pod \"dnsmasq-dns-98ddfc8f-x7jsl\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") " pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.924379 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-x7jsl\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") " pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.953306 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x676z\" (UniqueName: \"kubernetes.io/projected/d6676dcf-2992-4a50-a37a-feab61d327e4-kube-api-access-x676z\") pod \"dnsmasq-dns-98ddfc8f-x7jsl\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") " pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:39 crc kubenswrapper[4743]: I1122 09:40:39.995178 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.054314 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"4146b0d7d55e01b0e59622b589a7a214eef363f3e0d0b2a21abc2d4eaf2f55f5"}
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.277518 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-2slv2"]
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.439014 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x7jsl"]
Nov 22 09:40:40 crc kubenswrapper[4743]: E1122 09:40:40.549278 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64e1f2f3_9b7b_46a9_9d81_d4187536df5a.slice/crio-28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10.scope\": RecentStats: unable to find data in memory cache]"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.562980 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.564368 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.566733 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.567232 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.567309 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.567549 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lfz2k"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.568343 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.579155 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.736143 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.737411 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.737479 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.737518 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.737556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.737598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm6x4\" (UniqueName: \"kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-kube-api-access-xm6x4\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.737656 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.737719 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.737781 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.797408 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.798792 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.801257 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5cvz5"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.801418 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.801444 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.802422 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.802522 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.811144 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.842128 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.842348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.843145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.844264 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-server-conf\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.844371 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.844429 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.844445 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm6x4\" (UniqueName: \"kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-kube-api-access-xm6x4\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.844486 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.845137 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.845546 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.845673 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.846206 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.846863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.849879 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.849937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-pod-info\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.853478 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.855502 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.855535 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8456811baf4ea1f550c890ff7a8ba9ccfae89e30299224b2d58e4c2bac2b9738/globalmount\"" pod="openstack/rabbitmq-server-0"
4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.855535 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8456811baf4ea1f550c890ff7a8ba9ccfae89e30299224b2d58e4c2bac2b9738/globalmount\"" pod="openstack/rabbitmq-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.866448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm6x4\" (UniqueName: \"kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-kube-api-access-xm6x4\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.897812 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") pod \"rabbitmq-server-0\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " pod="openstack/rabbitmq-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.910498 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.948468 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.948512 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjk2c\" (UniqueName: \"kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-kube-api-access-rjk2c\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.948530 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e468630-9fa8-4efb-af86-811dd40b6f3c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.948560 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.948612 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e468630-9fa8-4efb-af86-811dd40b6f3c-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.948646 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.948676 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.948694 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:40 crc kubenswrapper[4743]: I1122 09:40:40.948798 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.050456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.050521 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e468630-9fa8-4efb-af86-811dd40b6f3c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.050566 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.050626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.050651 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.050681 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.050714 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.050737 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjk2c\" (UniqueName: \"kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-kube-api-access-rjk2c\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.050754 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e468630-9fa8-4efb-af86-811dd40b6f3c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.051321 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.051536 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.051895 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.052669 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.054836 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " 
pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.055121 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e468630-9fa8-4efb-af86-811dd40b6f3c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.055520 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.055675 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f9bce1c2a821a60f09b63c9bc3276b70afe1c2ebc894453033219ddee0b70426/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.057040 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e468630-9fa8-4efb-af86-811dd40b6f3c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.069793 4743 generic.go:334] "Generic (PLEG): container finished" podID="d6676dcf-2992-4a50-a37a-feab61d327e4" containerID="2a87852c7f3766128169efb694821d1e68f435311a31e7bdb702c77650573ed0" exitCode=0 Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.069891 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl" event={"ID":"d6676dcf-2992-4a50-a37a-feab61d327e4","Type":"ContainerDied","Data":"2a87852c7f3766128169efb694821d1e68f435311a31e7bdb702c77650573ed0"} Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.069917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl" event={"ID":"d6676dcf-2992-4a50-a37a-feab61d327e4","Type":"ContainerStarted","Data":"1188912da7847e29b3eed1186d6ab312fece0a3f5f66056724d0fba4284e3fba"} Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.079331 4743 generic.go:334] "Generic (PLEG): container finished" podID="64e1f2f3-9b7b-46a9-9d81-d4187536df5a" containerID="28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10" exitCode=0 Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.079423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" event={"ID":"64e1f2f3-9b7b-46a9-9d81-d4187536df5a","Type":"ContainerDied","Data":"28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10"} Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.079459 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" event={"ID":"64e1f2f3-9b7b-46a9-9d81-d4187536df5a","Type":"ContainerStarted","Data":"2f2becffb625efa7429b3bf3d9e2688f6eaf15f4304e29f81d3870742baf12bf"} Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.080822 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjk2c\" (UniqueName: 
\"kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-kube-api-access-rjk2c\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.140883 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") pod \"rabbitmq-cell1-server-0\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.155071 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.190040 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:40:41 crc kubenswrapper[4743]: I1122 09:40:41.616667 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:40:41 crc kubenswrapper[4743]: W1122 09:40:41.639444 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e468630_9fa8_4efb_af86_811dd40b6f3c.slice/crio-7cb1152aaed510e5cf32b6a85c9aa9d2ecaa309e3a0f1f14c659b49f79faeb1d WatchSource:0}: Error finding container 7cb1152aaed510e5cf32b6a85c9aa9d2ecaa309e3a0f1f14c659b49f79faeb1d: Status 404 returned error can't find the container with id 7cb1152aaed510e5cf32b6a85c9aa9d2ecaa309e3a0f1f14c659b49f79faeb1d Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.089633 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e468630-9fa8-4efb-af86-811dd40b6f3c","Type":"ContainerStarted","Data":"7cb1152aaed510e5cf32b6a85c9aa9d2ecaa309e3a0f1f14c659b49f79faeb1d"} Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.090970 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"50de19a9-ddc9-4417-bb70-8057fa9dcdfb","Type":"ContainerStarted","Data":"5748efc96ea0592fd16b22c3ea4bb25a539ef3ca3ef45ef67f846a4dec30c24c"} Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.093512 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl" event={"ID":"d6676dcf-2992-4a50-a37a-feab61d327e4","Type":"ContainerStarted","Data":"d3e58b6a189af54ae625b579ec13f91891b6c2e96189a6c4eacf4ccff9bbf230"} Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.094418 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.096245 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" event={"ID":"64e1f2f3-9b7b-46a9-9d81-d4187536df5a","Type":"ContainerStarted","Data":"32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee"} Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.096738 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.145792 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" podStartSLOduration=3.145773263 podStartE2EDuration="3.145773263s" podCreationTimestamp="2025-11-22 09:40:39 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:40:42.140762299 +0000 UTC m=+4715.847123351" watchObservedRunningTime="2025-11-22 09:40:42.145773263 +0000 UTC m=+4715.852134315" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.146440 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl" podStartSLOduration=3.146434612 podStartE2EDuration="3.146434612s" podCreationTimestamp="2025-11-22 09:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:40:42.122191535 +0000 UTC m=+4715.828552597" watchObservedRunningTime="2025-11-22 09:40:42.146434612 +0000 UTC m=+4715.852795664" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.191106 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.193995 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.196380 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.196404 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.196604 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.196705 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4msxp" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.200061 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.206414 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.378459 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/087b5455-e53b-49da-a7d5-6d2317df7d4f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.378523 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/087b5455-e53b-49da-a7d5-6d2317df7d4f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.378551 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c45c758f-53d8-427c-a4ed-4595b879472b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c45c758f-53d8-427c-a4ed-4595b879472b\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.378606 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087b5455-e53b-49da-a7d5-6d2317df7d4f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.378652 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/087b5455-e53b-49da-a7d5-6d2317df7d4f-kolla-config\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.378675 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/087b5455-e53b-49da-a7d5-6d2317df7d4f-config-data-default\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.378704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/087b5455-e53b-49da-a7d5-6d2317df7d4f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.378729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkx9h\" (UniqueName: \"kubernetes.io/projected/087b5455-e53b-49da-a7d5-6d2317df7d4f-kube-api-access-kkx9h\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.457603 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.458768 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.460637 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-f5mkw" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.467830 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.468800 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.480047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/087b5455-e53b-49da-a7d5-6d2317df7d4f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.480097 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/087b5455-e53b-49da-a7d5-6d2317df7d4f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.480116 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c45c758f-53d8-427c-a4ed-4595b879472b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c45c758f-53d8-427c-a4ed-4595b879472b\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.480150 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087b5455-e53b-49da-a7d5-6d2317df7d4f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.480171 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/087b5455-e53b-49da-a7d5-6d2317df7d4f-kolla-config\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.480189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/087b5455-e53b-49da-a7d5-6d2317df7d4f-config-data-default\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.480209 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/087b5455-e53b-49da-a7d5-6d2317df7d4f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.480229 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkx9h\" (UniqueName: \"kubernetes.io/projected/087b5455-e53b-49da-a7d5-6d2317df7d4f-kube-api-access-kkx9h\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " 
pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.481664 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/087b5455-e53b-49da-a7d5-6d2317df7d4f-operator-scripts\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.481792 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/087b5455-e53b-49da-a7d5-6d2317df7d4f-kolla-config\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.481795 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/087b5455-e53b-49da-a7d5-6d2317df7d4f-config-data-default\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.482071 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/087b5455-e53b-49da-a7d5-6d2317df7d4f-config-data-generated\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.484107 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.484130 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c45c758f-53d8-427c-a4ed-4595b879472b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c45c758f-53d8-427c-a4ed-4595b879472b\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4759025a1314a7dd8b9e36b04b11de524e061dafee6cf1b753c176e5df454e1f/globalmount\"" pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.581070 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj465\" (UniqueName: \"kubernetes.io/projected/3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce-kube-api-access-sj465\") pod \"memcached-0\" (UID: \"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce\") " pod="openstack/memcached-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.581182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce-kolla-config\") pod \"memcached-0\" (UID: \"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce\") " pod="openstack/memcached-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.581226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce-config-data\") pod \"memcached-0\" (UID: \"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce\") " pod="openstack/memcached-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.682470 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce-kolla-config\") pod \"memcached-0\" (UID: \"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce\") " pod="openstack/memcached-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.682537 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce-config-data\") pod \"memcached-0\" (UID: \"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce\") " pod="openstack/memcached-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.682601 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj465\" (UniqueName: \"kubernetes.io/projected/3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce-kube-api-access-sj465\") pod \"memcached-0\" (UID: \"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce\") " pod="openstack/memcached-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.683357 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce-kolla-config\") pod \"memcached-0\" (UID: \"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce\") " pod="openstack/memcached-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.683440 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce-config-data\") pod \"memcached-0\" (UID: \"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce\") " pod="openstack/memcached-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.740429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/087b5455-e53b-49da-a7d5-6d2317df7d4f-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.740644 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/087b5455-e53b-49da-a7d5-6d2317df7d4f-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.743823 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkx9h\" (UniqueName: \"kubernetes.io/projected/087b5455-e53b-49da-a7d5-6d2317df7d4f-kube-api-access-kkx9h\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.744049 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj465\" (UniqueName: \"kubernetes.io/projected/3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce-kube-api-access-sj465\") pod \"memcached-0\" (UID: \"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce\") " pod="openstack/memcached-0" Nov 22 09:40:42 crc kubenswrapper[4743]: I1122 09:40:42.826347 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.052549 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c45c758f-53d8-427c-a4ed-4595b879472b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c45c758f-53d8-427c-a4ed-4595b879472b\") pod \"openstack-galera-0\" (UID: \"087b5455-e53b-49da-a7d5-6d2317df7d4f\") " pod="openstack/openstack-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.103155 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"50de19a9-ddc9-4417-bb70-8057fa9dcdfb","Type":"ContainerStarted","Data":"968adc6c9d405475d4d1b1c69c86a0ec387890f0b790fce61334cfba80d49542"} Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.118493 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.244112 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Nov 22 09:40:43 crc kubenswrapper[4743]: W1122 09:40:43.260389 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3362e3d8_3a46_4ef9_abb5_0c75ea0d28ce.slice/crio-32152184bd8038d779298095beb656a94d75bc9e505836aff29b9262ad8fadd6 WatchSource:0}: Error finding container 32152184bd8038d779298095beb656a94d75bc9e505836aff29b9262ad8fadd6: Status 404 returned error can't find the container with id 32152184bd8038d779298095beb656a94d75bc9e505836aff29b9262ad8fadd6 Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.569957 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.601405 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.602745 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.605309 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.607013 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-rrzgw" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.608552 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.608844 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.616602 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.705355 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e64bc7-078f-4609-add5-ac4679314d0a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.705509 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e64bc7-078f-4609-add5-ac4679314d0a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.705528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0e64bc7-078f-4609-add5-ac4679314d0a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.705557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvzn9\" (UniqueName: \"kubernetes.io/projected/e0e64bc7-078f-4609-add5-ac4679314d0a-kube-api-access-tvzn9\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.705591 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0e64bc7-078f-4609-add5-ac4679314d0a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.705636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9960f28f-3866-4319-9ff2-987d29984290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9960f28f-3866-4319-9ff2-987d29984290\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.705661 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0e64bc7-078f-4609-add5-ac4679314d0a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.705694 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e64bc7-078f-4609-add5-ac4679314d0a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.807984 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e64bc7-078f-4609-add5-ac4679314d0a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.808047 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e64bc7-078f-4609-add5-ac4679314d0a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.808067 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0e64bc7-078f-4609-add5-ac4679314d0a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.808147 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvzn9\" (UniqueName: \"kubernetes.io/projected/e0e64bc7-078f-4609-add5-ac4679314d0a-kube-api-access-tvzn9\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.808189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0e64bc7-078f-4609-add5-ac4679314d0a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.808277 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9960f28f-3866-4319-9ff2-987d29984290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9960f28f-3866-4319-9ff2-987d29984290\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.808303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0e64bc7-078f-4609-add5-ac4679314d0a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.808372 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e64bc7-078f-4609-add5-ac4679314d0a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.810257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e0e64bc7-078f-4609-add5-ac4679314d0a-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.812592 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e0e64bc7-078f-4609-add5-ac4679314d0a-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.813247 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e0e64bc7-078f-4609-add5-ac4679314d0a-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.813761 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0e64bc7-078f-4609-add5-ac4679314d0a-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.814327 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.814379 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9960f28f-3866-4319-9ff2-987d29984290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9960f28f-3866-4319-9ff2-987d29984290\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/489b5ff4f93a8f5c94e493cea28b03f6ae0cce91b0419394c080a2fcc715adfd/globalmount\"" pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.816138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0e64bc7-078f-4609-add5-ac4679314d0a-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.816490 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0e64bc7-078f-4609-add5-ac4679314d0a-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.829367 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvzn9\" (UniqueName: \"kubernetes.io/projected/e0e64bc7-078f-4609-add5-ac4679314d0a-kube-api-access-tvzn9\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.846093 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9960f28f-3866-4319-9ff2-987d29984290\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9960f28f-3866-4319-9ff2-987d29984290\") pod \"openstack-cell1-galera-0\" (UID: \"e0e64bc7-078f-4609-add5-ac4679314d0a\") " pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:43 crc kubenswrapper[4743]: I1122 09:40:43.963939 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:44 crc kubenswrapper[4743]: I1122 09:40:44.113446 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce","Type":"ContainerStarted","Data":"5d3f25aa48a069dc6281b7d2d12fb1f059afee16c347bc1e10b3d173bcde0856"} Nov 22 09:40:44 crc kubenswrapper[4743]: I1122 09:40:44.113502 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce","Type":"ContainerStarted","Data":"32152184bd8038d779298095beb656a94d75bc9e505836aff29b9262ad8fadd6"} Nov 22 09:40:44 crc kubenswrapper[4743]: I1122 09:40:44.113635 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Nov 22 09:40:44 crc kubenswrapper[4743]: I1122 09:40:44.114806 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"087b5455-e53b-49da-a7d5-6d2317df7d4f","Type":"ContainerStarted","Data":"1d11106bb3b55c5a1a836f4b413ecbd5601286e82bc73479d874f30d8b7cb9d2"} Nov 22 09:40:44 crc kubenswrapper[4743]: I1122 09:40:44.114837 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"087b5455-e53b-49da-a7d5-6d2317df7d4f","Type":"ContainerStarted","Data":"8b852d860086f2436e649b0b3231b082101a8bba26fc8bbcca8dae2a826bb69e"} Nov 22 09:40:44 crc kubenswrapper[4743]: I1122 09:40:44.117066 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e468630-9fa8-4efb-af86-811dd40b6f3c","Type":"ContainerStarted","Data":"e713d8c4e4e8e5b440ab5542ab098efda9d0c5b8b4d416ca70682cd59674062f"} Nov 22 09:40:44 crc kubenswrapper[4743]: I1122 09:40:44.131301 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.13128045 podStartE2EDuration="2.13128045s" podCreationTimestamp="2025-11-22 09:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:40:44.127431589 +0000 UTC m=+4717.833792651" watchObservedRunningTime="2025-11-22 09:40:44.13128045 +0000 UTC m=+4717.837641502" Nov 22 09:40:44 crc kubenswrapper[4743]: I1122 09:40:44.554375 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Nov 22 09:40:45 crc kubenswrapper[4743]: I1122 09:40:45.125430 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0e64bc7-078f-4609-add5-ac4679314d0a","Type":"ContainerStarted","Data":"affc9440d24250ed7bcbc477e88b4a6823028730de8738d5ddbcc46734eee31f"} Nov 22 09:40:45 crc kubenswrapper[4743]: I1122 09:40:45.125850 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0e64bc7-078f-4609-add5-ac4679314d0a","Type":"ContainerStarted","Data":"914963dc06af15df33ba8f3a7aaac041970f50ee43bfc37d09f61b68b4202c69"} Nov 22 09:40:48 crc kubenswrapper[4743]: I1122 09:40:48.148249 4743 generic.go:334] "Generic (PLEG): container finished" podID="087b5455-e53b-49da-a7d5-6d2317df7d4f" containerID="1d11106bb3b55c5a1a836f4b413ecbd5601286e82bc73479d874f30d8b7cb9d2" exitCode=0 Nov 22 09:40:48 crc kubenswrapper[4743]: I1122 09:40:48.148337 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"087b5455-e53b-49da-a7d5-6d2317df7d4f","Type":"ContainerDied","Data":"1d11106bb3b55c5a1a836f4b413ecbd5601286e82bc73479d874f30d8b7cb9d2"} Nov 22 09:40:48 crc kubenswrapper[4743]: I1122 09:40:48.150694 4743 generic.go:334] "Generic (PLEG): container finished" podID="e0e64bc7-078f-4609-add5-ac4679314d0a" containerID="affc9440d24250ed7bcbc477e88b4a6823028730de8738d5ddbcc46734eee31f" exitCode=0 Nov 22 09:40:48 crc kubenswrapper[4743]: I1122 09:40:48.150718 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0e64bc7-078f-4609-add5-ac4679314d0a","Type":"ContainerDied","Data":"affc9440d24250ed7bcbc477e88b4a6823028730de8738d5ddbcc46734eee31f"} Nov 22 09:40:49 crc kubenswrapper[4743]: I1122 09:40:49.165360 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"087b5455-e53b-49da-a7d5-6d2317df7d4f","Type":"ContainerStarted","Data":"923a2b49007e2a3a817d729dd27af58b1a74e857e6ad1f8845ddc51da2313911"} Nov 22 09:40:49 crc kubenswrapper[4743]: I1122 09:40:49.166202 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e0e64bc7-078f-4609-add5-ac4679314d0a","Type":"ContainerStarted","Data":"47bbc09884d7823a89c67f809543062b5f31ccc20b8bd7d7238632626507d47a"} Nov 22 09:40:49 crc kubenswrapper[4743]: I1122 09:40:49.200213 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.200179197 podStartE2EDuration="7.200179197s" podCreationTimestamp="2025-11-22 09:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:40:49.190858229 +0000 UTC m=+4722.897219291" watchObservedRunningTime="2025-11-22 09:40:49.200179197 +0000 UTC m=+4722.906540289" Nov 22 09:40:49 crc kubenswrapper[4743]: I1122 09:40:49.217626 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.217604488 podStartE2EDuration="8.217604488s" podCreationTimestamp="2025-11-22 09:40:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:40:49.21523205 +0000 UTC m=+4722.921593162" watchObservedRunningTime="2025-11-22 09:40:49.217604488 +0000 UTC m=+4722.923965550" Nov 22 09:40:49 crc kubenswrapper[4743]: I1122 09:40:49.792057 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" Nov 22 09:40:49 crc kubenswrapper[4743]: I1122 09:40:49.996861 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl" Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.037414 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-2slv2"] Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.173307 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" podUID="64e1f2f3-9b7b-46a9-9d81-d4187536df5a" containerName="dnsmasq-dns" containerID="cri-o://32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee" gracePeriod=10 Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.571744 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.614872 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-dns-svc\") pod \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.615010 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4jmh\" (UniqueName: \"kubernetes.io/projected/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-kube-api-access-b4jmh\") pod \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.615152 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-config\") pod \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\" (UID: \"64e1f2f3-9b7b-46a9-9d81-d4187536df5a\") " Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.631133 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-kube-api-access-b4jmh" (OuterVolumeSpecName: "kube-api-access-b4jmh") pod "64e1f2f3-9b7b-46a9-9d81-d4187536df5a" (UID: "64e1f2f3-9b7b-46a9-9d81-d4187536df5a"). InnerVolumeSpecName "kube-api-access-b4jmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.672984 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-config" (OuterVolumeSpecName: "config") pod "64e1f2f3-9b7b-46a9-9d81-d4187536df5a" (UID: "64e1f2f3-9b7b-46a9-9d81-d4187536df5a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.677427 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "64e1f2f3-9b7b-46a9-9d81-d4187536df5a" (UID: "64e1f2f3-9b7b-46a9-9d81-d4187536df5a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.720104 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.720132 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4jmh\" (UniqueName: \"kubernetes.io/projected/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-kube-api-access-b4jmh\") on node \"crc\" DevicePath \"\"" Nov 22 09:40:50 crc kubenswrapper[4743]: I1122 09:40:50.720142 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64e1f2f3-9b7b-46a9-9d81-d4187536df5a-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.181950 4743 generic.go:334] "Generic (PLEG): container finished" podID="64e1f2f3-9b7b-46a9-9d81-d4187536df5a" containerID="32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee" exitCode=0 Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.182033 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.182048 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" event={"ID":"64e1f2f3-9b7b-46a9-9d81-d4187536df5a","Type":"ContainerDied","Data":"32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee"} Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.182409 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-2slv2" event={"ID":"64e1f2f3-9b7b-46a9-9d81-d4187536df5a","Type":"ContainerDied","Data":"2f2becffb625efa7429b3bf3d9e2688f6eaf15f4304e29f81d3870742baf12bf"} Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.182430 4743 scope.go:117] "RemoveContainer" containerID="32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee" Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.201240 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-2slv2"] Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.206466 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-2slv2"] Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.212628 4743 scope.go:117] "RemoveContainer" containerID="28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10" Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.227201 4743 scope.go:117] "RemoveContainer" containerID="32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee" Nov 22 09:40:51 crc kubenswrapper[4743]: E1122 09:40:51.227564 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee\": container with ID starting with 32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee not found: ID does not exist" containerID="32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee" Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.227625 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee"} err="failed to get container status 
\"32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee\": rpc error: code = NotFound desc = could not find container \"32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee\": container with ID starting with 32d241f837b9f5cc0014b8f16485e4ec063a2cdc6699da1892834780fc0c61ee not found: ID does not exist" Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.227650 4743 scope.go:117] "RemoveContainer" containerID="28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10" Nov 22 09:40:51 crc kubenswrapper[4743]: E1122 09:40:51.228070 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10\": container with ID starting with 28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10 not found: ID does not exist" containerID="28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10" Nov 22 09:40:51 crc kubenswrapper[4743]: I1122 09:40:51.228127 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10"} err="failed to get container status \"28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10\": rpc error: code = NotFound desc = could not find container \"28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10\": container with ID starting with 28ac02dc86ab8b8230cb43b0be9f27896e49749a334c2025483daeaf838d7e10 not found: ID does not exist" Nov 22 09:40:52 crc kubenswrapper[4743]: I1122 09:40:52.827217 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Nov 22 09:40:53 crc kubenswrapper[4743]: I1122 09:40:53.119030 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Nov 22 09:40:53 crc kubenswrapper[4743]: I1122 09:40:53.119146 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Nov 22 09:40:53 crc kubenswrapper[4743]: I1122 09:40:53.159778 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64e1f2f3-9b7b-46a9-9d81-d4187536df5a" path="/var/lib/kubelet/pods/64e1f2f3-9b7b-46a9-9d81-d4187536df5a/volumes" Nov 22 09:40:53 crc kubenswrapper[4743]: I1122 09:40:53.191834 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Nov 22 09:40:53 crc kubenswrapper[4743]: I1122 09:40:53.258207 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Nov 22 09:40:53 crc kubenswrapper[4743]: E1122 09:40:53.471251 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.245:40706->38.102.83.245:33143: write tcp 38.102.83.245:40706->38.102.83.245:33143: write: broken pipe Nov 22 09:40:53 crc kubenswrapper[4743]: I1122 09:40:53.964896 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:53 crc kubenswrapper[4743]: I1122 09:40:53.964974 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:55 crc kubenswrapper[4743]: I1122 09:40:55.195423 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Nov 22 09:40:55 crc kubenswrapper[4743]: I1122 09:40:55.278319 4743 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.666960 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kmwpv"] Nov 22 09:41:13 crc kubenswrapper[4743]: E1122 09:41:13.668639 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e1f2f3-9b7b-46a9-9d81-d4187536df5a" containerName="dnsmasq-dns" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.668664 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e1f2f3-9b7b-46a9-9d81-d4187536df5a" containerName="dnsmasq-dns" Nov 22 09:41:13 crc kubenswrapper[4743]: E1122 09:41:13.668713 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e1f2f3-9b7b-46a9-9d81-d4187536df5a" containerName="init" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.668729 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e1f2f3-9b7b-46a9-9d81-d4187536df5a" containerName="init" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.669064 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e1f2f3-9b7b-46a9-9d81-d4187536df5a" containerName="dnsmasq-dns" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.671219 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.691767 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kmwpv"] Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.810091 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-utilities\") pod \"certified-operators-kmwpv\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.810210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xltj\" (UniqueName: \"kubernetes.io/projected/6d4affcd-03ba-4b66-ae35-cd74cc309adb-kube-api-access-7xltj\") pod \"certified-operators-kmwpv\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.810244 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-catalog-content\") pod \"certified-operators-kmwpv\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.911956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-utilities\") pod \"certified-operators-kmwpv\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.912137 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xltj\" (UniqueName: \"kubernetes.io/projected/6d4affcd-03ba-4b66-ae35-cd74cc309adb-kube-api-access-7xltj\") pod \"certified-operators-kmwpv\" (UID: 
\"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.912192 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-catalog-content\") pod \"certified-operators-kmwpv\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.913400 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-utilities\") pod \"certified-operators-kmwpv\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.913486 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-catalog-content\") pod \"certified-operators-kmwpv\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:13 crc kubenswrapper[4743]: I1122 09:41:13.938377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xltj\" (UniqueName: \"kubernetes.io/projected/6d4affcd-03ba-4b66-ae35-cd74cc309adb-kube-api-access-7xltj\") pod \"certified-operators-kmwpv\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:14 crc kubenswrapper[4743]: I1122 09:41:14.045098 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:14 crc kubenswrapper[4743]: I1122 09:41:14.338511 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kmwpv"] Nov 22 09:41:14 crc kubenswrapper[4743]: I1122 09:41:14.420765 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmwpv" event={"ID":"6d4affcd-03ba-4b66-ae35-cd74cc309adb","Type":"ContainerStarted","Data":"cdb270bb3891efc4b137fd44d812c219fc57a2f593a274a0e2e672ec3c4763a0"} Nov 22 09:41:15 crc kubenswrapper[4743]: I1122 09:41:15.438263 4743 generic.go:334] "Generic (PLEG): container finished" podID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerID="514a23168c0ec224d152fc8a62294814b214a6fa07c85c63deab9136fccb7688" exitCode=0 Nov 22 09:41:15 crc kubenswrapper[4743]: I1122 09:41:15.438362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmwpv" event={"ID":"6d4affcd-03ba-4b66-ae35-cd74cc309adb","Type":"ContainerDied","Data":"514a23168c0ec224d152fc8a62294814b214a6fa07c85c63deab9136fccb7688"} Nov 22 09:41:16 crc kubenswrapper[4743]: I1122 09:41:16.451971 4743 generic.go:334] "Generic (PLEG): container finished" podID="50de19a9-ddc9-4417-bb70-8057fa9dcdfb" containerID="968adc6c9d405475d4d1b1c69c86a0ec387890f0b790fce61334cfba80d49542" exitCode=0 Nov 22 09:41:16 crc kubenswrapper[4743]: I1122 09:41:16.452116 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"50de19a9-ddc9-4417-bb70-8057fa9dcdfb","Type":"ContainerDied","Data":"968adc6c9d405475d4d1b1c69c86a0ec387890f0b790fce61334cfba80d49542"} Nov 22 09:41:16 crc kubenswrapper[4743]: I1122 09:41:16.453973 4743 
generic.go:334] "Generic (PLEG): container finished" podID="1e468630-9fa8-4efb-af86-811dd40b6f3c" containerID="e713d8c4e4e8e5b440ab5542ab098efda9d0c5b8b4d416ca70682cd59674062f" exitCode=0 Nov 22 09:41:16 crc kubenswrapper[4743]: I1122 09:41:16.454046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e468630-9fa8-4efb-af86-811dd40b6f3c","Type":"ContainerDied","Data":"e713d8c4e4e8e5b440ab5542ab098efda9d0c5b8b4d416ca70682cd59674062f"} Nov 22 09:41:16 crc kubenswrapper[4743]: I1122 09:41:16.457680 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmwpv" event={"ID":"6d4affcd-03ba-4b66-ae35-cd74cc309adb","Type":"ContainerStarted","Data":"ade96096eb9b9c3a750c694d40123aae946dd67c0877f721d5f297f509b4e819"} Nov 22 09:41:17 crc kubenswrapper[4743]: I1122 09:41:17.478249 4743 generic.go:334] "Generic (PLEG): container finished" podID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerID="ade96096eb9b9c3a750c694d40123aae946dd67c0877f721d5f297f509b4e819" exitCode=0 Nov 22 09:41:17 crc kubenswrapper[4743]: I1122 09:41:17.478381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmwpv" event={"ID":"6d4affcd-03ba-4b66-ae35-cd74cc309adb","Type":"ContainerDied","Data":"ade96096eb9b9c3a750c694d40123aae946dd67c0877f721d5f297f509b4e819"} Nov 22 09:41:17 crc kubenswrapper[4743]: I1122 09:41:17.483792 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"50de19a9-ddc9-4417-bb70-8057fa9dcdfb","Type":"ContainerStarted","Data":"d8be5d0cd3105bbfbececc0c4f7f6731429ddf5049ca3a78edf22dd37d12ba92"} Nov 22 09:41:17 crc kubenswrapper[4743]: I1122 09:41:17.484010 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Nov 22 09:41:17 crc kubenswrapper[4743]: I1122 09:41:17.488972 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e468630-9fa8-4efb-af86-811dd40b6f3c","Type":"ContainerStarted","Data":"c8b6fa1bd52738f4de8c972ec07b8f8b458883e7b730a4d1eda6efc48d8f7349"} Nov 22 09:41:17 crc kubenswrapper[4743]: I1122 09:41:17.489249 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:17 crc kubenswrapper[4743]: I1122 09:41:17.534103 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.534074758 podStartE2EDuration="38.534074758s" podCreationTimestamp="2025-11-22 09:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:41:17.524493413 +0000 UTC m=+4751.230854485" watchObservedRunningTime="2025-11-22 09:41:17.534074758 +0000 UTC m=+4751.240435820" Nov 22 09:41:17 crc kubenswrapper[4743]: I1122 09:41:17.551703 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.551656764 podStartE2EDuration="38.551656764s" podCreationTimestamp="2025-11-22 09:40:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:41:17.546497405 +0000 UTC m=+4751.252858467" watchObservedRunningTime="2025-11-22 09:41:17.551656764 +0000 UTC m=+4751.258017816" Nov 22 09:41:18 crc kubenswrapper[4743]: I1122 09:41:18.500818 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmwpv" event={"ID":"6d4affcd-03ba-4b66-ae35-cd74cc309adb","Type":"ContainerStarted","Data":"561f74096fb0b5529ed18ba0218029ebe31b6113488d5a49d8d319dd7b1b4eef"} Nov 22 09:41:18 crc kubenswrapper[4743]: I1122 09:41:18.517376 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kmwpv" podStartSLOduration=3.084187837 podStartE2EDuration="5.51736085s" podCreationTimestamp="2025-11-22 09:41:13 +0000 UTC" firstStartedPulling="2025-11-22 09:41:15.442595076 +0000 UTC m=+4749.148956118" lastFinishedPulling="2025-11-22 09:41:17.875768069 +0000 UTC m=+4751.582129131" observedRunningTime="2025-11-22 09:41:18.514733694 +0000 UTC m=+4752.221094756" watchObservedRunningTime="2025-11-22 09:41:18.51736085 +0000 UTC m=+4752.223721902" Nov 22 09:41:24 crc kubenswrapper[4743]: I1122 09:41:24.046254 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:24 crc kubenswrapper[4743]: I1122 09:41:24.046907 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:24 crc kubenswrapper[4743]: I1122 09:41:24.089973 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:24 crc kubenswrapper[4743]: I1122 09:41:24.646773 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:27 crc kubenswrapper[4743]: I1122 09:41:27.049774 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kmwpv"] Nov 22 09:41:27 crc kubenswrapper[4743]: I1122 09:41:27.050990 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kmwpv" podUID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerName="registry-server" containerID="cri-o://561f74096fb0b5529ed18ba0218029ebe31b6113488d5a49d8d319dd7b1b4eef" gracePeriod=2 Nov 22 09:41:27 crc kubenswrapper[4743]: I1122 09:41:27.585225 4743 generic.go:334] "Generic (PLEG): container finished" podID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerID="561f74096fb0b5529ed18ba0218029ebe31b6113488d5a49d8d319dd7b1b4eef" exitCode=0 Nov 22 09:41:27 crc kubenswrapper[4743]: I1122 09:41:27.585486 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmwpv" event={"ID":"6d4affcd-03ba-4b66-ae35-cd74cc309adb","Type":"ContainerDied","Data":"561f74096fb0b5529ed18ba0218029ebe31b6113488d5a49d8d319dd7b1b4eef"} Nov 22 09:41:27 crc kubenswrapper[4743]: I1122 09:41:27.999480 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.168910 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-catalog-content\") pod \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.168978 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-utilities\") pod \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.169015 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xltj\" (UniqueName: \"kubernetes.io/projected/6d4affcd-03ba-4b66-ae35-cd74cc309adb-kube-api-access-7xltj\") pod \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\" (UID: \"6d4affcd-03ba-4b66-ae35-cd74cc309adb\") " Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.173226 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-utilities" (OuterVolumeSpecName: "utilities") pod "6d4affcd-03ba-4b66-ae35-cd74cc309adb" (UID: "6d4affcd-03ba-4b66-ae35-cd74cc309adb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.235392 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d4affcd-03ba-4b66-ae35-cd74cc309adb-kube-api-access-7xltj" (OuterVolumeSpecName: "kube-api-access-7xltj") pod "6d4affcd-03ba-4b66-ae35-cd74cc309adb" (UID: "6d4affcd-03ba-4b66-ae35-cd74cc309adb"). InnerVolumeSpecName "kube-api-access-7xltj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.240263 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6d4affcd-03ba-4b66-ae35-cd74cc309adb" (UID: "6d4affcd-03ba-4b66-ae35-cd74cc309adb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.270903 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xltj\" (UniqueName: \"kubernetes.io/projected/6d4affcd-03ba-4b66-ae35-cd74cc309adb-kube-api-access-7xltj\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.270941 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.270955 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6d4affcd-03ba-4b66-ae35-cd74cc309adb-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.602066 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kmwpv" event={"ID":"6d4affcd-03ba-4b66-ae35-cd74cc309adb","Type":"ContainerDied","Data":"cdb270bb3891efc4b137fd44d812c219fc57a2f593a274a0e2e672ec3c4763a0"} Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.602211 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kmwpv" Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.602357 4743 scope.go:117] "RemoveContainer" containerID="561f74096fb0b5529ed18ba0218029ebe31b6113488d5a49d8d319dd7b1b4eef" Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.633170 4743 scope.go:117] "RemoveContainer" containerID="ade96096eb9b9c3a750c694d40123aae946dd67c0877f721d5f297f509b4e819" Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.645663 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kmwpv"] Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.650640 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kmwpv"] Nov 22 09:41:28 crc kubenswrapper[4743]: I1122 09:41:28.692123 4743 scope.go:117] "RemoveContainer" containerID="514a23168c0ec224d152fc8a62294814b214a6fa07c85c63deab9136fccb7688" Nov 22 09:41:29 crc kubenswrapper[4743]: I1122 09:41:29.171077 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" path="/var/lib/kubelet/pods/6d4affcd-03ba-4b66-ae35-cd74cc309adb/volumes" Nov 22 09:41:30 crc kubenswrapper[4743]: I1122 09:41:30.913378 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Nov 22 09:41:31 crc kubenswrapper[4743]: I1122 09:41:31.166508 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.137374 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2wbsk"] Nov 22 09:41:37 crc kubenswrapper[4743]: E1122 09:41:37.138807 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerName="registry-server" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.138834 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerName="registry-server" Nov 22 09:41:37 crc kubenswrapper[4743]: E1122 09:41:37.138870 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerName="extract-utilities" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.138883 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerName="extract-utilities" Nov 22 09:41:37 crc kubenswrapper[4743]: E1122 09:41:37.138922 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerName="extract-content" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.138940 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerName="extract-content" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.139296 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d4affcd-03ba-4b66-ae35-cd74cc309adb" containerName="registry-server" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.141176 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.146667 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2wbsk"] Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.232365 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwdxz\" (UniqueName: \"kubernetes.io/projected/9c859fc4-c111-4a3b-aa17-5af4446f2edf-kube-api-access-vwdxz\") pod \"dnsmasq-dns-5b7946d7b9-2wbsk\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.232773 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-2wbsk\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.232828 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-config\") pod \"dnsmasq-dns-5b7946d7b9-2wbsk\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.333851 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-2wbsk\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.333898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-config\") pod \"dnsmasq-dns-5b7946d7b9-2wbsk\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.333941 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwdxz\" (UniqueName: \"kubernetes.io/projected/9c859fc4-c111-4a3b-aa17-5af4446f2edf-kube-api-access-vwdxz\") pod \"dnsmasq-dns-5b7946d7b9-2wbsk\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " 
pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.335113 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-2wbsk\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.336073 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-config\") pod \"dnsmasq-dns-5b7946d7b9-2wbsk\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.362056 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwdxz\" (UniqueName: \"kubernetes.io/projected/9c859fc4-c111-4a3b-aa17-5af4446f2edf-kube-api-access-vwdxz\") pod \"dnsmasq-dns-5b7946d7b9-2wbsk\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.460221 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.755801 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2wbsk"] Nov 22 09:41:37 crc kubenswrapper[4743]: W1122 09:41:37.761229 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c859fc4_c111_4a3b_aa17_5af4446f2edf.slice/crio-7338e2b87884e50253422f038ace19903e603f3268b4ef6b56424a636f34c908 WatchSource:0}: Error finding container 7338e2b87884e50253422f038ace19903e603f3268b4ef6b56424a636f34c908: Status 404 returned error can't find the container with id 7338e2b87884e50253422f038ace19903e603f3268b4ef6b56424a636f34c908 Nov 22 09:41:37 crc kubenswrapper[4743]: I1122 09:41:37.912940 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:41:38 crc kubenswrapper[4743]: I1122 09:41:38.701215 4743 generic.go:334] "Generic (PLEG): container finished" podID="9c859fc4-c111-4a3b-aa17-5af4446f2edf" containerID="8b50c90349ce49f9d440fd03409b7ee5970aa905ec157d175e6b9f08e2849888" exitCode=0 Nov 22 09:41:38 crc kubenswrapper[4743]: I1122 09:41:38.701299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" event={"ID":"9c859fc4-c111-4a3b-aa17-5af4446f2edf","Type":"ContainerDied","Data":"8b50c90349ce49f9d440fd03409b7ee5970aa905ec157d175e6b9f08e2849888"} Nov 22 09:41:38 crc kubenswrapper[4743]: I1122 09:41:38.701339 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" event={"ID":"9c859fc4-c111-4a3b-aa17-5af4446f2edf","Type":"ContainerStarted","Data":"7338e2b87884e50253422f038ace19903e603f3268b4ef6b56424a636f34c908"} Nov 22 09:41:38 crc kubenswrapper[4743]: I1122 09:41:38.706577 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:41:39 crc kubenswrapper[4743]: I1122 09:41:39.732350 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" 
event={"ID":"9c859fc4-c111-4a3b-aa17-5af4446f2edf","Type":"ContainerStarted","Data":"a0f5020c6a50793b4fdd6b8fe473589a2cee580ea6d7cf060900bb51c6b827a1"} Nov 22 09:41:39 crc kubenswrapper[4743]: I1122 09:41:39.734003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:39 crc kubenswrapper[4743]: I1122 09:41:39.767277 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" podStartSLOduration=2.767259645 podStartE2EDuration="2.767259645s" podCreationTimestamp="2025-11-22 09:41:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:41:39.76012488 +0000 UTC m=+4773.466485932" watchObservedRunningTime="2025-11-22 09:41:39.767259645 +0000 UTC m=+4773.473620687" Nov 22 09:41:40 crc kubenswrapper[4743]: I1122 09:41:40.048388 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="50de19a9-ddc9-4417-bb70-8057fa9dcdfb" containerName="rabbitmq" containerID="cri-o://d8be5d0cd3105bbfbececc0c4f7f6731429ddf5049ca3a78edf22dd37d12ba92" gracePeriod=604798 Nov 22 09:41:40 crc kubenswrapper[4743]: I1122 09:41:40.590979 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1e468630-9fa8-4efb-af86-811dd40b6f3c" containerName="rabbitmq" containerID="cri-o://c8b6fa1bd52738f4de8c972ec07b8f8b458883e7b730a4d1eda6efc48d8f7349" gracePeriod=604799 Nov 22 09:41:40 crc kubenswrapper[4743]: I1122 09:41:40.912371 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="50de19a9-ddc9-4417-bb70-8057fa9dcdfb" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.237:5672: connect: connection refused" Nov 22 09:41:41 crc kubenswrapper[4743]: I1122 09:41:41.156892 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1e468630-9fa8-4efb-af86-811dd40b6f3c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.238:5672: connect: connection refused" Nov 22 09:41:46 crc kubenswrapper[4743]: I1122 09:41:46.820034 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e468630-9fa8-4efb-af86-811dd40b6f3c" containerID="c8b6fa1bd52738f4de8c972ec07b8f8b458883e7b730a4d1eda6efc48d8f7349" exitCode=0 Nov 22 09:41:46 crc kubenswrapper[4743]: I1122 09:41:46.820153 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e468630-9fa8-4efb-af86-811dd40b6f3c","Type":"ContainerDied","Data":"c8b6fa1bd52738f4de8c972ec07b8f8b458883e7b730a4d1eda6efc48d8f7349"} Nov 22 09:41:46 crc kubenswrapper[4743]: I1122 09:41:46.824078 4743 generic.go:334] "Generic (PLEG): container finished" podID="50de19a9-ddc9-4417-bb70-8057fa9dcdfb" containerID="d8be5d0cd3105bbfbececc0c4f7f6731429ddf5049ca3a78edf22dd37d12ba92" exitCode=0 Nov 22 09:41:46 crc kubenswrapper[4743]: I1122 09:41:46.824118 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"50de19a9-ddc9-4417-bb70-8057fa9dcdfb","Type":"ContainerDied","Data":"d8be5d0cd3105bbfbececc0c4f7f6731429ddf5049ca3a78edf22dd37d12ba92"} Nov 22 09:41:46 crc kubenswrapper[4743]: I1122 09:41:46.824200 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"50de19a9-ddc9-4417-bb70-8057fa9dcdfb","Type":"ContainerDied","Data":"5748efc96ea0592fd16b22c3ea4bb25a539ef3ca3ef45ef67f846a4dec30c24c"} Nov 22 09:41:46 crc kubenswrapper[4743]: I1122 09:41:46.824216 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5748efc96ea0592fd16b22c3ea4bb25a539ef3ca3ef45ef67f846a4dec30c24c" Nov 22 09:41:46 crc kubenswrapper[4743]: I1122 09:41:46.899109 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.008258 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm6x4\" (UniqueName: \"kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-kube-api-access-xm6x4\") pod \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.008324 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-erlang-cookie\") pod \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.008455 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-server-conf\") pod \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.008510 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-confd\") pod \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.008544 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-plugins\") pod \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.008575 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-erlang-cookie-secret\") pod \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.008747 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") pod \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.008783 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-plugins-conf\") pod \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.008868 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-pod-info\") pod \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\" (UID: \"50de19a9-ddc9-4417-bb70-8057fa9dcdfb\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.009747 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "50de19a9-ddc9-4417-bb70-8057fa9dcdfb" (UID: "50de19a9-ddc9-4417-bb70-8057fa9dcdfb"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.009949 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "50de19a9-ddc9-4417-bb70-8057fa9dcdfb" (UID: "50de19a9-ddc9-4417-bb70-8057fa9dcdfb"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.010225 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "50de19a9-ddc9-4417-bb70-8057fa9dcdfb" (UID: "50de19a9-ddc9-4417-bb70-8057fa9dcdfb"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.014751 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "50de19a9-ddc9-4417-bb70-8057fa9dcdfb" (UID: "50de19a9-ddc9-4417-bb70-8057fa9dcdfb"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.014789 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-pod-info" (OuterVolumeSpecName: "pod-info") pod "50de19a9-ddc9-4417-bb70-8057fa9dcdfb" (UID: "50de19a9-ddc9-4417-bb70-8057fa9dcdfb"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.015787 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-kube-api-access-xm6x4" (OuterVolumeSpecName: "kube-api-access-xm6x4") pod "50de19a9-ddc9-4417-bb70-8057fa9dcdfb" (UID: "50de19a9-ddc9-4417-bb70-8057fa9dcdfb"). InnerVolumeSpecName "kube-api-access-xm6x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.027759 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a" (OuterVolumeSpecName: "persistence") pod "50de19a9-ddc9-4417-bb70-8057fa9dcdfb" (UID: "50de19a9-ddc9-4417-bb70-8057fa9dcdfb"). InnerVolumeSpecName "pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.029589 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-server-conf" (OuterVolumeSpecName: "server-conf") pod "50de19a9-ddc9-4417-bb70-8057fa9dcdfb" (UID: "50de19a9-ddc9-4417-bb70-8057fa9dcdfb"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.111050 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-pod-info\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.111051 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "50de19a9-ddc9-4417-bb70-8057fa9dcdfb" (UID: "50de19a9-ddc9-4417-bb70-8057fa9dcdfb"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.111082 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm6x4\" (UniqueName: \"kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-kube-api-access-xm6x4\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.111132 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.111144 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-server-conf\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.111154 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.111164 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.111194 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") on node \"crc\" " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.111206 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.128674 4743 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.128988 4743 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a") on node "crc" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.213150 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/50de19a9-ddc9-4417-bb70-8057fa9dcdfb-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.213223 4743 reconciler_common.go:293] "Volume detached for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.251879 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.314315 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-plugins-conf\") pod \"1e468630-9fa8-4efb-af86-811dd40b6f3c\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.314438 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e468630-9fa8-4efb-af86-811dd40b6f3c-pod-info\") pod \"1e468630-9fa8-4efb-af86-811dd40b6f3c\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.314627 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e468630-9fa8-4efb-af86-811dd40b6f3c-erlang-cookie-secret\") pod \"1e468630-9fa8-4efb-af86-811dd40b6f3c\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.314676 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-erlang-cookie\") pod \"1e468630-9fa8-4efb-af86-811dd40b6f3c\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.314748 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-server-conf\") pod \"1e468630-9fa8-4efb-af86-811dd40b6f3c\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.314824 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjk2c\" (UniqueName: \"kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-kube-api-access-rjk2c\") pod \"1e468630-9fa8-4efb-af86-811dd40b6f3c\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.314894 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-plugins\") pod \"1e468630-9fa8-4efb-af86-811dd40b6f3c\" (UID: 
\"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.315074 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") pod \"1e468630-9fa8-4efb-af86-811dd40b6f3c\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.315116 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-confd\") pod \"1e468630-9fa8-4efb-af86-811dd40b6f3c\" (UID: \"1e468630-9fa8-4efb-af86-811dd40b6f3c\") " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.316297 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1e468630-9fa8-4efb-af86-811dd40b6f3c" (UID: "1e468630-9fa8-4efb-af86-811dd40b6f3c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.316350 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1e468630-9fa8-4efb-af86-811dd40b6f3c" (UID: "1e468630-9fa8-4efb-af86-811dd40b6f3c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.319748 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e468630-9fa8-4efb-af86-811dd40b6f3c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1e468630-9fa8-4efb-af86-811dd40b6f3c" (UID: "1e468630-9fa8-4efb-af86-811dd40b6f3c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.320026 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1e468630-9fa8-4efb-af86-811dd40b6f3c" (UID: "1e468630-9fa8-4efb-af86-811dd40b6f3c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.320642 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-kube-api-access-rjk2c" (OuterVolumeSpecName: "kube-api-access-rjk2c") pod "1e468630-9fa8-4efb-af86-811dd40b6f3c" (UID: "1e468630-9fa8-4efb-af86-811dd40b6f3c"). InnerVolumeSpecName "kube-api-access-rjk2c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.328615 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1e468630-9fa8-4efb-af86-811dd40b6f3c-pod-info" (OuterVolumeSpecName: "pod-info") pod "1e468630-9fa8-4efb-af86-811dd40b6f3c" (UID: "1e468630-9fa8-4efb-af86-811dd40b6f3c"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.333919 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c" (OuterVolumeSpecName: "persistence") pod "1e468630-9fa8-4efb-af86-811dd40b6f3c" (UID: "1e468630-9fa8-4efb-af86-811dd40b6f3c"). InnerVolumeSpecName "pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.341350 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-server-conf" (OuterVolumeSpecName: "server-conf") pod "1e468630-9fa8-4efb-af86-811dd40b6f3c" (UID: "1e468630-9fa8-4efb-af86-811dd40b6f3c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.400118 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1e468630-9fa8-4efb-af86-811dd40b6f3c" (UID: "1e468630-9fa8-4efb-af86-811dd40b6f3c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.417907 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.417943 4743 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-plugins-conf\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.417952 4743 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1e468630-9fa8-4efb-af86-811dd40b6f3c-pod-info\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.417963 4743 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1e468630-9fa8-4efb-af86-811dd40b6f3c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.417975 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.417985 4743 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1e468630-9fa8-4efb-af86-811dd40b6f3c-server-conf\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.417993 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjk2c\" (UniqueName: \"kubernetes.io/projected/1e468630-9fa8-4efb-af86-811dd40b6f3c-kube-api-access-rjk2c\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.418001 4743 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1e468630-9fa8-4efb-af86-811dd40b6f3c-rabbitmq-plugins\") on node 
\"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.418041 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") on node \"crc\" " Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.447170 4743 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.447406 4743 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c") on node "crc" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.461853 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.518113 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x7jsl"] Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.518920 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl" podUID="d6676dcf-2992-4a50-a37a-feab61d327e4" containerName="dnsmasq-dns" containerID="cri-o://d3e58b6a189af54ae625b579ec13f91891b6c2e96189a6c4eacf4ccff9bbf230" gracePeriod=10 Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.526967 4743 reconciler_common.go:293] "Volume detached for volume \"pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") on node \"crc\" DevicePath \"\"" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.841080 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1e468630-9fa8-4efb-af86-811dd40b6f3c","Type":"ContainerDied","Data":"7cb1152aaed510e5cf32b6a85c9aa9d2ecaa309e3a0f1f14c659b49f79faeb1d"} Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.841145 4743 scope.go:117] "RemoveContainer" containerID="c8b6fa1bd52738f4de8c972ec07b8f8b458883e7b730a4d1eda6efc48d8f7349" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.841277 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.852191 4743 generic.go:334] "Generic (PLEG): container finished" podID="d6676dcf-2992-4a50-a37a-feab61d327e4" containerID="d3e58b6a189af54ae625b579ec13f91891b6c2e96189a6c4eacf4ccff9bbf230" exitCode=0 Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.852297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl" event={"ID":"d6676dcf-2992-4a50-a37a-feab61d327e4","Type":"ContainerDied","Data":"d3e58b6a189af54ae625b579ec13f91891b6c2e96189a6c4eacf4ccff9bbf230"} Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.852323 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.884370 4743 scope.go:117] "RemoveContainer" containerID="e713d8c4e4e8e5b440ab5542ab098efda9d0c5b8b4d416ca70682cd59674062f" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.888777 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.894790 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.910262 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.920008 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.932398 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:41:47 crc kubenswrapper[4743]: E1122 09:41:47.932752 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e468630-9fa8-4efb-af86-811dd40b6f3c" containerName="setup-container" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.932770 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e468630-9fa8-4efb-af86-811dd40b6f3c" containerName="setup-container" Nov 22 09:41:47 crc kubenswrapper[4743]: E1122 09:41:47.932781 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50de19a9-ddc9-4417-bb70-8057fa9dcdfb" containerName="rabbitmq" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.932787 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="50de19a9-ddc9-4417-bb70-8057fa9dcdfb" containerName="rabbitmq" Nov 22 09:41:47 crc kubenswrapper[4743]: E1122 09:41:47.932803 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e468630-9fa8-4efb-af86-811dd40b6f3c" containerName="rabbitmq" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.932809 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e468630-9fa8-4efb-af86-811dd40b6f3c" containerName="rabbitmq" Nov 22 09:41:47 crc kubenswrapper[4743]: E1122 09:41:47.932827 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50de19a9-ddc9-4417-bb70-8057fa9dcdfb" containerName="setup-container" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.932834 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="50de19a9-ddc9-4417-bb70-8057fa9dcdfb" containerName="setup-container" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.932970 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e468630-9fa8-4efb-af86-811dd40b6f3c" containerName="rabbitmq" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.932987 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="50de19a9-ddc9-4417-bb70-8057fa9dcdfb" containerName="rabbitmq" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.933822 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.937139 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-lfz2k" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.937321 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.937457 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.937635 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.937763 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.952849 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.960413 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.963364 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.974616 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.974771 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-5cvz5" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.975040 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.975157 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.975273 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Nov 22 09:41:47 crc kubenswrapper[4743]: I1122 09:41:47.976189 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.033425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c512ff1-fd60-4b1c-a421-fd277d259d35-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.033964 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034008 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") pod \"rabbitmq-server-0\" (UID: 
\"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034028 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c512ff1-fd60-4b1c-a421-fd277d259d35-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034047 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c512ff1-fd60-4b1c-a421-fd277d259d35-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034113 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034138 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034173 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c512ff1-fd60-4b1c-a421-fd277d259d35-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034220 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034257 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c512ff1-fd60-4b1c-a421-fd277d259d35-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2dmt\" (UniqueName: \"kubernetes.io/projected/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-kube-api-access-w2dmt\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034295 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c512ff1-fd60-4b1c-a421-fd277d259d35-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034326 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034346 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7xsk\" (UniqueName: \"kubernetes.io/projected/1c512ff1-fd60-4b1c-a421-fd277d259d35-kube-api-access-b7xsk\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034366 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.034415 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c512ff1-fd60-4b1c-a421-fd277d259d35-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.135898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c512ff1-fd60-4b1c-a421-fd277d259d35-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.135951 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c512ff1-fd60-4b1c-a421-fd277d259d35-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.135976 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-plugins-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136004 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c512ff1-fd60-4b1c-a421-fd277d259d35-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136093 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c512ff1-fd60-4b1c-a421-fd277d259d35-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136114 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136171 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c512ff1-fd60-4b1c-a421-fd277d259d35-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc 
kubenswrapper[4743]: I1122 09:41:48.136265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c512ff1-fd60-4b1c-a421-fd277d259d35-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136288 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2dmt\" (UniqueName: \"kubernetes.io/projected/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-kube-api-access-w2dmt\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136313 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136329 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c512ff1-fd60-4b1c-a421-fd277d259d35-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7xsk\" (UniqueName: \"kubernetes.io/projected/1c512ff1-fd60-4b1c-a421-fd277d259d35-kube-api-access-b7xsk\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.136373 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.137230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.137665 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c512ff1-fd60-4b1c-a421-fd277d259d35-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.138034 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.138185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c512ff1-fd60-4b1c-a421-fd277d259d35-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.138251 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.138316 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c512ff1-fd60-4b1c-a421-fd277d259d35-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.139117 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c512ff1-fd60-4b1c-a421-fd277d259d35-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.140033 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.140068 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8456811baf4ea1f550c890ff7a8ba9ccfae89e30299224b2d58e4c2bac2b9738/globalmount\"" pod="openstack/rabbitmq-server-0" Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.140291 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.140325 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f9bce1c2a821a60f09b63c9bc3276b70afe1c2ebc894453033219ddee0b70426/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.140928 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.141977 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.143345 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.143635 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c512ff1-fd60-4b1c-a421-fd277d259d35-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.143650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c512ff1-fd60-4b1c-a421-fd277d259d35-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.144392 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.145703 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.150136 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c512ff1-fd60-4b1c-a421-fd277d259d35-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.179309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7xsk\" (UniqueName: \"kubernetes.io/projected/1c512ff1-fd60-4b1c-a421-fd277d259d35-kube-api-access-b7xsk\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.184042 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-626e3a4f-4839-419a-82cf-67dbcd7de46a\") pod \"rabbitmq-server-0\" (UID: \"1c512ff1-fd60-4b1c-a421-fd277d259d35\") " pod="openstack/rabbitmq-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.189442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2dmt\" (UniqueName: \"kubernetes.io/projected/fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7-kube-api-access-w2dmt\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.208647 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-100a32a5-7488-4cc4-a8b5-6dfb355c548c\") pod \"rabbitmq-cell1-server-0\" (UID: \"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7\") " pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.236943 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-dns-svc\") pod \"d6676dcf-2992-4a50-a37a-feab61d327e4\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") "
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.237000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x676z\" (UniqueName: \"kubernetes.io/projected/d6676dcf-2992-4a50-a37a-feab61d327e4-kube-api-access-x676z\") pod \"d6676dcf-2992-4a50-a37a-feab61d327e4\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") "
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.237133 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-config\") pod \"d6676dcf-2992-4a50-a37a-feab61d327e4\" (UID: \"d6676dcf-2992-4a50-a37a-feab61d327e4\") "
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.240085 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6676dcf-2992-4a50-a37a-feab61d327e4-kube-api-access-x676z" (OuterVolumeSpecName: "kube-api-access-x676z") pod "d6676dcf-2992-4a50-a37a-feab61d327e4" (UID: "d6676dcf-2992-4a50-a37a-feab61d327e4"). InnerVolumeSpecName "kube-api-access-x676z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.258247 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.270430 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-config" (OuterVolumeSpecName: "config") pod "d6676dcf-2992-4a50-a37a-feab61d327e4" (UID: "d6676dcf-2992-4a50-a37a-feab61d327e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.284473 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6676dcf-2992-4a50-a37a-feab61d327e4" (UID: "d6676dcf-2992-4a50-a37a-feab61d327e4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.305118 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.338734 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-config\") on node \"crc\" DevicePath \"\""
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.338994 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6676dcf-2992-4a50-a37a-feab61d327e4-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.339005 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x676z\" (UniqueName: \"kubernetes.io/projected/d6676dcf-2992-4a50-a37a-feab61d327e4-kube-api-access-x676z\") on node \"crc\" DevicePath \"\""
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.745877 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.808930 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Nov 22 09:41:48 crc kubenswrapper[4743]: W1122 09:41:48.812818 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf7c2ce_bdf7_4156_a7a6_5ba0b43e63a7.slice/crio-35664ed28a439e3104fc3bb100c7d6778df0621e8d4c5a01718b368bd7517388 WatchSource:0}: Error finding container 35664ed28a439e3104fc3bb100c7d6778df0621e8d4c5a01718b368bd7517388: Status 404 returned error can't find the container with id 35664ed28a439e3104fc3bb100c7d6778df0621e8d4c5a01718b368bd7517388
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.891448 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1c512ff1-fd60-4b1c-a421-fd277d259d35","Type":"ContainerStarted","Data":"eae76b60254c77f06837a03664d66dc321b6b9986affe685194145b52c136826"}
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.893555 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl" event={"ID":"d6676dcf-2992-4a50-a37a-feab61d327e4","Type":"ContainerDied","Data":"1188912da7847e29b3eed1186d6ab312fece0a3f5f66056724d0fba4284e3fba"}
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.893611 4743 scope.go:117] "RemoveContainer" containerID="d3e58b6a189af54ae625b579ec13f91891b6c2e96189a6c4eacf4ccff9bbf230"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.893667 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-x7jsl"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.896332 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7","Type":"ContainerStarted","Data":"35664ed28a439e3104fc3bb100c7d6778df0621e8d4c5a01718b368bd7517388"}
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.923396 4743 scope.go:117] "RemoveContainer" containerID="2a87852c7f3766128169efb694821d1e68f435311a31e7bdb702c77650573ed0"
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.949684 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x7jsl"]
Nov 22 09:41:48 crc kubenswrapper[4743]: I1122 09:41:48.955669 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-x7jsl"]
Nov 22 09:41:49 crc kubenswrapper[4743]: I1122 09:41:49.167223 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e468630-9fa8-4efb-af86-811dd40b6f3c" path="/var/lib/kubelet/pods/1e468630-9fa8-4efb-af86-811dd40b6f3c/volumes"
Nov 22 09:41:49 crc kubenswrapper[4743]: I1122 09:41:49.169124 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50de19a9-ddc9-4417-bb70-8057fa9dcdfb" path="/var/lib/kubelet/pods/50de19a9-ddc9-4417-bb70-8057fa9dcdfb/volumes"
Nov 22 09:41:49 crc kubenswrapper[4743]: I1122 09:41:49.177865 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6676dcf-2992-4a50-a37a-feab61d327e4" path="/var/lib/kubelet/pods/d6676dcf-2992-4a50-a37a-feab61d327e4/volumes"
Nov 22 09:41:50 crc kubenswrapper[4743]: I1122 09:41:50.916709 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1c512ff1-fd60-4b1c-a421-fd277d259d35","Type":"ContainerStarted","Data":"7d4b47590da8f70b16a69a07cd49c083b0dd9410804d26103f4a406864b2fce3"}
Nov 22 09:41:50 crc kubenswrapper[4743]: I1122 09:41:50.919305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7","Type":"ContainerStarted","Data":"27cc89dc46f8710b4d0714c4b5ff0d66bd3c4655d0e61c1815271092e4e697af"}
Nov 22 09:42:23 crc kubenswrapper[4743]: I1122 09:42:23.259222 4743 generic.go:334] "Generic (PLEG): container finished" podID="fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7" containerID="27cc89dc46f8710b4d0714c4b5ff0d66bd3c4655d0e61c1815271092e4e697af" exitCode=0
Nov 22 09:42:23 crc kubenswrapper[4743]: I1122 09:42:23.259712 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7","Type":"ContainerDied","Data":"27cc89dc46f8710b4d0714c4b5ff0d66bd3c4655d0e61c1815271092e4e697af"}
Nov 22 09:42:24 crc kubenswrapper[4743]: I1122 09:42:24.271101 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7","Type":"ContainerStarted","Data":"ef40e1da597fe798f2e1483b3a928cce253e2ea15373c98be223eeb697d1b2ce"}
Nov 22 09:42:24 crc kubenswrapper[4743]: I1122 09:42:24.271797 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:42:24 crc kubenswrapper[4743]: I1122 09:42:24.272796 4743 generic.go:334] "Generic (PLEG): container finished" podID="1c512ff1-fd60-4b1c-a421-fd277d259d35" containerID="7d4b47590da8f70b16a69a07cd49c083b0dd9410804d26103f4a406864b2fce3" exitCode=0
Nov 22 09:42:24 crc kubenswrapper[4743]: I1122 09:42:24.272850 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1c512ff1-fd60-4b1c-a421-fd277d259d35","Type":"ContainerDied","Data":"7d4b47590da8f70b16a69a07cd49c083b0dd9410804d26103f4a406864b2fce3"}
Nov 22 09:42:24 crc kubenswrapper[4743]: I1122 09:42:24.306071 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.306047732 podStartE2EDuration="37.306047732s" podCreationTimestamp="2025-11-22 09:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:42:24.298856385 +0000 UTC m=+4818.005217457" watchObservedRunningTime="2025-11-22 09:42:24.306047732 +0000 UTC m=+4818.012408794"
Nov 22 09:42:25 crc kubenswrapper[4743]: I1122 09:42:25.282646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1c512ff1-fd60-4b1c-a421-fd277d259d35","Type":"ContainerStarted","Data":"f1d70d19a4cf57cc3f8f931f76fb8f2057c74c70a8616c794749913df0dc0757"}
Nov 22 09:42:25 crc kubenswrapper[4743]: I1122 09:42:25.283018 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Nov 22 09:42:25 crc kubenswrapper[4743]: I1122 09:42:25.320503 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.320475847 podStartE2EDuration="38.320475847s" podCreationTimestamp="2025-11-22 09:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:42:25.311795168 +0000 UTC m=+4819.018156230" watchObservedRunningTime="2025-11-22 09:42:25.320475847 +0000 UTC m=+4819.026836939"
Nov 22 09:42:38 crc kubenswrapper[4743]: I1122 09:42:38.261448 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Nov 22 09:42:38 crc kubenswrapper[4743]: I1122 09:42:38.307721 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.512423 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"]
Nov 22 09:42:45 crc kubenswrapper[4743]: E1122 09:42:45.513400 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6676dcf-2992-4a50-a37a-feab61d327e4" containerName="init"
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.513419 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6676dcf-2992-4a50-a37a-feab61d327e4" containerName="init"
Nov 22 09:42:45 crc kubenswrapper[4743]: E1122 09:42:45.513459 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6676dcf-2992-4a50-a37a-feab61d327e4" containerName="dnsmasq-dns"
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.513466 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6676dcf-2992-4a50-a37a-feab61d327e4" containerName="dnsmasq-dns"
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.513674 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6676dcf-2992-4a50-a37a-feab61d327e4" containerName="dnsmasq-dns"
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.514315 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.516857 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nnsnp"
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.531473 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"]
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.615003 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wsfm\" (UniqueName: \"kubernetes.io/projected/25166349-858c-47e4-a046-f1973fa979b9-kube-api-access-7wsfm\") pod \"mariadb-client-1-default\" (UID: \"25166349-858c-47e4-a046-f1973fa979b9\") " pod="openstack/mariadb-client-1-default"
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.716393 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wsfm\" (UniqueName: \"kubernetes.io/projected/25166349-858c-47e4-a046-f1973fa979b9-kube-api-access-7wsfm\") pod \"mariadb-client-1-default\" (UID: \"25166349-858c-47e4-a046-f1973fa979b9\") " pod="openstack/mariadb-client-1-default"
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.744937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wsfm\" (UniqueName: \"kubernetes.io/projected/25166349-858c-47e4-a046-f1973fa979b9-kube-api-access-7wsfm\") pod \"mariadb-client-1-default\" (UID: \"25166349-858c-47e4-a046-f1973fa979b9\") " pod="openstack/mariadb-client-1-default"
Nov 22 09:42:45 crc kubenswrapper[4743]: I1122 09:42:45.855745 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Nov 22 09:42:46 crc kubenswrapper[4743]: I1122 09:42:46.406278 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"]
Nov 22 09:42:46 crc kubenswrapper[4743]: I1122 09:42:46.448523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"25166349-858c-47e4-a046-f1973fa979b9","Type":"ContainerStarted","Data":"80df34fe7d0be7fc9c9d7bad8fca0bf6b46a095dfcd681c997938d7e2833eab2"}
Nov 22 09:42:47 crc kubenswrapper[4743]: I1122 09:42:47.456228 4743 generic.go:334] "Generic (PLEG): container finished" podID="25166349-858c-47e4-a046-f1973fa979b9" containerID="bf4c489c82bbb4305e6f9ae74135e30836cbf7b59e661005b0a0c47e30c2be88" exitCode=0
Nov 22 09:42:47 crc kubenswrapper[4743]: I1122 09:42:47.456274 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"25166349-858c-47e4-a046-f1973fa979b9","Type":"ContainerDied","Data":"bf4c489c82bbb4305e6f9ae74135e30836cbf7b59e661005b0a0c47e30c2be88"}
Nov 22 09:42:48 crc kubenswrapper[4743]: I1122 09:42:48.883139 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Nov 22 09:42:48 crc kubenswrapper[4743]: I1122 09:42:48.919337 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_25166349-858c-47e4-a046-f1973fa979b9/mariadb-client-1-default/0.log"
Nov 22 09:42:48 crc kubenswrapper[4743]: I1122 09:42:48.945974 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"]
Nov 22 09:42:48 crc kubenswrapper[4743]: I1122 09:42:48.953042 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"]
Nov 22 09:42:48 crc kubenswrapper[4743]: I1122 09:42:48.968184 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wsfm\" (UniqueName: \"kubernetes.io/projected/25166349-858c-47e4-a046-f1973fa979b9-kube-api-access-7wsfm\") pod \"25166349-858c-47e4-a046-f1973fa979b9\" (UID: \"25166349-858c-47e4-a046-f1973fa979b9\") "
Nov 22 09:42:48 crc kubenswrapper[4743]: I1122 09:42:48.977409 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25166349-858c-47e4-a046-f1973fa979b9-kube-api-access-7wsfm" (OuterVolumeSpecName: "kube-api-access-7wsfm") pod "25166349-858c-47e4-a046-f1973fa979b9" (UID: "25166349-858c-47e4-a046-f1973fa979b9"). InnerVolumeSpecName "kube-api-access-7wsfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.070304 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wsfm\" (UniqueName: \"kubernetes.io/projected/25166349-858c-47e4-a046-f1973fa979b9-kube-api-access-7wsfm\") on node \"crc\" DevicePath \"\""
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.165542 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25166349-858c-47e4-a046-f1973fa979b9" path="/var/lib/kubelet/pods/25166349-858c-47e4-a046-f1973fa979b9/volumes"
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.407376 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"]
Nov 22 09:42:49 crc kubenswrapper[4743]: E1122 09:42:49.407760 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25166349-858c-47e4-a046-f1973fa979b9" containerName="mariadb-client-1-default"
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.407782 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="25166349-858c-47e4-a046-f1973fa979b9" containerName="mariadb-client-1-default"
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.407999 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="25166349-858c-47e4-a046-f1973fa979b9" containerName="mariadb-client-1-default"
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.408479 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.414683 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"]
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.471963 4743 scope.go:117] "RemoveContainer" containerID="bf4c489c82bbb4305e6f9ae74135e30836cbf7b59e661005b0a0c47e30c2be88"
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.471998 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default"
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.475693 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qp56\" (UniqueName: \"kubernetes.io/projected/3f04de83-846f-4690-95ae-bce61054e926-kube-api-access-6qp56\") pod \"mariadb-client-2-default\" (UID: \"3f04de83-846f-4690-95ae-bce61054e926\") " pod="openstack/mariadb-client-2-default"
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.576621 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qp56\" (UniqueName: \"kubernetes.io/projected/3f04de83-846f-4690-95ae-bce61054e926-kube-api-access-6qp56\") pod \"mariadb-client-2-default\" (UID: \"3f04de83-846f-4690-95ae-bce61054e926\") " pod="openstack/mariadb-client-2-default"
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.592429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qp56\" (UniqueName: \"kubernetes.io/projected/3f04de83-846f-4690-95ae-bce61054e926-kube-api-access-6qp56\") pod \"mariadb-client-2-default\" (UID: \"3f04de83-846f-4690-95ae-bce61054e926\") " pod="openstack/mariadb-client-2-default"
Nov 22 09:42:49 crc kubenswrapper[4743]: I1122 09:42:49.729187 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Nov 22 09:42:50 crc kubenswrapper[4743]: I1122 09:42:50.244356 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"]
Nov 22 09:42:50 crc kubenswrapper[4743]: I1122 09:42:50.483682 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"3f04de83-846f-4690-95ae-bce61054e926","Type":"ContainerStarted","Data":"89fe5ad61e4acf6a36eaa7287ba40af4308ac5cd551699233a3633da6c9a93df"}
Nov 22 09:42:50 crc kubenswrapper[4743]: I1122 09:42:50.483739 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"3f04de83-846f-4690-95ae-bce61054e926","Type":"ContainerStarted","Data":"00ccb9fafe10aca2593d33f3564a56c642247c19e02a344da0f6be9463c95f17"}
Nov 22 09:42:50 crc kubenswrapper[4743]: I1122 09:42:50.502337 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-2-default" podStartSLOduration=1.5023017520000002 podStartE2EDuration="1.502301752s" podCreationTimestamp="2025-11-22 09:42:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:42:50.49701025 +0000 UTC m=+4844.203371322" watchObservedRunningTime="2025-11-22 09:42:50.502301752 +0000 UTC m=+4844.208662804"
Nov 22 09:42:51 crc kubenswrapper[4743]: I1122 09:42:51.492866 4743 generic.go:334] "Generic (PLEG): container finished" podID="3f04de83-846f-4690-95ae-bce61054e926" containerID="89fe5ad61e4acf6a36eaa7287ba40af4308ac5cd551699233a3633da6c9a93df" exitCode=1
Nov 22 09:42:51 crc kubenswrapper[4743]: I1122 09:42:51.492934 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"3f04de83-846f-4690-95ae-bce61054e926","Type":"ContainerDied","Data":"89fe5ad61e4acf6a36eaa7287ba40af4308ac5cd551699233a3633da6c9a93df"}
Nov 22 09:42:52 crc kubenswrapper[4743]: I1122 09:42:52.962494 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.000885 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"]
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.005547 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"]
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.043217 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qp56\" (UniqueName: \"kubernetes.io/projected/3f04de83-846f-4690-95ae-bce61054e926-kube-api-access-6qp56\") pod \"3f04de83-846f-4690-95ae-bce61054e926\" (UID: \"3f04de83-846f-4690-95ae-bce61054e926\") "
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.051414 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f04de83-846f-4690-95ae-bce61054e926-kube-api-access-6qp56" (OuterVolumeSpecName: "kube-api-access-6qp56") pod "3f04de83-846f-4690-95ae-bce61054e926" (UID: "3f04de83-846f-4690-95ae-bce61054e926"). InnerVolumeSpecName "kube-api-access-6qp56". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.145336 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qp56\" (UniqueName: \"kubernetes.io/projected/3f04de83-846f-4690-95ae-bce61054e926-kube-api-access-6qp56\") on node \"crc\" DevicePath \"\""
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.167731 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f04de83-846f-4690-95ae-bce61054e926" path="/var/lib/kubelet/pods/3f04de83-846f-4690-95ae-bce61054e926/volumes"
Nov 22 09:42:53 crc kubenswrapper[4743]: E1122 09:42:53.350105 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f04de83_846f_4690_95ae_bce61054e926.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f04de83_846f_4690_95ae_bce61054e926.slice/crio-00ccb9fafe10aca2593d33f3564a56c642247c19e02a344da0f6be9463c95f17\": RecentStats: unable to find data in memory cache]"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.428274 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"]
Nov 22 09:42:53 crc kubenswrapper[4743]: E1122 09:42:53.428684 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f04de83-846f-4690-95ae-bce61054e926" containerName="mariadb-client-2-default"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.428705 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f04de83-846f-4690-95ae-bce61054e926" containerName="mariadb-client-2-default"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.428908 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f04de83-846f-4690-95ae-bce61054e926" containerName="mariadb-client-2-default"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.429502 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.453249 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.512635 4743 scope.go:117] "RemoveContainer" containerID="89fe5ad61e4acf6a36eaa7287ba40af4308ac5cd551699233a3633da6c9a93df"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.512652 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.552192 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xpcb\" (UniqueName: \"kubernetes.io/projected/928e010f-1df6-4e94-8ece-f1ed470105b2-kube-api-access-4xpcb\") pod \"mariadb-client-1\" (UID: \"928e010f-1df6-4e94-8ece-f1ed470105b2\") " pod="openstack/mariadb-client-1"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.653855 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xpcb\" (UniqueName: \"kubernetes.io/projected/928e010f-1df6-4e94-8ece-f1ed470105b2-kube-api-access-4xpcb\") pod \"mariadb-client-1\" (UID: \"928e010f-1df6-4e94-8ece-f1ed470105b2\") " pod="openstack/mariadb-client-1"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.674994 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xpcb\" (UniqueName: \"kubernetes.io/projected/928e010f-1df6-4e94-8ece-f1ed470105b2-kube-api-access-4xpcb\") pod \"mariadb-client-1\" (UID: \"928e010f-1df6-4e94-8ece-f1ed470105b2\") " pod="openstack/mariadb-client-1"
Nov 22 09:42:53 crc kubenswrapper[4743]: I1122 09:42:53.754120 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Nov 22 09:42:54 crc kubenswrapper[4743]: I1122 09:42:54.085671 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"]
Nov 22 09:42:54 crc kubenswrapper[4743]: W1122 09:42:54.089911 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod928e010f_1df6_4e94_8ece_f1ed470105b2.slice/crio-7396b2f8d54e011e85e937ea04336f40e60fddd436fc189fac303f1581c6751e WatchSource:0}: Error finding container 7396b2f8d54e011e85e937ea04336f40e60fddd436fc189fac303f1581c6751e: Status 404 returned error can't find the container with id 7396b2f8d54e011e85e937ea04336f40e60fddd436fc189fac303f1581c6751e
Nov 22 09:42:54 crc kubenswrapper[4743]: I1122 09:42:54.530875 4743 generic.go:334] "Generic (PLEG): container finished" podID="928e010f-1df6-4e94-8ece-f1ed470105b2" containerID="da30802932f11d5acb6469776270b33a83e33b97c4805d6b42ab438ee6550422" exitCode=0
Nov 22 09:42:54 crc kubenswrapper[4743]: I1122 09:42:54.531599 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"928e010f-1df6-4e94-8ece-f1ed470105b2","Type":"ContainerDied","Data":"da30802932f11d5acb6469776270b33a83e33b97c4805d6b42ab438ee6550422"}
Nov 22 09:42:54 crc kubenswrapper[4743]: I1122 09:42:54.531827 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"928e010f-1df6-4e94-8ece-f1ed470105b2","Type":"ContainerStarted","Data":"7396b2f8d54e011e85e937ea04336f40e60fddd436fc189fac303f1581c6751e"}
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.039101 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.061963 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_928e010f-1df6-4e94-8ece-f1ed470105b2/mariadb-client-1/0.log"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.105171 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"]
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.118562 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"]
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.131652 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xpcb\" (UniqueName: \"kubernetes.io/projected/928e010f-1df6-4e94-8ece-f1ed470105b2-kube-api-access-4xpcb\") pod \"928e010f-1df6-4e94-8ece-f1ed470105b2\" (UID: \"928e010f-1df6-4e94-8ece-f1ed470105b2\") "
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.144970 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928e010f-1df6-4e94-8ece-f1ed470105b2-kube-api-access-4xpcb" (OuterVolumeSpecName: "kube-api-access-4xpcb") pod "928e010f-1df6-4e94-8ece-f1ed470105b2" (UID: "928e010f-1df6-4e94-8ece-f1ed470105b2"). InnerVolumeSpecName "kube-api-access-4xpcb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.234174 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xpcb\" (UniqueName: \"kubernetes.io/projected/928e010f-1df6-4e94-8ece-f1ed470105b2-kube-api-access-4xpcb\") on node \"crc\" DevicePath \"\""
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.552657 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.552561 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7396b2f8d54e011e85e937ea04336f40e60fddd436fc189fac303f1581c6751e"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.634906 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"]
Nov 22 09:42:56 crc kubenswrapper[4743]: E1122 09:42:56.635334 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928e010f-1df6-4e94-8ece-f1ed470105b2" containerName="mariadb-client-1"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.635352 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="928e010f-1df6-4e94-8ece-f1ed470105b2" containerName="mariadb-client-1"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.635556 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="928e010f-1df6-4e94-8ece-f1ed470105b2" containerName="mariadb-client-1"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.636383 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.640922 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nnsnp"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.649024 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"]
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.742534 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwxc\" (UniqueName: \"kubernetes.io/projected/0a87094f-5e30-4e8c-903e-481c9d7dc709-kube-api-access-npwxc\") pod \"mariadb-client-4-default\" (UID: \"0a87094f-5e30-4e8c-903e-481c9d7dc709\") " pod="openstack/mariadb-client-4-default"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.845026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npwxc\" (UniqueName: \"kubernetes.io/projected/0a87094f-5e30-4e8c-903e-481c9d7dc709-kube-api-access-npwxc\") pod \"mariadb-client-4-default\" (UID: \"0a87094f-5e30-4e8c-903e-481c9d7dc709\") " pod="openstack/mariadb-client-4-default"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.874797 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwxc\" (UniqueName: \"kubernetes.io/projected/0a87094f-5e30-4e8c-903e-481c9d7dc709-kube-api-access-npwxc\") pod \"mariadb-client-4-default\" (UID: \"0a87094f-5e30-4e8c-903e-481c9d7dc709\") " pod="openstack/mariadb-client-4-default"
Nov 22 09:42:56 crc kubenswrapper[4743]: I1122 09:42:56.982328 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 22 09:42:57 crc kubenswrapper[4743]: I1122 09:42:57.167113 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928e010f-1df6-4e94-8ece-f1ed470105b2" path="/var/lib/kubelet/pods/928e010f-1df6-4e94-8ece-f1ed470105b2/volumes" Nov 22 09:42:57 crc kubenswrapper[4743]: W1122 09:42:57.368895 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a87094f_5e30_4e8c_903e_481c9d7dc709.slice/crio-42a86bf2726a6a6a7d7b06de2d6d155986a8b248894abbbbf28efd01241bfd70 WatchSource:0}: Error finding container 42a86bf2726a6a6a7d7b06de2d6d155986a8b248894abbbbf28efd01241bfd70: Status 404 returned error can't find the container with id 42a86bf2726a6a6a7d7b06de2d6d155986a8b248894abbbbf28efd01241bfd70 Nov 22 09:42:57 crc kubenswrapper[4743]: I1122 09:42:57.371176 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 22 09:42:57 crc kubenswrapper[4743]: I1122 09:42:57.564406 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"0a87094f-5e30-4e8c-903e-481c9d7dc709","Type":"ContainerStarted","Data":"3825cced0ab5f9704258432dc4c4a3c1728c79f5c0d74e56fe2b292bff9185a9"} Nov 22 09:42:57 crc kubenswrapper[4743]: I1122 09:42:57.565003 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"0a87094f-5e30-4e8c-903e-481c9d7dc709","Type":"ContainerStarted","Data":"42a86bf2726a6a6a7d7b06de2d6d155986a8b248894abbbbf28efd01241bfd70"} Nov 22 09:42:57 crc kubenswrapper[4743]: I1122 09:42:57.579507 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-4-default" podStartSLOduration=1.579488191 podStartE2EDuration="1.579488191s" podCreationTimestamp="2025-11-22 09:42:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:42:57.575592609 +0000 UTC m=+4851.281953681" watchObservedRunningTime="2025-11-22 09:42:57.579488191 +0000 UTC m=+4851.285849243" Nov 22 09:42:57 crc kubenswrapper[4743]: I1122 09:42:57.623201 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_0a87094f-5e30-4e8c-903e-481c9d7dc709/mariadb-client-4-default/0.log" Nov 22 09:42:58 crc kubenswrapper[4743]: I1122 09:42:58.578444 4743 generic.go:334] "Generic (PLEG): container finished" podID="0a87094f-5e30-4e8c-903e-481c9d7dc709" containerID="3825cced0ab5f9704258432dc4c4a3c1728c79f5c0d74e56fe2b292bff9185a9" exitCode=0 Nov 22 09:42:58 crc kubenswrapper[4743]: I1122 09:42:58.578635 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"0a87094f-5e30-4e8c-903e-481c9d7dc709","Type":"ContainerDied","Data":"3825cced0ab5f9704258432dc4c4a3c1728c79f5c0d74e56fe2b292bff9185a9"} Nov 22 09:42:59 crc kubenswrapper[4743]: I1122 09:42:59.953034 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 22 09:43:00 crc kubenswrapper[4743]: I1122 09:43:00.002190 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 22 09:43:00 crc kubenswrapper[4743]: I1122 09:43:00.020951 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Nov 22 09:43:00 crc kubenswrapper[4743]: I1122 09:43:00.106536 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npwxc\" (UniqueName: \"kubernetes.io/projected/0a87094f-5e30-4e8c-903e-481c9d7dc709-kube-api-access-npwxc\") pod \"0a87094f-5e30-4e8c-903e-481c9d7dc709\" (UID: \"0a87094f-5e30-4e8c-903e-481c9d7dc709\") " Nov 22 09:43:00 crc kubenswrapper[4743]: I1122 09:43:00.114932 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a87094f-5e30-4e8c-903e-481c9d7dc709-kube-api-access-npwxc" (OuterVolumeSpecName: "kube-api-access-npwxc") pod "0a87094f-5e30-4e8c-903e-481c9d7dc709" (UID: "0a87094f-5e30-4e8c-903e-481c9d7dc709"). InnerVolumeSpecName "kube-api-access-npwxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:43:00 crc kubenswrapper[4743]: I1122 09:43:00.208788 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npwxc\" (UniqueName: \"kubernetes.io/projected/0a87094f-5e30-4e8c-903e-481c9d7dc709-kube-api-access-npwxc\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:00 crc kubenswrapper[4743]: I1122 09:43:00.604200 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42a86bf2726a6a6a7d7b06de2d6d155986a8b248894abbbbf28efd01241bfd70" Nov 22 09:43:00 crc kubenswrapper[4743]: I1122 09:43:00.604273 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Nov 22 09:43:01 crc kubenswrapper[4743]: I1122 09:43:01.185435 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a87094f-5e30-4e8c-903e-481c9d7dc709" path="/var/lib/kubelet/pods/0a87094f-5e30-4e8c-903e-481c9d7dc709/volumes" Nov 22 09:43:01 crc kubenswrapper[4743]: I1122 09:43:01.241337 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:43:01 crc kubenswrapper[4743]: I1122 09:43:01.241488 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:43:04 crc kubenswrapper[4743]: I1122 09:43:04.657287 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Nov 22 09:43:04 crc kubenswrapper[4743]: E1122 09:43:04.658107 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a87094f-5e30-4e8c-903e-481c9d7dc709" containerName="mariadb-client-4-default" Nov 22 09:43:04 crc kubenswrapper[4743]: I1122 09:43:04.658120 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a87094f-5e30-4e8c-903e-481c9d7dc709" containerName="mariadb-client-4-default" Nov 22 09:43:04 crc kubenswrapper[4743]: I1122 09:43:04.658252 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a87094f-5e30-4e8c-903e-481c9d7dc709" containerName="mariadb-client-4-default" Nov 22 09:43:04 crc kubenswrapper[4743]: I1122 09:43:04.658773 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 22 09:43:04 crc kubenswrapper[4743]: I1122 09:43:04.660815 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nnsnp" Nov 22 09:43:04 crc kubenswrapper[4743]: I1122 09:43:04.678698 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 22 09:43:04 crc kubenswrapper[4743]: I1122 09:43:04.787278 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92vz8\" (UniqueName: \"kubernetes.io/projected/a62eaf82-4002-473e-bf4f-d82ce4a85cd0-kube-api-access-92vz8\") pod \"mariadb-client-5-default\" (UID: \"a62eaf82-4002-473e-bf4f-d82ce4a85cd0\") " pod="openstack/mariadb-client-5-default" Nov 22 09:43:04 crc kubenswrapper[4743]: I1122 09:43:04.888556 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92vz8\" (UniqueName: \"kubernetes.io/projected/a62eaf82-4002-473e-bf4f-d82ce4a85cd0-kube-api-access-92vz8\") pod \"mariadb-client-5-default\" (UID: \"a62eaf82-4002-473e-bf4f-d82ce4a85cd0\") " pod="openstack/mariadb-client-5-default" Nov 22 09:43:04 crc kubenswrapper[4743]: I1122 09:43:04.911093 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92vz8\" (UniqueName: \"kubernetes.io/projected/a62eaf82-4002-473e-bf4f-d82ce4a85cd0-kube-api-access-92vz8\") pod \"mariadb-client-5-default\" (UID: \"a62eaf82-4002-473e-bf4f-d82ce4a85cd0\") " pod="openstack/mariadb-client-5-default" Nov 22 09:43:04 crc kubenswrapper[4743]: I1122 09:43:04.982457 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 22 09:43:05 crc kubenswrapper[4743]: I1122 09:43:05.363751 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 22 09:43:05 crc kubenswrapper[4743]: I1122 09:43:05.647308 4743 generic.go:334] "Generic (PLEG): container finished" podID="a62eaf82-4002-473e-bf4f-d82ce4a85cd0" containerID="6a114d18d9af3eb833259b7655204dd978f5e9f25b443ee736d767257190d893" exitCode=0 Nov 22 09:43:05 crc kubenswrapper[4743]: I1122 09:43:05.647353 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"a62eaf82-4002-473e-bf4f-d82ce4a85cd0","Type":"ContainerDied","Data":"6a114d18d9af3eb833259b7655204dd978f5e9f25b443ee736d767257190d893"} Nov 22 09:43:05 crc kubenswrapper[4743]: I1122 09:43:05.647381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"a62eaf82-4002-473e-bf4f-d82ce4a85cd0","Type":"ContainerStarted","Data":"8ef8e1235ec6256a31a915ca17bb58fe6436b20e4c9e8ba32e0d886941c471e7"} Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.041925 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.065338 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_a62eaf82-4002-473e-bf4f-d82ce4a85cd0/mariadb-client-5-default/0.log" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.096375 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.104228 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.126876 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92vz8\" (UniqueName: \"kubernetes.io/projected/a62eaf82-4002-473e-bf4f-d82ce4a85cd0-kube-api-access-92vz8\") pod \"a62eaf82-4002-473e-bf4f-d82ce4a85cd0\" (UID: \"a62eaf82-4002-473e-bf4f-d82ce4a85cd0\") " Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.135069 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62eaf82-4002-473e-bf4f-d82ce4a85cd0-kube-api-access-92vz8" (OuterVolumeSpecName: "kube-api-access-92vz8") pod "a62eaf82-4002-473e-bf4f-d82ce4a85cd0" (UID: "a62eaf82-4002-473e-bf4f-d82ce4a85cd0"). InnerVolumeSpecName "kube-api-access-92vz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.174667 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a62eaf82-4002-473e-bf4f-d82ce4a85cd0" path="/var/lib/kubelet/pods/a62eaf82-4002-473e-bf4f-d82ce4a85cd0/volumes" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.228961 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92vz8\" (UniqueName: \"kubernetes.io/projected/a62eaf82-4002-473e-bf4f-d82ce4a85cd0-kube-api-access-92vz8\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.262450 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Nov 22 09:43:07 crc kubenswrapper[4743]: E1122 09:43:07.262918 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62eaf82-4002-473e-bf4f-d82ce4a85cd0" containerName="mariadb-client-5-default" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.262935 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62eaf82-4002-473e-bf4f-d82ce4a85cd0" containerName="mariadb-client-5-default" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.263134 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a62eaf82-4002-473e-bf4f-d82ce4a85cd0" containerName="mariadb-client-5-default" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.263898 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.270393 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.330443 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgg7b\" (UniqueName: \"kubernetes.io/projected/5521954f-16dc-4d49-bf60-cf6f45f16315-kube-api-access-lgg7b\") pod \"mariadb-client-6-default\" (UID: \"5521954f-16dc-4d49-bf60-cf6f45f16315\") " pod="openstack/mariadb-client-6-default" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.432026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgg7b\" (UniqueName: \"kubernetes.io/projected/5521954f-16dc-4d49-bf60-cf6f45f16315-kube-api-access-lgg7b\") pod \"mariadb-client-6-default\" (UID: \"5521954f-16dc-4d49-bf60-cf6f45f16315\") " pod="openstack/mariadb-client-6-default" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.452380 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgg7b\" (UniqueName: \"kubernetes.io/projected/5521954f-16dc-4d49-bf60-cf6f45f16315-kube-api-access-lgg7b\") pod \"mariadb-client-6-default\" (UID: \"5521954f-16dc-4d49-bf60-cf6f45f16315\") " pod="openstack/mariadb-client-6-default" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.584341 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.668103 4743 scope.go:117] "RemoveContainer" containerID="6a114d18d9af3eb833259b7655204dd978f5e9f25b443ee736d767257190d893" Nov 22 09:43:07 crc kubenswrapper[4743]: I1122 09:43:07.668251 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Nov 22 09:43:08 crc kubenswrapper[4743]: I1122 09:43:08.197838 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 22 09:43:08 crc kubenswrapper[4743]: W1122 09:43:08.205460 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5521954f_16dc_4d49_bf60_cf6f45f16315.slice/crio-483affe5689dc3fa735031d96dd9e1da46f1a24206cd3040deea347094bf8db6 WatchSource:0}: Error finding container 483affe5689dc3fa735031d96dd9e1da46f1a24206cd3040deea347094bf8db6: Status 404 returned error can't find the container with id 483affe5689dc3fa735031d96dd9e1da46f1a24206cd3040deea347094bf8db6 Nov 22 09:43:08 crc kubenswrapper[4743]: I1122 09:43:08.678867 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5521954f-16dc-4d49-bf60-cf6f45f16315","Type":"ContainerStarted","Data":"ea6f8664d83708a8020da7b04b588f603641b2056f660e6943a7e246085f6c78"} Nov 22 09:43:08 crc kubenswrapper[4743]: I1122 09:43:08.679474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5521954f-16dc-4d49-bf60-cf6f45f16315","Type":"ContainerStarted","Data":"483affe5689dc3fa735031d96dd9e1da46f1a24206cd3040deea347094bf8db6"} Nov 22 09:43:08 crc kubenswrapper[4743]: I1122 09:43:08.696517 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.696492561 podStartE2EDuration="1.696492561s" podCreationTimestamp="2025-11-22 09:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:43:08.69507247 +0000 UTC m=+4862.401433532" watchObservedRunningTime="2025-11-22 09:43:08.696492561 +0000 UTC m=+4862.402853613" Nov 22 09:43:09 crc kubenswrapper[4743]: I1122 09:43:09.696992 4743 generic.go:334] "Generic (PLEG): container finished" podID="5521954f-16dc-4d49-bf60-cf6f45f16315" containerID="ea6f8664d83708a8020da7b04b588f603641b2056f660e6943a7e246085f6c78" exitCode=1 Nov 22 09:43:09 crc kubenswrapper[4743]: I1122 09:43:09.697046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"5521954f-16dc-4d49-bf60-cf6f45f16315","Type":"ContainerDied","Data":"ea6f8664d83708a8020da7b04b588f603641b2056f660e6943a7e246085f6c78"} Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.155467 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.195971 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.201141 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.319073 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgg7b\" (UniqueName: \"kubernetes.io/projected/5521954f-16dc-4d49-bf60-cf6f45f16315-kube-api-access-lgg7b\") pod \"5521954f-16dc-4d49-bf60-cf6f45f16315\" (UID: \"5521954f-16dc-4d49-bf60-cf6f45f16315\") " Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.330811 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5521954f-16dc-4d49-bf60-cf6f45f16315-kube-api-access-lgg7b" (OuterVolumeSpecName: "kube-api-access-lgg7b") pod "5521954f-16dc-4d49-bf60-cf6f45f16315" (UID: "5521954f-16dc-4d49-bf60-cf6f45f16315"). InnerVolumeSpecName "kube-api-access-lgg7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.347760 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Nov 22 09:43:11 crc kubenswrapper[4743]: E1122 09:43:11.348096 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5521954f-16dc-4d49-bf60-cf6f45f16315" containerName="mariadb-client-6-default" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.348115 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5521954f-16dc-4d49-bf60-cf6f45f16315" containerName="mariadb-client-6-default" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.348253 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5521954f-16dc-4d49-bf60-cf6f45f16315" containerName="mariadb-client-6-default" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.348754 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.354336 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.422239 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgg7b\" (UniqueName: \"kubernetes.io/projected/5521954f-16dc-4d49-bf60-cf6f45f16315-kube-api-access-lgg7b\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.523731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spdlb\" (UniqueName: \"kubernetes.io/projected/787e3025-bf42-429d-94a8-04af96a8120d-kube-api-access-spdlb\") pod \"mariadb-client-7-default\" (UID: \"787e3025-bf42-429d-94a8-04af96a8120d\") " pod="openstack/mariadb-client-7-default" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.625219 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spdlb\" (UniqueName: \"kubernetes.io/projected/787e3025-bf42-429d-94a8-04af96a8120d-kube-api-access-spdlb\") pod \"mariadb-client-7-default\" (UID: \"787e3025-bf42-429d-94a8-04af96a8120d\") " pod="openstack/mariadb-client-7-default" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.655255 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spdlb\" (UniqueName: \"kubernetes.io/projected/787e3025-bf42-429d-94a8-04af96a8120d-kube-api-access-spdlb\") pod \"mariadb-client-7-default\" (UID: \"787e3025-bf42-429d-94a8-04af96a8120d\") " pod="openstack/mariadb-client-7-default" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.683835 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.716682 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="483affe5689dc3fa735031d96dd9e1da46f1a24206cd3040deea347094bf8db6" Nov 22 09:43:11 crc kubenswrapper[4743]: I1122 09:43:11.716806 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Nov 22 09:43:12 crc kubenswrapper[4743]: W1122 09:43:12.290824 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod787e3025_bf42_429d_94a8_04af96a8120d.slice/crio-229181ea6d260a7b9cdaf0cb700ea57084cad8d17318c6c080338952bee30ba3 WatchSource:0}: Error finding container 229181ea6d260a7b9cdaf0cb700ea57084cad8d17318c6c080338952bee30ba3: Status 404 returned error can't find the container with id 229181ea6d260a7b9cdaf0cb700ea57084cad8d17318c6c080338952bee30ba3 Nov 22 09:43:12 crc kubenswrapper[4743]: I1122 09:43:12.291392 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 22 09:43:12 crc kubenswrapper[4743]: I1122 09:43:12.724873 4743 generic.go:334] "Generic (PLEG): container finished" podID="787e3025-bf42-429d-94a8-04af96a8120d" containerID="5f052d4dd791b113a47dc232d2ce4a7ff66a08fe0ce8fa891354c8cc4631668a" exitCode=0 Nov 22 09:43:12 crc kubenswrapper[4743]: I1122 09:43:12.724980 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"787e3025-bf42-429d-94a8-04af96a8120d","Type":"ContainerDied","Data":"5f052d4dd791b113a47dc232d2ce4a7ff66a08fe0ce8fa891354c8cc4631668a"} Nov 22 09:43:12 crc kubenswrapper[4743]: I1122 09:43:12.725195 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"787e3025-bf42-429d-94a8-04af96a8120d","Type":"ContainerStarted","Data":"229181ea6d260a7b9cdaf0cb700ea57084cad8d17318c6c080338952bee30ba3"} Nov 22 09:43:13 crc kubenswrapper[4743]: I1122 09:43:13.167503 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5521954f-16dc-4d49-bf60-cf6f45f16315" path="/var/lib/kubelet/pods/5521954f-16dc-4d49-bf60-cf6f45f16315/volumes" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.132432 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.150870 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_787e3025-bf42-429d-94a8-04af96a8120d/mariadb-client-7-default/0.log" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.172289 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.177025 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.288318 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spdlb\" (UniqueName: \"kubernetes.io/projected/787e3025-bf42-429d-94a8-04af96a8120d-kube-api-access-spdlb\") pod \"787e3025-bf42-429d-94a8-04af96a8120d\" (UID: \"787e3025-bf42-429d-94a8-04af96a8120d\") " Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.295093 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/787e3025-bf42-429d-94a8-04af96a8120d-kube-api-access-spdlb" (OuterVolumeSpecName: "kube-api-access-spdlb") pod "787e3025-bf42-429d-94a8-04af96a8120d" (UID: "787e3025-bf42-429d-94a8-04af96a8120d"). InnerVolumeSpecName "kube-api-access-spdlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.345712 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Nov 22 09:43:14 crc kubenswrapper[4743]: E1122 09:43:14.346273 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="787e3025-bf42-429d-94a8-04af96a8120d" containerName="mariadb-client-7-default" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.346297 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="787e3025-bf42-429d-94a8-04af96a8120d" containerName="mariadb-client-7-default" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.346508 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="787e3025-bf42-429d-94a8-04af96a8120d" containerName="mariadb-client-7-default" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.354265 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.354409 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.390997 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbpzv\" (UniqueName: \"kubernetes.io/projected/e77f2169-8f5d-4f4e-8cf9-e832fc287b8f-kube-api-access-tbpzv\") pod \"mariadb-client-2\" (UID: \"e77f2169-8f5d-4f4e-8cf9-e832fc287b8f\") " pod="openstack/mariadb-client-2" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.391104 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spdlb\" (UniqueName: \"kubernetes.io/projected/787e3025-bf42-429d-94a8-04af96a8120d-kube-api-access-spdlb\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.492395 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbpzv\" (UniqueName: \"kubernetes.io/projected/e77f2169-8f5d-4f4e-8cf9-e832fc287b8f-kube-api-access-tbpzv\") pod \"mariadb-client-2\" (UID: \"e77f2169-8f5d-4f4e-8cf9-e832fc287b8f\") " pod="openstack/mariadb-client-2" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.517164 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbpzv\" (UniqueName: \"kubernetes.io/projected/e77f2169-8f5d-4f4e-8cf9-e832fc287b8f-kube-api-access-tbpzv\") pod \"mariadb-client-2\" (UID: \"e77f2169-8f5d-4f4e-8cf9-e832fc287b8f\") " pod="openstack/mariadb-client-2" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.677298 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.771964 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="229181ea6d260a7b9cdaf0cb700ea57084cad8d17318c6c080338952bee30ba3" Nov 22 09:43:14 crc kubenswrapper[4743]: I1122 09:43:14.772061 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Nov 22 09:43:15 crc kubenswrapper[4743]: I1122 09:43:15.166627 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="787e3025-bf42-429d-94a8-04af96a8120d" path="/var/lib/kubelet/pods/787e3025-bf42-429d-94a8-04af96a8120d/volumes" Nov 22 09:43:15 crc kubenswrapper[4743]: I1122 09:43:15.261762 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Nov 22 09:43:15 crc kubenswrapper[4743]: W1122 09:43:15.271055 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode77f2169_8f5d_4f4e_8cf9_e832fc287b8f.slice/crio-eeb0d1ef1b8a465b1cffc871bed2bc58aeffbcf3c15e4c7fcd82844577072960 WatchSource:0}: Error finding container eeb0d1ef1b8a465b1cffc871bed2bc58aeffbcf3c15e4c7fcd82844577072960: Status 404 returned error can't find the container with id eeb0d1ef1b8a465b1cffc871bed2bc58aeffbcf3c15e4c7fcd82844577072960 Nov 22 09:43:15 crc kubenswrapper[4743]: I1122 09:43:15.782822 4743 generic.go:334] "Generic (PLEG): container finished" podID="e77f2169-8f5d-4f4e-8cf9-e832fc287b8f" containerID="a5e0510666351e4e061fdf752a6ac8aba616018430171a77a6142944a2d00a1e" exitCode=0 Nov 22 09:43:15 crc kubenswrapper[4743]: I1122 09:43:15.782889 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"e77f2169-8f5d-4f4e-8cf9-e832fc287b8f","Type":"ContainerDied","Data":"a5e0510666351e4e061fdf752a6ac8aba616018430171a77a6142944a2d00a1e"} Nov 22 09:43:15 crc kubenswrapper[4743]: I1122 09:43:15.782933 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"e77f2169-8f5d-4f4e-8cf9-e832fc287b8f","Type":"ContainerStarted","Data":"eeb0d1ef1b8a465b1cffc871bed2bc58aeffbcf3c15e4c7fcd82844577072960"} Nov 22 09:43:17 crc kubenswrapper[4743]: I1122 09:43:17.264174 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Nov 22 09:43:17 crc kubenswrapper[4743]: I1122 09:43:17.282613 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_e77f2169-8f5d-4f4e-8cf9-e832fc287b8f/mariadb-client-2/0.log" Nov 22 09:43:17 crc kubenswrapper[4743]: I1122 09:43:17.322203 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Nov 22 09:43:17 crc kubenswrapper[4743]: I1122 09:43:17.329587 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Nov 22 09:43:17 crc kubenswrapper[4743]: I1122 09:43:17.448297 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbpzv\" (UniqueName: \"kubernetes.io/projected/e77f2169-8f5d-4f4e-8cf9-e832fc287b8f-kube-api-access-tbpzv\") pod \"e77f2169-8f5d-4f4e-8cf9-e832fc287b8f\" (UID: \"e77f2169-8f5d-4f4e-8cf9-e832fc287b8f\") " Nov 22 09:43:17 crc kubenswrapper[4743]: I1122 09:43:17.458501 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e77f2169-8f5d-4f4e-8cf9-e832fc287b8f-kube-api-access-tbpzv" (OuterVolumeSpecName: "kube-api-access-tbpzv") pod "e77f2169-8f5d-4f4e-8cf9-e832fc287b8f" (UID: "e77f2169-8f5d-4f4e-8cf9-e832fc287b8f"). InnerVolumeSpecName "kube-api-access-tbpzv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:43:17 crc kubenswrapper[4743]: I1122 09:43:17.550765 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbpzv\" (UniqueName: \"kubernetes.io/projected/e77f2169-8f5d-4f4e-8cf9-e832fc287b8f-kube-api-access-tbpzv\") on node \"crc\" DevicePath \"\"" Nov 22 09:43:17 crc kubenswrapper[4743]: I1122 09:43:17.800670 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eeb0d1ef1b8a465b1cffc871bed2bc58aeffbcf3c15e4c7fcd82844577072960" Nov 22 09:43:17 crc kubenswrapper[4743]: I1122 09:43:17.800712 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Nov 22 09:43:19 crc kubenswrapper[4743]: I1122 09:43:19.170090 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e77f2169-8f5d-4f4e-8cf9-e832fc287b8f" path="/var/lib/kubelet/pods/e77f2169-8f5d-4f4e-8cf9-e832fc287b8f/volumes" Nov 22 09:43:31 crc kubenswrapper[4743]: I1122 09:43:31.241892 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:43:31 crc kubenswrapper[4743]: I1122 09:43:31.242754 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:43:31 crc kubenswrapper[4743]: I1122 09:43:31.748714 4743 scope.go:117] "RemoveContainer" containerID="4fa8b8a76399829b269c653da64f2d72e76f48f14658f0bc2f878402abcc19f6" Nov 22 09:44:01 crc kubenswrapper[4743]: I1122 09:44:01.241514 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:44:01 crc kubenswrapper[4743]: I1122 09:44:01.242654 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:44:01 crc kubenswrapper[4743]: I1122 09:44:01.242721 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 09:44:01 crc kubenswrapper[4743]: I1122 09:44:01.243501 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4146b0d7d55e01b0e59622b589a7a214eef363f3e0d0b2a21abc2d4eaf2f55f5"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:44:01 crc kubenswrapper[4743]: I1122 09:44:01.243566 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" 
containerName="machine-config-daemon" containerID="cri-o://4146b0d7d55e01b0e59622b589a7a214eef363f3e0d0b2a21abc2d4eaf2f55f5" gracePeriod=600 Nov 22 09:44:02 crc kubenswrapper[4743]: I1122 09:44:02.237167 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="4146b0d7d55e01b0e59622b589a7a214eef363f3e0d0b2a21abc2d4eaf2f55f5" exitCode=0 Nov 22 09:44:02 crc kubenswrapper[4743]: I1122 09:44:02.237246 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"4146b0d7d55e01b0e59622b589a7a214eef363f3e0d0b2a21abc2d4eaf2f55f5"} Nov 22 09:44:02 crc kubenswrapper[4743]: I1122 09:44:02.237751 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56"} Nov 22 09:44:02 crc kubenswrapper[4743]: I1122 09:44:02.237781 4743 scope.go:117] "RemoveContainer" containerID="81751a02fdf5f710ffee28651d9e39ea51d9826ab4e8d7a0a2d1c7ed264189ae" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.156709 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7"] Nov 22 09:45:00 crc kubenswrapper[4743]: E1122 09:45:00.158019 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e77f2169-8f5d-4f4e-8cf9-e832fc287b8f" containerName="mariadb-client-2" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.158040 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77f2169-8f5d-4f4e-8cf9-e832fc287b8f" containerName="mariadb-client-2" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.158271 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e77f2169-8f5d-4f4e-8cf9-e832fc287b8f" containerName="mariadb-client-2" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.159095 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.164480 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.164740 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.179032 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7"] Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.303466 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b2454e-4ff4-42b3-aed8-fe654256639a-config-volume\") pod \"collect-profiles-29396745-t9fk7\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.303559 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxgfx\" (UniqueName: \"kubernetes.io/projected/d9b2454e-4ff4-42b3-aed8-fe654256639a-kube-api-access-fxgfx\") pod \"collect-profiles-29396745-t9fk7\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.303837 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b2454e-4ff4-42b3-aed8-fe654256639a-secret-volume\") pod \"collect-profiles-29396745-t9fk7\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.405969 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b2454e-4ff4-42b3-aed8-fe654256639a-config-volume\") pod \"collect-profiles-29396745-t9fk7\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.406046 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxgfx\" (UniqueName: \"kubernetes.io/projected/d9b2454e-4ff4-42b3-aed8-fe654256639a-kube-api-access-fxgfx\") pod \"collect-profiles-29396745-t9fk7\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.406081 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b2454e-4ff4-42b3-aed8-fe654256639a-secret-volume\") pod \"collect-profiles-29396745-t9fk7\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.407379 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b2454e-4ff4-42b3-aed8-fe654256639a-config-volume\") pod 
\"collect-profiles-29396745-t9fk7\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.430747 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b2454e-4ff4-42b3-aed8-fe654256639a-secret-volume\") pod \"collect-profiles-29396745-t9fk7\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.460202 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxgfx\" (UniqueName: \"kubernetes.io/projected/d9b2454e-4ff4-42b3-aed8-fe654256639a-kube-api-access-fxgfx\") pod \"collect-profiles-29396745-t9fk7\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.487874 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:00 crc kubenswrapper[4743]: I1122 09:45:00.914868 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7"] Nov 22 09:45:01 crc kubenswrapper[4743]: I1122 09:45:01.827856 4743 generic.go:334] "Generic (PLEG): container finished" podID="d9b2454e-4ff4-42b3-aed8-fe654256639a" containerID="fc19a6ab174358a3de2825506376a50ca1ce69796262542d83926eef3cbe9193" exitCode=0 Nov 22 09:45:01 crc kubenswrapper[4743]: I1122 09:45:01.828105 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" event={"ID":"d9b2454e-4ff4-42b3-aed8-fe654256639a","Type":"ContainerDied","Data":"fc19a6ab174358a3de2825506376a50ca1ce69796262542d83926eef3cbe9193"} Nov 22 09:45:01 crc kubenswrapper[4743]: I1122 09:45:01.828384 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" event={"ID":"d9b2454e-4ff4-42b3-aed8-fe654256639a","Type":"ContainerStarted","Data":"e9d75ebc88b22350839a3b375f4481fb399f44f87690752c067e8961de3575b5"} Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.222594 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.265049 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b2454e-4ff4-42b3-aed8-fe654256639a-config-volume\") pod \"d9b2454e-4ff4-42b3-aed8-fe654256639a\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.265124 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b2454e-4ff4-42b3-aed8-fe654256639a-secret-volume\") pod \"d9b2454e-4ff4-42b3-aed8-fe654256639a\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.265182 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxgfx\" (UniqueName: \"kubernetes.io/projected/d9b2454e-4ff4-42b3-aed8-fe654256639a-kube-api-access-fxgfx\") pod \"d9b2454e-4ff4-42b3-aed8-fe654256639a\" (UID: \"d9b2454e-4ff4-42b3-aed8-fe654256639a\") " Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.266962 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9b2454e-4ff4-42b3-aed8-fe654256639a-config-volume" (OuterVolumeSpecName: "config-volume") pod "d9b2454e-4ff4-42b3-aed8-fe654256639a" (UID: "d9b2454e-4ff4-42b3-aed8-fe654256639a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.273791 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9b2454e-4ff4-42b3-aed8-fe654256639a-kube-api-access-fxgfx" (OuterVolumeSpecName: "kube-api-access-fxgfx") pod "d9b2454e-4ff4-42b3-aed8-fe654256639a" (UID: "d9b2454e-4ff4-42b3-aed8-fe654256639a"). InnerVolumeSpecName "kube-api-access-fxgfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.273990 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9b2454e-4ff4-42b3-aed8-fe654256639a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d9b2454e-4ff4-42b3-aed8-fe654256639a" (UID: "d9b2454e-4ff4-42b3-aed8-fe654256639a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.367431 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9b2454e-4ff4-42b3-aed8-fe654256639a-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.367512 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxgfx\" (UniqueName: \"kubernetes.io/projected/d9b2454e-4ff4-42b3-aed8-fe654256639a-kube-api-access-fxgfx\") on node \"crc\" DevicePath \"\"" Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.367537 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9b2454e-4ff4-42b3-aed8-fe654256639a-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.852348 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" event={"ID":"d9b2454e-4ff4-42b3-aed8-fe654256639a","Type":"ContainerDied","Data":"e9d75ebc88b22350839a3b375f4481fb399f44f87690752c067e8961de3575b5"} Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.852396 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9d75ebc88b22350839a3b375f4481fb399f44f87690752c067e8961de3575b5" Nov 22 09:45:03 crc kubenswrapper[4743]: I1122 09:45:03.852480 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7" Nov 22 09:45:04 crc kubenswrapper[4743]: I1122 09:45:04.313432 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"] Nov 22 09:45:04 crc kubenswrapper[4743]: I1122 09:45:04.325766 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396700-8m5c6"] Nov 22 09:45:05 crc kubenswrapper[4743]: I1122 09:45:05.163858 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7efdba6-7144-4590-855a-3b93a8edd588" path="/var/lib/kubelet/pods/c7efdba6-7144-4590-855a-3b93a8edd588/volumes" Nov 22 09:45:31 crc kubenswrapper[4743]: I1122 09:45:31.857569 4743 scope.go:117] "RemoveContainer" containerID="7d80e9fa94f3a615cb8b38f8b50e944d3b11f628ab538f3277f8eb46ed6bc1eb" Nov 22 09:46:01 crc kubenswrapper[4743]: I1122 09:46:01.241950 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:46:01 crc kubenswrapper[4743]: I1122 09:46:01.243101 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:46:31 crc kubenswrapper[4743]: I1122 09:46:31.241900 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Nov 22 09:46:31 crc kubenswrapper[4743]: I1122 09:46:31.242552 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:47:01 crc kubenswrapper[4743]: I1122 09:47:01.241796 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:47:01 crc kubenswrapper[4743]: I1122 09:47:01.242401 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:47:01 crc kubenswrapper[4743]: I1122 09:47:01.242451 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 09:47:01 crc kubenswrapper[4743]: I1122 09:47:01.243117 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:47:01 crc kubenswrapper[4743]: I1122 09:47:01.243181 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" gracePeriod=600 Nov 22 09:47:01 crc kubenswrapper[4743]: E1122 09:47:01.361001 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:47:02 crc kubenswrapper[4743]: I1122 09:47:02.084960 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" exitCode=0 Nov 22 09:47:02 crc kubenswrapper[4743]: I1122 09:47:02.085032 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56"} Nov 22 09:47:02 crc kubenswrapper[4743]: I1122 09:47:02.085082 4743 scope.go:117] "RemoveContainer" containerID="4146b0d7d55e01b0e59622b589a7a214eef363f3e0d0b2a21abc2d4eaf2f55f5" Nov 22 09:47:02 crc kubenswrapper[4743]: I1122 09:47:02.085842 4743 scope.go:117] "RemoveContainer" 
containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:47:02 crc kubenswrapper[4743]: E1122 09:47:02.086218 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:47:16 crc kubenswrapper[4743]: I1122 09:47:16.151327 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:47:16 crc kubenswrapper[4743]: E1122 09:47:16.151993 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:47:30 crc kubenswrapper[4743]: I1122 09:47:30.153003 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:47:30 crc kubenswrapper[4743]: E1122 09:47:30.154459 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:47:31 crc kubenswrapper[4743]: I1122 09:47:31.951631 4743 scope.go:117] "RemoveContainer" containerID="968adc6c9d405475d4d1b1c69c86a0ec387890f0b790fce61334cfba80d49542" Nov 22 09:47:31 crc kubenswrapper[4743]: I1122 09:47:31.984869 4743 scope.go:117] "RemoveContainer" containerID="d8be5d0cd3105bbfbececc0c4f7f6731429ddf5049ca3a78edf22dd37d12ba92" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.555030 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Nov 22 09:47:38 crc kubenswrapper[4743]: E1122 09:47:38.555903 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9b2454e-4ff4-42b3-aed8-fe654256639a" containerName="collect-profiles" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.555919 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9b2454e-4ff4-42b3-aed8-fe654256639a" containerName="collect-profiles" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.556075 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9b2454e-4ff4-42b3-aed8-fe654256639a" containerName="collect-profiles" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.556686 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.561432 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-nnsnp" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.564503 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.641888 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8s4n\" (UniqueName: \"kubernetes.io/projected/443c2a25-4980-472c-ab82-682e852ee9ba-kube-api-access-r8s4n\") pod \"mariadb-copy-data\" (UID: \"443c2a25-4980-472c-ab82-682e852ee9ba\") " pod="openstack/mariadb-copy-data" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.642370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-75e6afb1-8e09-4829-9796-7e506a806635\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75e6afb1-8e09-4829-9796-7e506a806635\") pod \"mariadb-copy-data\" (UID: \"443c2a25-4980-472c-ab82-682e852ee9ba\") " pod="openstack/mariadb-copy-data" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.743436 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-75e6afb1-8e09-4829-9796-7e506a806635\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75e6afb1-8e09-4829-9796-7e506a806635\") pod \"mariadb-copy-data\" (UID: \"443c2a25-4980-472c-ab82-682e852ee9ba\") " pod="openstack/mariadb-copy-data" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.743635 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8s4n\" (UniqueName: \"kubernetes.io/projected/443c2a25-4980-472c-ab82-682e852ee9ba-kube-api-access-r8s4n\") pod \"mariadb-copy-data\" (UID: \"443c2a25-4980-472c-ab82-682e852ee9ba\") " pod="openstack/mariadb-copy-data" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.745902 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.745944 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-75e6afb1-8e09-4829-9796-7e506a806635\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75e6afb1-8e09-4829-9796-7e506a806635\") pod \"mariadb-copy-data\" (UID: \"443c2a25-4980-472c-ab82-682e852ee9ba\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6e761de8a26cf65fcf58b427314142009078954bbcd6ac917ee4b04e8fbc9645/globalmount\"" pod="openstack/mariadb-copy-data" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.766571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8s4n\" (UniqueName: \"kubernetes.io/projected/443c2a25-4980-472c-ab82-682e852ee9ba-kube-api-access-r8s4n\") pod \"mariadb-copy-data\" (UID: \"443c2a25-4980-472c-ab82-682e852ee9ba\") " pod="openstack/mariadb-copy-data" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.780041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-75e6afb1-8e09-4829-9796-7e506a806635\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75e6afb1-8e09-4829-9796-7e506a806635\") pod \"mariadb-copy-data\" (UID: \"443c2a25-4980-472c-ab82-682e852ee9ba\") " pod="openstack/mariadb-copy-data" Nov 22 09:47:38 crc kubenswrapper[4743]: I1122 09:47:38.874690 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Nov 22 09:47:39 crc kubenswrapper[4743]: I1122 09:47:39.393062 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Nov 22 09:47:39 crc kubenswrapper[4743]: I1122 09:47:39.459671 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"443c2a25-4980-472c-ab82-682e852ee9ba","Type":"ContainerStarted","Data":"1598ee464443275dc11c09d52cbc6886bfc577dd15c7411a5248e78782c257a8"} Nov 22 09:47:40 crc kubenswrapper[4743]: I1122 09:47:40.470243 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"443c2a25-4980-472c-ab82-682e852ee9ba","Type":"ContainerStarted","Data":"a1d512483f974358dbb81edbdb2fa7f5965b74263ea1b2745b16187ba9db430a"} Nov 22 09:47:40 crc kubenswrapper[4743]: I1122 09:47:40.485398 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.485382325 podStartE2EDuration="3.485382325s" podCreationTimestamp="2025-11-22 09:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:47:40.482838332 +0000 UTC m=+5134.189199424" watchObservedRunningTime="2025-11-22 09:47:40.485382325 +0000 UTC m=+5134.191743377" Nov 22 09:47:42 crc kubenswrapper[4743]: I1122 09:47:42.151502 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:47:42 crc kubenswrapper[4743]: E1122 09:47:42.152011 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:47:43 crc 
kubenswrapper[4743]: I1122 09:47:43.269406 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Nov 22 09:47:43 crc kubenswrapper[4743]: I1122 09:47:43.270401 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 22 09:47:43 crc kubenswrapper[4743]: I1122 09:47:43.277299 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 22 09:47:43 crc kubenswrapper[4743]: I1122 09:47:43.331258 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms55c\" (UniqueName: \"kubernetes.io/projected/1d58b76d-95be-4fc1-8c64-4c3f8d4875c0-kube-api-access-ms55c\") pod \"mariadb-client\" (UID: \"1d58b76d-95be-4fc1-8c64-4c3f8d4875c0\") " pod="openstack/mariadb-client" Nov 22 09:47:43 crc kubenswrapper[4743]: I1122 09:47:43.432668 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms55c\" (UniqueName: \"kubernetes.io/projected/1d58b76d-95be-4fc1-8c64-4c3f8d4875c0-kube-api-access-ms55c\") pod \"mariadb-client\" (UID: \"1d58b76d-95be-4fc1-8c64-4c3f8d4875c0\") " pod="openstack/mariadb-client" Nov 22 09:47:43 crc kubenswrapper[4743]: I1122 09:47:43.455629 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms55c\" (UniqueName: \"kubernetes.io/projected/1d58b76d-95be-4fc1-8c64-4c3f8d4875c0-kube-api-access-ms55c\") pod \"mariadb-client\" (UID: \"1d58b76d-95be-4fc1-8c64-4c3f8d4875c0\") " pod="openstack/mariadb-client" Nov 22 09:47:43 crc kubenswrapper[4743]: I1122 09:47:43.586419 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 22 09:47:44 crc kubenswrapper[4743]: I1122 09:47:44.066722 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 22 09:47:44 crc kubenswrapper[4743]: I1122 09:47:44.507939 4743 generic.go:334] "Generic (PLEG): container finished" podID="1d58b76d-95be-4fc1-8c64-4c3f8d4875c0" containerID="bfd35d76be549bdae885bf19af23e2b8e1ae9823e7f17c923a7be247870e26f7" exitCode=0 Nov 22 09:47:44 crc kubenswrapper[4743]: I1122 09:47:44.508273 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"1d58b76d-95be-4fc1-8c64-4c3f8d4875c0","Type":"ContainerDied","Data":"bfd35d76be549bdae885bf19af23e2b8e1ae9823e7f17c923a7be247870e26f7"} Nov 22 09:47:44 crc kubenswrapper[4743]: I1122 09:47:44.508310 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"1d58b76d-95be-4fc1-8c64-4c3f8d4875c0","Type":"ContainerStarted","Data":"03e8a79c2380448a5da0b0d4e18a3e161eafa0764fcbd6c527b6cfbd72e454ba"} Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.811974 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.830745 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_1d58b76d-95be-4fc1-8c64-4c3f8d4875c0/mariadb-client/0.log" Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.859669 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.863506 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.873551 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms55c\" (UniqueName: \"kubernetes.io/projected/1d58b76d-95be-4fc1-8c64-4c3f8d4875c0-kube-api-access-ms55c\") pod \"1d58b76d-95be-4fc1-8c64-4c3f8d4875c0\" (UID: \"1d58b76d-95be-4fc1-8c64-4c3f8d4875c0\") " Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.879697 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d58b76d-95be-4fc1-8c64-4c3f8d4875c0-kube-api-access-ms55c" (OuterVolumeSpecName: "kube-api-access-ms55c") pod "1d58b76d-95be-4fc1-8c64-4c3f8d4875c0" (UID: "1d58b76d-95be-4fc1-8c64-4c3f8d4875c0"). InnerVolumeSpecName "kube-api-access-ms55c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.975470 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms55c\" (UniqueName: \"kubernetes.io/projected/1d58b76d-95be-4fc1-8c64-4c3f8d4875c0-kube-api-access-ms55c\") on node \"crc\" DevicePath \"\"" Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.981732 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dzn5z"] Nov 22 09:47:45 crc kubenswrapper[4743]: E1122 09:47:45.982116 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d58b76d-95be-4fc1-8c64-4c3f8d4875c0" containerName="mariadb-client" Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.982136 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d58b76d-95be-4fc1-8c64-4c3f8d4875c0" containerName="mariadb-client" Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.982332 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d58b76d-95be-4fc1-8c64-4c3f8d4875c0" containerName="mariadb-client" Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.983739 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:45 crc kubenswrapper[4743]: I1122 09:47:45.998441 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzn5z"] Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.052804 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.054195 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.064476 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.076705 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-catalog-content\") pod \"redhat-operators-dzn5z\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.076953 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-utilities\") pod \"redhat-operators-dzn5z\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.077085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mr28\" (UniqueName: \"kubernetes.io/projected/d2282a55-171b-4836-ade6-5064b814816e-kube-api-access-9mr28\") pod \"redhat-operators-dzn5z\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.178432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-utilities\") pod \"redhat-operators-dzn5z\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.178502 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mr28\" (UniqueName: \"kubernetes.io/projected/d2282a55-171b-4836-ade6-5064b814816e-kube-api-access-9mr28\") pod \"redhat-operators-dzn5z\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.178542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-catalog-content\") pod \"redhat-operators-dzn5z\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.178566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxmsj\" (UniqueName: \"kubernetes.io/projected/4a0e1faf-eddd-41d3-944c-85045c5050d4-kube-api-access-lxmsj\") pod \"mariadb-client\" (UID: \"4a0e1faf-eddd-41d3-944c-85045c5050d4\") " pod="openstack/mariadb-client" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.179034 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-utilities\") pod \"redhat-operators-dzn5z\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.179086 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-catalog-content\") pod \"redhat-operators-dzn5z\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.203564 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mr28\" (UniqueName: \"kubernetes.io/projected/d2282a55-171b-4836-ade6-5064b814816e-kube-api-access-9mr28\") pod \"redhat-operators-dzn5z\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.279631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxmsj\" (UniqueName: \"kubernetes.io/projected/4a0e1faf-eddd-41d3-944c-85045c5050d4-kube-api-access-lxmsj\") pod \"mariadb-client\" (UID: \"4a0e1faf-eddd-41d3-944c-85045c5050d4\") " pod="openstack/mariadb-client" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.301419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxmsj\" (UniqueName: \"kubernetes.io/projected/4a0e1faf-eddd-41d3-944c-85045c5050d4-kube-api-access-lxmsj\") pod \"mariadb-client\" (UID: \"4a0e1faf-eddd-41d3-944c-85045c5050d4\") " pod="openstack/mariadb-client" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.312238 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.372714 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.544385 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03e8a79c2380448a5da0b0d4e18a3e161eafa0764fcbd6c527b6cfbd72e454ba" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.544693 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.566205 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="1d58b76d-95be-4fc1-8c64-4c3f8d4875c0" podUID="4a0e1faf-eddd-41d3-944c-85045c5050d4" Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.610257 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Nov 22 09:47:46 crc kubenswrapper[4743]: W1122 09:47:46.613097 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a0e1faf_eddd_41d3_944c_85045c5050d4.slice/crio-a07431d5c1c26b47c17d62df662f17bc4709b318a530b6750a9c61a1358bf909 WatchSource:0}: Error finding container a07431d5c1c26b47c17d62df662f17bc4709b318a530b6750a9c61a1358bf909: Status 404 returned error can't find the container with id a07431d5c1c26b47c17d62df662f17bc4709b318a530b6750a9c61a1358bf909 Nov 22 09:47:46 crc kubenswrapper[4743]: I1122 09:47:46.864097 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dzn5z"] Nov 22 09:47:46 crc kubenswrapper[4743]: W1122 09:47:46.884202 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2282a55_171b_4836_ade6_5064b814816e.slice/crio-78ed7cce5c87d24fab45e7307cc08c229c07077d42eebf5581f338eebb8ea272 WatchSource:0}: Error finding container 78ed7cce5c87d24fab45e7307cc08c229c07077d42eebf5581f338eebb8ea272: Status 404 returned error can't find the container with id 78ed7cce5c87d24fab45e7307cc08c229c07077d42eebf5581f338eebb8ea272 Nov 22 09:47:47 crc kubenswrapper[4743]: I1122 09:47:47.159411 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d58b76d-95be-4fc1-8c64-4c3f8d4875c0" path="/var/lib/kubelet/pods/1d58b76d-95be-4fc1-8c64-4c3f8d4875c0/volumes" Nov 22 09:47:47 crc kubenswrapper[4743]: I1122 09:47:47.551644 4743 generic.go:334] "Generic (PLEG): container finished" podID="4a0e1faf-eddd-41d3-944c-85045c5050d4" containerID="86cd7b2a602c09641dd742a88ff32e784a5edc747b70dfe69a8d078bf4cf0b75" exitCode=0 Nov 22 09:47:47 crc kubenswrapper[4743]: I1122 09:47:47.551702 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4a0e1faf-eddd-41d3-944c-85045c5050d4","Type":"ContainerDied","Data":"86cd7b2a602c09641dd742a88ff32e784a5edc747b70dfe69a8d078bf4cf0b75"} Nov 22 09:47:47 crc kubenswrapper[4743]: I1122 09:47:47.551728 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"4a0e1faf-eddd-41d3-944c-85045c5050d4","Type":"ContainerStarted","Data":"a07431d5c1c26b47c17d62df662f17bc4709b318a530b6750a9c61a1358bf909"} Nov 22 09:47:47 crc kubenswrapper[4743]: I1122 09:47:47.553314 4743 generic.go:334] "Generic (PLEG): container finished" podID="d2282a55-171b-4836-ade6-5064b814816e" containerID="89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270" exitCode=0 Nov 22 09:47:47 crc kubenswrapper[4743]: I1122 09:47:47.553342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzn5z" event={"ID":"d2282a55-171b-4836-ade6-5064b814816e","Type":"ContainerDied","Data":"89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270"} Nov 22 09:47:47 crc kubenswrapper[4743]: I1122 09:47:47.553358 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dzn5z" event={"ID":"d2282a55-171b-4836-ade6-5064b814816e","Type":"ContainerStarted","Data":"78ed7cce5c87d24fab45e7307cc08c229c07077d42eebf5581f338eebb8ea272"} Nov 22 09:47:47 crc kubenswrapper[4743]: I1122 09:47:47.555479 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:47:48 crc kubenswrapper[4743]: I1122 09:47:48.564223 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzn5z" event={"ID":"d2282a55-171b-4836-ade6-5064b814816e","Type":"ContainerStarted","Data":"4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a"} Nov 22 09:47:48 crc kubenswrapper[4743]: I1122 09:47:48.915163 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Nov 22 09:47:48 crc kubenswrapper[4743]: I1122 09:47:48.934740 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_4a0e1faf-eddd-41d3-944c-85045c5050d4/mariadb-client/0.log" Nov 22 09:47:48 crc kubenswrapper[4743]: I1122 09:47:48.961009 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Nov 22 09:47:48 crc kubenswrapper[4743]: I1122 09:47:48.966657 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Nov 22 09:47:49 crc kubenswrapper[4743]: I1122 09:47:49.022022 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxmsj\" (UniqueName: \"kubernetes.io/projected/4a0e1faf-eddd-41d3-944c-85045c5050d4-kube-api-access-lxmsj\") pod \"4a0e1faf-eddd-41d3-944c-85045c5050d4\" (UID: \"4a0e1faf-eddd-41d3-944c-85045c5050d4\") " Nov 22 09:47:49 crc kubenswrapper[4743]: I1122 09:47:49.041423 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a0e1faf-eddd-41d3-944c-85045c5050d4-kube-api-access-lxmsj" (OuterVolumeSpecName: "kube-api-access-lxmsj") pod "4a0e1faf-eddd-41d3-944c-85045c5050d4" (UID: "4a0e1faf-eddd-41d3-944c-85045c5050d4"). InnerVolumeSpecName "kube-api-access-lxmsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:47:49 crc kubenswrapper[4743]: I1122 09:47:49.123631 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxmsj\" (UniqueName: \"kubernetes.io/projected/4a0e1faf-eddd-41d3-944c-85045c5050d4-kube-api-access-lxmsj\") on node \"crc\" DevicePath \"\"" Nov 22 09:47:49 crc kubenswrapper[4743]: I1122 09:47:49.160669 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a0e1faf-eddd-41d3-944c-85045c5050d4" path="/var/lib/kubelet/pods/4a0e1faf-eddd-41d3-944c-85045c5050d4/volumes" Nov 22 09:47:49 crc kubenswrapper[4743]: I1122 09:47:49.574488 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Nov 22 09:47:49 crc kubenswrapper[4743]: I1122 09:47:49.574498 4743 scope.go:117] "RemoveContainer" containerID="86cd7b2a602c09641dd742a88ff32e784a5edc747b70dfe69a8d078bf4cf0b75" Nov 22 09:47:49 crc kubenswrapper[4743]: I1122 09:47:49.578088 4743 generic.go:334] "Generic (PLEG): container finished" podID="d2282a55-171b-4836-ade6-5064b814816e" containerID="4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a" exitCode=0 Nov 22 09:47:49 crc kubenswrapper[4743]: I1122 09:47:49.578133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzn5z" event={"ID":"d2282a55-171b-4836-ade6-5064b814816e","Type":"ContainerDied","Data":"4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a"} Nov 22 09:47:50 crc kubenswrapper[4743]: I1122 09:47:50.588423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzn5z" event={"ID":"d2282a55-171b-4836-ade6-5064b814816e","Type":"ContainerStarted","Data":"7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5"} Nov 22 09:47:50 crc kubenswrapper[4743]: I1122 09:47:50.609149 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dzn5z" podStartSLOduration=3.175210464 podStartE2EDuration="5.609128699s" podCreationTimestamp="2025-11-22 09:47:45 +0000 UTC" firstStartedPulling="2025-11-22 09:47:47.555249025 +0000 UTC m=+5141.261610077" lastFinishedPulling="2025-11-22 09:47:49.98916726 +0000 UTC m=+5143.695528312" observedRunningTime="2025-11-22 09:47:50.606848303 +0000 UTC m=+5144.313209355" watchObservedRunningTime="2025-11-22 09:47:50.609128699 +0000 UTC m=+5144.315489751" Nov 22 09:47:53 crc kubenswrapper[4743]: I1122 09:47:53.151264 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:47:53 crc kubenswrapper[4743]: E1122 09:47:53.151686 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:47:56 crc kubenswrapper[4743]: I1122 09:47:56.312849 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:56 crc kubenswrapper[4743]: I1122 09:47:56.313238 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:56 crc kubenswrapper[4743]: I1122 09:47:56.357515 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:56 crc kubenswrapper[4743]: I1122 09:47:56.684114 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:56 crc kubenswrapper[4743]: I1122 09:47:56.744289 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzn5z"] Nov 22 09:47:58 crc kubenswrapper[4743]: I1122 09:47:58.650326 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dzn5z" 
podUID="d2282a55-171b-4836-ade6-5064b814816e" containerName="registry-server" containerID="cri-o://7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5" gracePeriod=2 Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.078184 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.133893 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-utilities\") pod \"d2282a55-171b-4836-ade6-5064b814816e\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.134201 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mr28\" (UniqueName: \"kubernetes.io/projected/d2282a55-171b-4836-ade6-5064b814816e-kube-api-access-9mr28\") pod \"d2282a55-171b-4836-ade6-5064b814816e\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.134488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-catalog-content\") pod \"d2282a55-171b-4836-ade6-5064b814816e\" (UID: \"d2282a55-171b-4836-ade6-5064b814816e\") " Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.134981 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-utilities" (OuterVolumeSpecName: "utilities") pod "d2282a55-171b-4836-ade6-5064b814816e" (UID: "d2282a55-171b-4836-ade6-5064b814816e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.140253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2282a55-171b-4836-ade6-5064b814816e-kube-api-access-9mr28" (OuterVolumeSpecName: "kube-api-access-9mr28") pod "d2282a55-171b-4836-ade6-5064b814816e" (UID: "d2282a55-171b-4836-ade6-5064b814816e"). InnerVolumeSpecName "kube-api-access-9mr28". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.236643 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.236686 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mr28\" (UniqueName: \"kubernetes.io/projected/d2282a55-171b-4836-ade6-5064b814816e-kube-api-access-9mr28\") on node \"crc\" DevicePath \"\"" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.658500 4743 generic.go:334] "Generic (PLEG): container finished" podID="d2282a55-171b-4836-ade6-5064b814816e" containerID="7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5" exitCode=0 Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.658553 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzn5z" event={"ID":"d2282a55-171b-4836-ade6-5064b814816e","Type":"ContainerDied","Data":"7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5"} Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.658608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dzn5z" event={"ID":"d2282a55-171b-4836-ade6-5064b814816e","Type":"ContainerDied","Data":"78ed7cce5c87d24fab45e7307cc08c229c07077d42eebf5581f338eebb8ea272"} Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.658619 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dzn5z" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.658633 4743 scope.go:117] "RemoveContainer" containerID="7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.677078 4743 scope.go:117] "RemoveContainer" containerID="4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.694198 4743 scope.go:117] "RemoveContainer" containerID="89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.727436 4743 scope.go:117] "RemoveContainer" containerID="7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5" Nov 22 09:47:59 crc kubenswrapper[4743]: E1122 09:47:59.727856 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5\": container with ID starting with 7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5 not found: ID does not exist" containerID="7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.727886 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5"} err="failed to get container status \"7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5\": rpc error: code = NotFound desc = could not find container \"7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5\": container with ID starting with 7fc7aa18fd85c7275cab2315a47860867894a67da0e288ec3e6aa1058941e7f5 not found: ID does not exist" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.727906 4743 scope.go:117] 
"RemoveContainer" containerID="4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a" Nov 22 09:47:59 crc kubenswrapper[4743]: E1122 09:47:59.728160 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a\": container with ID starting with 4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a not found: ID does not exist" containerID="4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.728177 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a"} err="failed to get container status \"4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a\": rpc error: code = NotFound desc = could not find container \"4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a\": container with ID starting with 4d3580a5daa745ff138e62dff3336b439c52054e047896f5ed6a8541ad66b09a not found: ID does not exist" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.728190 4743 scope.go:117] "RemoveContainer" containerID="89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270" Nov 22 09:47:59 crc kubenswrapper[4743]: E1122 09:47:59.728393 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270\": container with ID starting with 89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270 not found: ID does not exist" containerID="89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270" Nov 22 09:47:59 crc kubenswrapper[4743]: I1122 09:47:59.728413 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270"} err="failed to get container status \"89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270\": rpc error: code = NotFound desc = could not find container \"89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270\": container with ID starting with 89b46886d96259e43366e27f3b1fcb9ad4010024d3b939d23d2a78eaaed9d270 not found: ID does not exist" Nov 22 09:48:00 crc kubenswrapper[4743]: I1122 09:48:00.214211 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2282a55-171b-4836-ade6-5064b814816e" (UID: "d2282a55-171b-4836-ade6-5064b814816e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:48:00 crc kubenswrapper[4743]: I1122 09:48:00.251507 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2282a55-171b-4836-ade6-5064b814816e-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:00 crc kubenswrapper[4743]: I1122 09:48:00.295989 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dzn5z"] Nov 22 09:48:00 crc kubenswrapper[4743]: I1122 09:48:00.303223 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dzn5z"] Nov 22 09:48:01 crc kubenswrapper[4743]: I1122 09:48:01.173414 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2282a55-171b-4836-ade6-5064b814816e" path="/var/lib/kubelet/pods/d2282a55-171b-4836-ade6-5064b814816e/volumes" Nov 22 09:48:07 crc kubenswrapper[4743]: I1122 09:48:07.157088 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:48:07 crc kubenswrapper[4743]: E1122 09:48:07.158186 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:48:18 crc kubenswrapper[4743]: I1122 09:48:18.152396 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:48:18 crc kubenswrapper[4743]: E1122 09:48:18.154132 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:48:20 crc kubenswrapper[4743]: E1122 09:48:20.718470 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.245:42014->38.102.83.245:33143: write tcp 38.102.83.245:42014->38.102.83.245:33143: write: broken pipe Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.220647 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 09:48:23 crc kubenswrapper[4743]: E1122 09:48:23.221712 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2282a55-171b-4836-ade6-5064b814816e" containerName="extract-utilities" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.221747 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2282a55-171b-4836-ade6-5064b814816e" containerName="extract-utilities" Nov 22 09:48:23 crc kubenswrapper[4743]: E1122 09:48:23.221781 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2282a55-171b-4836-ade6-5064b814816e" containerName="registry-server" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.221803 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2282a55-171b-4836-ade6-5064b814816e" containerName="registry-server" Nov 22 09:48:23 crc kubenswrapper[4743]: E1122 09:48:23.221851 4743 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4a0e1faf-eddd-41d3-944c-85045c5050d4" containerName="mariadb-client" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.221870 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a0e1faf-eddd-41d3-944c-85045c5050d4" containerName="mariadb-client" Nov 22 09:48:23 crc kubenswrapper[4743]: E1122 09:48:23.221923 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2282a55-171b-4836-ade6-5064b814816e" containerName="extract-content" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.221940 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2282a55-171b-4836-ade6-5064b814816e" containerName="extract-content" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.222323 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0e1faf-eddd-41d3-944c-85045c5050d4" containerName="mariadb-client" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.222377 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2282a55-171b-4836-ade6-5064b814816e" containerName="registry-server" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.224426 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.226831 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-xmxlf" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.227669 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.228686 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.231073 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.238861 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.240946 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.249187 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.251314 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.298237 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.330010 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.346821 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/207fcbef-06d2-4cd9-85d1-f6114591092f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.346913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d13534a-43da-4352-b61e-40779ab62237-config\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.346970 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc62c\" (UniqueName: \"kubernetes.io/projected/207fcbef-06d2-4cd9-85d1-f6114591092f-kube-api-access-lc62c\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.346995 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d13534a-43da-4352-b61e-40779ab62237-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347029 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c5e138b-6d40-45d7-b138-bf86c812bd0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c5e138b-6d40-45d7-b138-bf86c812bd0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cm45\" (UniqueName: \"kubernetes.io/projected/3d13534a-43da-4352-b61e-40779ab62237-kube-api-access-6cm45\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347337 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5e138b-6d40-45d7-b138-bf86c812bd0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"pvc-416cfd28-caa4-4d8a-87c5-27069461bd12\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-416cfd28-caa4-4d8a-87c5-27069461bd12\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347401 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9648fb58-d2ec-4217-b64e-64f01a904a01\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9648fb58-d2ec-4217-b64e-64f01a904a01\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347421 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25057d5c-8c6a-4893-90b1-29bffc162ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25057d5c-8c6a-4893-90b1-29bffc162ff7\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347446 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d13534a-43da-4352-b61e-40779ab62237-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347466 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c5e138b-6d40-45d7-b138-bf86c812bd0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347482 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/207fcbef-06d2-4cd9-85d1-f6114591092f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347553 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207fcbef-06d2-4cd9-85d1-f6114591092f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.347599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/207fcbef-06d2-4cd9-85d1-f6114591092f-config\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.348459 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp78l\" (UniqueName: \"kubernetes.io/projected/0c5e138b-6d40-45d7-b138-bf86c812bd0c-kube-api-access-sp78l\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.348549 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d13534a-43da-4352-b61e-40779ab62237-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.406125 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.409387 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.412191 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-f8v8s" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.412545 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.419043 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.421279 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.436921 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.438441 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.450979 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.451800 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c5e138b-6d40-45d7-b138-bf86c812bd0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.451876 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1d9ffdb2-0a23-4fca-9405-7152c8885c02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1d9ffdb2-0a23-4fca-9405-7152c8885c02\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.451925 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c5e138b-6d40-45d7-b138-bf86c812bd0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.451954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cm45\" (UniqueName: \"kubernetes.io/projected/3d13534a-43da-4352-b61e-40779ab62237-kube-api-access-6cm45\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.451988 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5e138b-6d40-45d7-b138-bf86c812bd0c-combined-ca-bundle\") pod 
\"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452026 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-416cfd28-caa4-4d8a-87c5-27069461bd12\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-416cfd28-caa4-4d8a-87c5-27069461bd12\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9648fb58-d2ec-4217-b64e-64f01a904a01\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9648fb58-d2ec-4217-b64e-64f01a904a01\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25057d5c-8c6a-4893-90b1-29bffc162ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25057d5c-8c6a-4893-90b1-29bffc162ff7\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452113 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d13534a-43da-4352-b61e-40779ab62237-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c5e138b-6d40-45d7-b138-bf86c812bd0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452156 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/207fcbef-06d2-4cd9-85d1-f6114591092f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452179 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ac2f2b-4109-4ed3-868d-ea3572055751-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81ac2f2b-4109-4ed3-868d-ea3572055751-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452243 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207fcbef-06d2-4cd9-85d1-f6114591092f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 
09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452268 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/207fcbef-06d2-4cd9-85d1-f6114591092f-config\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452298 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp78l\" (UniqueName: \"kubernetes.io/projected/0c5e138b-6d40-45d7-b138-bf86c812bd0c-kube-api-access-sp78l\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452323 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d13534a-43da-4352-b61e-40779ab62237-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452352 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/207fcbef-06d2-4cd9-85d1-f6114591092f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452374 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ac2f2b-4109-4ed3-868d-ea3572055751-config\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452398 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81ac2f2b-4109-4ed3-868d-ea3572055751-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452429 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs69f\" (UniqueName: \"kubernetes.io/projected/81ac2f2b-4109-4ed3-868d-ea3572055751-kube-api-access-vs69f\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d13534a-43da-4352-b61e-40779ab62237-config\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452481 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc62c\" (UniqueName: \"kubernetes.io/projected/207fcbef-06d2-4cd9-85d1-f6114591092f-kube-api-access-lc62c\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452506 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d13534a-43da-4352-b61e-40779ab62237-scripts\") pod 
\"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.452770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c5e138b-6d40-45d7-b138-bf86c812bd0c-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.453570 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.453789 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c5e138b-6d40-45d7-b138-bf86c812bd0c-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.454260 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3d13534a-43da-4352-b61e-40779ab62237-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.454289 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/207fcbef-06d2-4cd9-85d1-f6114591092f-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.455050 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d13534a-43da-4352-b61e-40779ab62237-config\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.455232 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/207fcbef-06d2-4cd9-85d1-f6114591092f-config\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.455238 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/207fcbef-06d2-4cd9-85d1-f6114591092f-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.455798 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d13534a-43da-4352-b61e-40779ab62237-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.456023 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.456127 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25057d5c-8c6a-4893-90b1-29bffc162ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25057d5c-8c6a-4893-90b1-29bffc162ff7\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/22d84d6cc5496ee59f33eeddf9da83fb099c2efe79e7df790ba4d91efda011d0/globalmount\"" pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.456592 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c5e138b-6d40-45d7-b138-bf86c812bd0c-config\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.465595 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/207fcbef-06d2-4cd9-85d1-f6114591092f-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.466568 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.467544 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-416cfd28-caa4-4d8a-87c5-27069461bd12\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-416cfd28-caa4-4d8a-87c5-27069461bd12\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/10fb76105df6a6b103db44a67e8748a15b45fb3a237fd888cc42510ce14eb594/globalmount\"" pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.466676 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.467848 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9648fb58-d2ec-4217-b64e-64f01a904a01\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9648fb58-d2ec-4217-b64e-64f01a904a01\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7226064d41e2bbf759b8a3593fddf2c630f6647b9aacb8014fdf02114ebe40b6/globalmount\"" pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.469428 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d13534a-43da-4352-b61e-40779ab62237-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.473388 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.478504 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c5e138b-6d40-45d7-b138-bf86c812bd0c-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.481496 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc62c\" (UniqueName: \"kubernetes.io/projected/207fcbef-06d2-4cd9-85d1-f6114591092f-kube-api-access-lc62c\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.484699 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cm45\" (UniqueName: \"kubernetes.io/projected/3d13534a-43da-4352-b61e-40779ab62237-kube-api-access-6cm45\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.491450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp78l\" (UniqueName: \"kubernetes.io/projected/0c5e138b-6d40-45d7-b138-bf86c812bd0c-kube-api-access-sp78l\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.499317 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.514882 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-416cfd28-caa4-4d8a-87c5-27069461bd12\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-416cfd28-caa4-4d8a-87c5-27069461bd12\") pod \"ovsdbserver-nb-1\" (UID: \"3d13534a-43da-4352-b61e-40779ab62237\") " pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.518365 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9648fb58-d2ec-4217-b64e-64f01a904a01\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9648fb58-d2ec-4217-b64e-64f01a904a01\") pod \"ovsdbserver-nb-0\" (UID: \"207fcbef-06d2-4cd9-85d1-f6114591092f\") " pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc 
kubenswrapper[4743]: I1122 09:48:23.518515 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25057d5c-8c6a-4893-90b1-29bffc162ff7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25057d5c-8c6a-4893-90b1-29bffc162ff7\") pod \"ovsdbserver-nb-2\" (UID: \"0c5e138b-6d40-45d7-b138-bf86c812bd0c\") " pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.553726 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1d9ffdb2-0a23-4fca-9405-7152c8885c02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1d9ffdb2-0a23-4fca-9405-7152c8885c02\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.553783 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2a8ce6c0-8ac9-44d3-a18f-7590893217bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a8ce6c0-8ac9-44d3-a18f-7590893217bb\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.554017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.554221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-config\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.554497 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ac2f2b-4109-4ed3-868d-ea3572055751-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.554538 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81ac2f2b-4109-4ed3-868d-ea3572055751-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.554656 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.554709 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.554752 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ac2f2b-4109-4ed3-868d-ea3572055751-config\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.554791 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81ac2f2b-4109-4ed3-868d-ea3572055751-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.554823 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vs69f\" (UniqueName: \"kubernetes.io/projected/81ac2f2b-4109-4ed3-868d-ea3572055751-kube-api-access-vs69f\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.554854 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bcb6\" (UniqueName: \"kubernetes.io/projected/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-kube-api-access-2bcb6\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.555284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/81ac2f2b-4109-4ed3-868d-ea3572055751-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.555855 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81ac2f2b-4109-4ed3-868d-ea3572055751-config\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.555904 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/81ac2f2b-4109-4ed3-868d-ea3572055751-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.556011 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.556073 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1d9ffdb2-0a23-4fca-9405-7152c8885c02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1d9ffdb2-0a23-4fca-9405-7152c8885c02\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8e5ee916633488df8ad320e7a811643d7b81a380edd6da03e419d3bba2b0c75e/globalmount\"" pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.562234 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.562509 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81ac2f2b-4109-4ed3-868d-ea3572055751-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.571910 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs69f\" (UniqueName: \"kubernetes.io/projected/81ac2f2b-4109-4ed3-868d-ea3572055751-kube-api-access-vs69f\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.585103 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.599607 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.607087 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1d9ffdb2-0a23-4fca-9405-7152c8885c02\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1d9ffdb2-0a23-4fca-9405-7152c8885c02\") pod \"ovsdbserver-sb-0\" (UID: \"81ac2f2b-4109-4ed3-868d-ea3572055751\") " pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.656273 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bcb6\" (UniqueName: \"kubernetes.io/projected/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-kube-api-access-2bcb6\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.656758 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-656078a7-cc16-4a40-b3ee-14a4a8747156\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-656078a7-cc16-4a40-b3ee-14a4a8747156\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.656817 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2a8ce6c0-8ac9-44d3-a18f-7590893217bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a8ce6c0-8ac9-44d3-a18f-7590893217bb\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.656848 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.656875 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-config\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 
09:48:23.656977 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.657011 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbxrf\" (UniqueName: \"kubernetes.io/projected/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-kube-api-access-qbxrf\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.657041 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-config\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.657078 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.657110 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.657146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.657173 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.659343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.659636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.659669 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-config\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " 
pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.660938 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.660968 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2a8ce6c0-8ac9-44d3-a18f-7590893217bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a8ce6c0-8ac9-44d3-a18f-7590893217bb\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3d6c417d4187be1eebc7e347c7afff1a20805fa814e57aee83e282a5a3d3bd78/globalmount\"" pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.662805 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.687012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bcb6\" (UniqueName: \"kubernetes.io/projected/64b18e0c-c33c-4f05-93e6-3b7ffc82e811-kube-api-access-2bcb6\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.706447 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2a8ce6c0-8ac9-44d3-a18f-7590893217bb\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2a8ce6c0-8ac9-44d3-a18f-7590893217bb\") pod \"ovsdbserver-sb-1\" (UID: \"64b18e0c-c33c-4f05-93e6-3b7ffc82e811\") " pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.743372 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.758377 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.758565 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-656078a7-cc16-4a40-b3ee-14a4a8747156\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-656078a7-cc16-4a40-b3ee-14a4a8747156\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.758812 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.758838 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-config\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.758911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbxrf\" (UniqueName: \"kubernetes.io/projected/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-kube-api-access-qbxrf\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.758948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.759039 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.760571 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.760910 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.760934 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-656078a7-cc16-4a40-b3ee-14a4a8747156\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-656078a7-cc16-4a40-b3ee-14a4a8747156\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ee98d03ff28a2f681dc667a3eb10fcace487c6db07ad47f6da7ef5c2ecc91ee1/globalmount\"" pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.763607 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-config\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.764775 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.776851 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbxrf\" (UniqueName: \"kubernetes.io/projected/2acf2bf5-0ed1-4513-ba48-a5e7a63a6002-kube-api-access-qbxrf\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.798108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-656078a7-cc16-4a40-b3ee-14a4a8747156\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-656078a7-cc16-4a40-b3ee-14a4a8747156\") pod \"ovsdbserver-sb-2\" (UID: \"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002\") " pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:23 crc kubenswrapper[4743]: I1122 09:48:23.929327 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.064214 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.156617 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.304002 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.420447 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Nov 22 09:48:24 crc kubenswrapper[4743]: W1122 09:48:24.428143 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64b18e0c_c33c_4f05_93e6_3b7ffc82e811.slice/crio-4730b26d27e14bccda87388c39da291ed88c54cd3ed8cf8267db480cb3964146 WatchSource:0}: Error finding container 4730b26d27e14bccda87388c39da291ed88c54cd3ed8cf8267db480cb3964146: Status 404 returned error can't find the container with id 4730b26d27e14bccda87388c39da291ed88c54cd3ed8cf8267db480cb3964146 Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.681535 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Nov 22 09:48:24 crc kubenswrapper[4743]: W1122 09:48:24.690726 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2acf2bf5_0ed1_4513_ba48_a5e7a63a6002.slice/crio-e58701dd10063d115d31728c58638c597a50ab11229a6d58047f18b3ce6b1c37 WatchSource:0}: Error finding container e58701dd10063d115d31728c58638c597a50ab11229a6d58047f18b3ce6b1c37: Status 404 returned error can't find the container with id e58701dd10063d115d31728c58638c597a50ab11229a6d58047f18b3ce6b1c37 Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.844246 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Nov 22 09:48:24 crc kubenswrapper[4743]: W1122 09:48:24.848026 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81ac2f2b_4109_4ed3_868d_ea3572055751.slice/crio-621ff012493113655758a95d385c5f93cccf4086ce9058d10ecaacf17c4f9b9a WatchSource:0}: Error finding container 621ff012493113655758a95d385c5f93cccf4086ce9058d10ecaacf17c4f9b9a: Status 404 returned error can't find the container with id 621ff012493113655758a95d385c5f93cccf4086ce9058d10ecaacf17c4f9b9a Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.884991 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002","Type":"ContainerStarted","Data":"29e3ca41a1970a90fac64115f427f36fe7321fbdef0d2e455ceaf09e3fa8a837"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.885050 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002","Type":"ContainerStarted","Data":"e58701dd10063d115d31728c58638c597a50ab11229a6d58047f18b3ce6b1c37"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.893321 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"207fcbef-06d2-4cd9-85d1-f6114591092f","Type":"ContainerStarted","Data":"178d13e392813f67ac6976d862e91b6369414f4614d6fbd369cc19d30d3b9166"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.893364 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"207fcbef-06d2-4cd9-85d1-f6114591092f","Type":"ContainerStarted","Data":"2557d9d4a46a52f75ea0d8fbd2ac9a398a8cf092b67ae31566d06a36fe21be06"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.893373 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"207fcbef-06d2-4cd9-85d1-f6114591092f","Type":"ContainerStarted","Data":"d383cc3abacc1fef635bf593d4a6f71555e5102f51ee9271af59446c22545780"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.894758 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3d13534a-43da-4352-b61e-40779ab62237","Type":"ContainerStarted","Data":"659ee7dc25f554b52fbdd362994f4100a22d19e43b6e8b00ef20ceec5960aff9"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.894785 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3d13534a-43da-4352-b61e-40779ab62237","Type":"ContainerStarted","Data":"b261932b9ac62f15753fa77654519640e5ccdc4cf3be955f11da70325373eb47"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.894796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"3d13534a-43da-4352-b61e-40779ab62237","Type":"ContainerStarted","Data":"2710df29f575de2b77b93d73c99bfd86bc99acdc2a6c37ce877992908e34f168"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.897366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"81ac2f2b-4109-4ed3-868d-ea3572055751","Type":"ContainerStarted","Data":"621ff012493113655758a95d385c5f93cccf4086ce9058d10ecaacf17c4f9b9a"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.901024 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"64b18e0c-c33c-4f05-93e6-3b7ffc82e811","Type":"ContainerStarted","Data":"5d3c7dceede156d762c2519cc5bc14417ceebb4c5d5e6bcb98e1f5724b4ed04d"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.901055 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"64b18e0c-c33c-4f05-93e6-3b7ffc82e811","Type":"ContainerStarted","Data":"d2448834d72332642f462625a566a9643ee6f782a74e9390a6f86617d94abb38"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.901067 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"64b18e0c-c33c-4f05-93e6-3b7ffc82e811","Type":"ContainerStarted","Data":"4730b26d27e14bccda87388c39da291ed88c54cd3ed8cf8267db480cb3964146"} Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.934840 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=2.934815933 podStartE2EDuration="2.934815933s" podCreationTimestamp="2025-11-22 09:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:48:24.913312205 +0000 UTC m=+5178.619673287" watchObservedRunningTime="2025-11-22 09:48:24.934815933 +0000 UTC m=+5178.641176985" Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.935284 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=2.935277476 podStartE2EDuration="2.935277476s" podCreationTimestamp="2025-11-22 09:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-22 09:48:24.927399109 +0000 UTC m=+5178.633760161" watchObservedRunningTime="2025-11-22 09:48:24.935277476 +0000 UTC m=+5178.641638528" Nov 22 09:48:24 crc kubenswrapper[4743]: I1122 09:48:24.948161 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=2.9481396159999997 podStartE2EDuration="2.948139616s" podCreationTimestamp="2025-11-22 09:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:48:24.943311347 +0000 UTC m=+5178.649672399" watchObservedRunningTime="2025-11-22 09:48:24.948139616 +0000 UTC m=+5178.654500668" Nov 22 09:48:25 crc kubenswrapper[4743]: I1122 09:48:25.012273 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Nov 22 09:48:25 crc kubenswrapper[4743]: W1122 09:48:25.026274 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c5e138b_6d40_45d7_b138_bf86c812bd0c.slice/crio-c801328e2b2afdeec2946c2f5f6ce1d702c6730cc4f5935fe62baf8abb6d7ff9 WatchSource:0}: Error finding container c801328e2b2afdeec2946c2f5f6ce1d702c6730cc4f5935fe62baf8abb6d7ff9: Status 404 returned error can't find the container with id c801328e2b2afdeec2946c2f5f6ce1d702c6730cc4f5935fe62baf8abb6d7ff9 Nov 22 09:48:25 crc kubenswrapper[4743]: I1122 09:48:25.912321 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"81ac2f2b-4109-4ed3-868d-ea3572055751","Type":"ContainerStarted","Data":"0fce8dca912a87209ddd08a7a16621e6e00486d1313f1a0b66147063b6eb2e27"} Nov 22 09:48:25 crc kubenswrapper[4743]: I1122 09:48:25.912686 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"81ac2f2b-4109-4ed3-868d-ea3572055751","Type":"ContainerStarted","Data":"bb057d68848c904f376f996710443cbaaed7fa58d96d668160b75a8b28d33ee0"} Nov 22 09:48:25 crc kubenswrapper[4743]: I1122 09:48:25.914206 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0c5e138b-6d40-45d7-b138-bf86c812bd0c","Type":"ContainerStarted","Data":"fba616a40928926a7416a7d8804d0c12d6d5ab3488ba20fc44655086e38c82c6"} Nov 22 09:48:25 crc kubenswrapper[4743]: I1122 09:48:25.914244 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0c5e138b-6d40-45d7-b138-bf86c812bd0c","Type":"ContainerStarted","Data":"431dc7e3462c7b02732c394af006bbab84c6d59f2810aa1087dc4b384ccc602e"} Nov 22 09:48:25 crc kubenswrapper[4743]: I1122 09:48:25.914255 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0c5e138b-6d40-45d7-b138-bf86c812bd0c","Type":"ContainerStarted","Data":"c801328e2b2afdeec2946c2f5f6ce1d702c6730cc4f5935fe62baf8abb6d7ff9"} Nov 22 09:48:25 crc kubenswrapper[4743]: I1122 09:48:25.918755 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"2acf2bf5-0ed1-4513-ba48-a5e7a63a6002","Type":"ContainerStarted","Data":"a8369068e12ca87bed119ff7bca2cdc78ba69610eb55b2e6abc1be6a22160c79"} Nov 22 09:48:25 crc kubenswrapper[4743]: I1122 09:48:25.938518 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.93849473 podStartE2EDuration="3.93849473s" podCreationTimestamp="2025-11-22 09:48:22 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:48:25.936544884 +0000 UTC m=+5179.642905966" watchObservedRunningTime="2025-11-22 09:48:25.93849473 +0000 UTC m=+5179.644855792" Nov 22 09:48:25 crc kubenswrapper[4743]: I1122 09:48:25.957162 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.9571432460000002 podStartE2EDuration="3.957143246s" podCreationTimestamp="2025-11-22 09:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:48:25.956357524 +0000 UTC m=+5179.662718636" watchObservedRunningTime="2025-11-22 09:48:25.957143246 +0000 UTC m=+5179.663504298" Nov 22 09:48:25 crc kubenswrapper[4743]: I1122 09:48:25.985300 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.985281645 podStartE2EDuration="3.985281645s" podCreationTimestamp="2025-11-22 09:48:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:48:25.982883586 +0000 UTC m=+5179.689244658" watchObservedRunningTime="2025-11-22 09:48:25.985281645 +0000 UTC m=+5179.691642697" Nov 22 09:48:26 crc kubenswrapper[4743]: I1122 09:48:26.563501 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:26 crc kubenswrapper[4743]: I1122 09:48:26.585621 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:26 crc kubenswrapper[4743]: I1122 09:48:26.599798 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:26 crc kubenswrapper[4743]: I1122 09:48:26.744420 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:26 crc kubenswrapper[4743]: I1122 09:48:26.930478 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:27 crc kubenswrapper[4743]: I1122 09:48:27.065790 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:27 crc kubenswrapper[4743]: I1122 09:48:27.106306 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:27 crc kubenswrapper[4743]: I1122 09:48:27.942393 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:28 crc kubenswrapper[4743]: I1122 09:48:28.563695 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:28 crc kubenswrapper[4743]: I1122 09:48:28.586230 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:28 crc kubenswrapper[4743]: I1122 09:48:28.600548 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:28 crc kubenswrapper[4743]: I1122 09:48:28.744528 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:28 crc kubenswrapper[4743]: I1122 09:48:28.929553 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.135898 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.445315 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cd49575f7-9gxm7"] Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.447649 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.450318 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.452889 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cd49575f7-9gxm7"] Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.591988 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-ovsdbserver-sb\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.592031 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5z2l\" (UniqueName: \"kubernetes.io/projected/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-kube-api-access-n5z2l\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.592106 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-dns-svc\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.592143 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-config\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.612996 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.644709 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.651133 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.660897 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.692164 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.693261 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-ovsdbserver-sb\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.693319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5z2l\" (UniqueName: \"kubernetes.io/projected/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-kube-api-access-n5z2l\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.694097 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-dns-svc\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.694233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-config\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.694368 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-ovsdbserver-sb\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.695092 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-dns-svc\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.695138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-config\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.719772 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5z2l\" (UniqueName: \"kubernetes.io/projected/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-kube-api-access-n5z2l\") pod \"dnsmasq-dns-7cd49575f7-9gxm7\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.780653 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.789167 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:29 crc kubenswrapper[4743]: I1122 09:48:29.961858 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cd49575f7-9gxm7"] Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.016982 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.018321 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864bc46885-j5859"] Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.023406 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.041956 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.061389 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864bc46885-j5859"] Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.086785 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.090039 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.110948 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzkxp\" (UniqueName: \"kubernetes.io/projected/035ba0cf-ceee-4cf3-8628-652a1bf5975f-kube-api-access-wzkxp\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.111119 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-config\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.111144 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-dns-svc\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.111172 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-nb\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.111204 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-sb\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: 
\"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.160117 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:48:30 crc kubenswrapper[4743]: E1122 09:48:30.165078 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.213020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-sb\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.213455 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzkxp\" (UniqueName: \"kubernetes.io/projected/035ba0cf-ceee-4cf3-8628-652a1bf5975f-kube-api-access-wzkxp\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.213684 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-config\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.213715 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-dns-svc\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.213772 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-sb\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.213789 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-nb\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.214429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-config\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.214695 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-nb\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.215120 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-dns-svc\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.257449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzkxp\" (UniqueName: \"kubernetes.io/projected/035ba0cf-ceee-4cf3-8628-652a1bf5975f-kube-api-access-wzkxp\") pod \"dnsmasq-dns-864bc46885-j5859\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.346269 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cd49575f7-9gxm7"] Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.367026 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.656217 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gbjtn"] Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.658718 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.678670 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbjtn"] Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.720650 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-catalog-content\") pod \"redhat-marketplace-gbjtn\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.721038 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-utilities\") pod \"redhat-marketplace-gbjtn\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.721222 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f672\" (UniqueName: \"kubernetes.io/projected/4876708f-7c5a-41dc-85f2-55d1980ff8ed-kube-api-access-6f672\") pod \"redhat-marketplace-gbjtn\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: E1122 09:48:30.740640 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99e31d12_1ec3_42d3_a9fd_4f95f15dc51c.slice/crio-2bb501632108e45c8b4afa9fc15918389ea27079c8e68a3dec89a2100e456123.scope\": RecentStats: unable to find data in 
memory cache]" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.822657 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-catalog-content\") pod \"redhat-marketplace-gbjtn\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.822778 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-utilities\") pod \"redhat-marketplace-gbjtn\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.823373 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-catalog-content\") pod \"redhat-marketplace-gbjtn\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.823384 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-utilities\") pod \"redhat-marketplace-gbjtn\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.822850 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f672\" (UniqueName: \"kubernetes.io/projected/4876708f-7c5a-41dc-85f2-55d1980ff8ed-kube-api-access-6f672\") pod \"redhat-marketplace-gbjtn\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.840655 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f672\" (UniqueName: \"kubernetes.io/projected/4876708f-7c5a-41dc-85f2-55d1980ff8ed-kube-api-access-6f672\") pod \"redhat-marketplace-gbjtn\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.840980 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864bc46885-j5859"] Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.979637 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.982679 4743 generic.go:334] "Generic (PLEG): container finished" podID="99e31d12-1ec3-42d3-a9fd-4f95f15dc51c" containerID="2bb501632108e45c8b4afa9fc15918389ea27079c8e68a3dec89a2100e456123" exitCode=0 Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.983188 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" event={"ID":"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c","Type":"ContainerDied","Data":"2bb501632108e45c8b4afa9fc15918389ea27079c8e68a3dec89a2100e456123"} Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.983224 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" event={"ID":"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c","Type":"ContainerStarted","Data":"d07a6290562458a5cbafebce5642546cffaf1a444039687d9cbe8e01d8d642a6"} Nov 22 09:48:30 crc kubenswrapper[4743]: I1122 09:48:30.984365 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864bc46885-j5859" event={"ID":"035ba0cf-ceee-4cf3-8628-652a1bf5975f","Type":"ContainerStarted","Data":"1ddacb0c224f6ce10e8f764a9ff9740094f8d63558d402b52367caf8a488ccfe"} Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.286137 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.331967 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-ovsdbserver-sb\") pod \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.332043 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5z2l\" (UniqueName: \"kubernetes.io/projected/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-kube-api-access-n5z2l\") pod \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.332244 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-config\") pod \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.332268 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-dns-svc\") pod \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\" (UID: \"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c\") " Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.335974 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-kube-api-access-n5z2l" (OuterVolumeSpecName: "kube-api-access-n5z2l") pod "99e31d12-1ec3-42d3-a9fd-4f95f15dc51c" (UID: "99e31d12-1ec3-42d3-a9fd-4f95f15dc51c"). InnerVolumeSpecName "kube-api-access-n5z2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.349597 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-config" (OuterVolumeSpecName: "config") pod "99e31d12-1ec3-42d3-a9fd-4f95f15dc51c" (UID: "99e31d12-1ec3-42d3-a9fd-4f95f15dc51c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.349609 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99e31d12-1ec3-42d3-a9fd-4f95f15dc51c" (UID: "99e31d12-1ec3-42d3-a9fd-4f95f15dc51c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.351366 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99e31d12-1ec3-42d3-a9fd-4f95f15dc51c" (UID: "99e31d12-1ec3-42d3-a9fd-4f95f15dc51c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.434686 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.434736 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.434753 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.434766 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5z2l\" (UniqueName: \"kubernetes.io/projected/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c-kube-api-access-n5z2l\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:31 crc kubenswrapper[4743]: I1122 09:48:31.436008 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbjtn"] Nov 22 09:48:31 crc kubenswrapper[4743]: W1122 09:48:31.441907 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4876708f_7c5a_41dc_85f2_55d1980ff8ed.slice/crio-e695e367ee4a800383f1f4ed3c3a8128468ffe6584a0c734009124792c338a18 WatchSource:0}: Error finding container e695e367ee4a800383f1f4ed3c3a8128468ffe6584a0c734009124792c338a18: Status 404 returned error can't find the container with id e695e367ee4a800383f1f4ed3c3a8128468ffe6584a0c734009124792c338a18 Nov 22 09:48:32 crc kubenswrapper[4743]: I1122 09:48:31.999740 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" Nov 22 09:48:32 crc kubenswrapper[4743]: I1122 09:48:31.999756 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cd49575f7-9gxm7" event={"ID":"99e31d12-1ec3-42d3-a9fd-4f95f15dc51c","Type":"ContainerDied","Data":"d07a6290562458a5cbafebce5642546cffaf1a444039687d9cbe8e01d8d642a6"} Nov 22 09:48:32 crc kubenswrapper[4743]: I1122 09:48:32.000682 4743 scope.go:117] "RemoveContainer" containerID="2bb501632108e45c8b4afa9fc15918389ea27079c8e68a3dec89a2100e456123" Nov 22 09:48:32 crc kubenswrapper[4743]: I1122 09:48:32.003024 4743 generic.go:334] "Generic (PLEG): container finished" podID="035ba0cf-ceee-4cf3-8628-652a1bf5975f" containerID="b097dd51036660b0e306e9b435c8b0d3a8b6fa2623c942e3f8d62426107b4e45" exitCode=0 Nov 22 09:48:32 crc kubenswrapper[4743]: I1122 09:48:32.003126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864bc46885-j5859" event={"ID":"035ba0cf-ceee-4cf3-8628-652a1bf5975f","Type":"ContainerDied","Data":"b097dd51036660b0e306e9b435c8b0d3a8b6fa2623c942e3f8d62426107b4e45"} Nov 22 09:48:32 crc kubenswrapper[4743]: I1122 09:48:32.009034 4743 generic.go:334] "Generic (PLEG): container finished" podID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerID="ec9a199d36f033a8d787cf0f0a26aab8bddbdee0fdf71316f9d4b1e8bffcfdfd" exitCode=0 Nov 22 09:48:32 crc kubenswrapper[4743]: I1122 09:48:32.009218 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbjtn" event={"ID":"4876708f-7c5a-41dc-85f2-55d1980ff8ed","Type":"ContainerDied","Data":"ec9a199d36f033a8d787cf0f0a26aab8bddbdee0fdf71316f9d4b1e8bffcfdfd"} Nov 22 09:48:32 crc kubenswrapper[4743]: I1122 09:48:32.011186 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbjtn" event={"ID":"4876708f-7c5a-41dc-85f2-55d1980ff8ed","Type":"ContainerStarted","Data":"e695e367ee4a800383f1f4ed3c3a8128468ffe6584a0c734009124792c338a18"} Nov 22 09:48:32 crc kubenswrapper[4743]: I1122 09:48:32.201650 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cd49575f7-9gxm7"] Nov 22 09:48:32 crc kubenswrapper[4743]: I1122 09:48:32.211164 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cd49575f7-9gxm7"] Nov 22 09:48:33 crc kubenswrapper[4743]: I1122 09:48:33.028414 4743 generic.go:334] "Generic (PLEG): container finished" podID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerID="b3efe17b74488845bfa6a7b3e745348b8b8839bf6d0985283812194495bb5da9" exitCode=0 Nov 22 09:48:33 crc kubenswrapper[4743]: I1122 09:48:33.028491 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbjtn" event={"ID":"4876708f-7c5a-41dc-85f2-55d1980ff8ed","Type":"ContainerDied","Data":"b3efe17b74488845bfa6a7b3e745348b8b8839bf6d0985283812194495bb5da9"} Nov 22 09:48:33 crc kubenswrapper[4743]: I1122 09:48:33.032977 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864bc46885-j5859" event={"ID":"035ba0cf-ceee-4cf3-8628-652a1bf5975f","Type":"ContainerStarted","Data":"3ca9dfd9bbf7236a679c41a503118c33a01eed7e7d8f001202e04701cc88569c"} Nov 22 09:48:33 crc kubenswrapper[4743]: I1122 09:48:33.033537 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:33 crc kubenswrapper[4743]: I1122 09:48:33.087638 4743 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-864bc46885-j5859" podStartSLOduration=4.087602526 podStartE2EDuration="4.087602526s" podCreationTimestamp="2025-11-22 09:48:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:48:33.080501042 +0000 UTC m=+5186.786862124" watchObservedRunningTime="2025-11-22 09:48:33.087602526 +0000 UTC m=+5186.793963608" Nov 22 09:48:33 crc kubenswrapper[4743]: I1122 09:48:33.175050 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e31d12-1ec3-42d3-a9fd-4f95f15dc51c" path="/var/lib/kubelet/pods/99e31d12-1ec3-42d3-a9fd-4f95f15dc51c/volumes" Nov 22 09:48:33 crc kubenswrapper[4743]: I1122 09:48:33.655058 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Nov 22 09:48:34 crc kubenswrapper[4743]: I1122 09:48:34.043300 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbjtn" event={"ID":"4876708f-7c5a-41dc-85f2-55d1980ff8ed","Type":"ContainerStarted","Data":"6ede4f8b45589042f7402033ace8d1ee1cf95424c25a5104882c08d339cc274f"} Nov 22 09:48:34 crc kubenswrapper[4743]: I1122 09:48:34.066638 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gbjtn" podStartSLOduration=2.564378557 podStartE2EDuration="4.066621314s" podCreationTimestamp="2025-11-22 09:48:30 +0000 UTC" firstStartedPulling="2025-11-22 09:48:32.013269088 +0000 UTC m=+5185.719630180" lastFinishedPulling="2025-11-22 09:48:33.515511845 +0000 UTC m=+5187.221872937" observedRunningTime="2025-11-22 09:48:34.064678258 +0000 UTC m=+5187.771039310" watchObservedRunningTime="2025-11-22 09:48:34.066621314 +0000 UTC m=+5187.772982366" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.660632 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Nov 22 09:48:36 crc kubenswrapper[4743]: E1122 09:48:36.703289 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e31d12-1ec3-42d3-a9fd-4f95f15dc51c" containerName="init" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.703607 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e31d12-1ec3-42d3-a9fd-4f95f15dc51c" containerName="init" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.704187 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e31d12-1ec3-42d3-a9fd-4f95f15dc51c" containerName="init" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.705744 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.705900 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.710430 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.831081 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkzf5\" (UniqueName: \"kubernetes.io/projected/019b3b66-2805-4684-bf84-50705fbbdaf8-kube-api-access-mkzf5\") pod \"ovn-copy-data\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " pod="openstack/ovn-copy-data" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.831270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\") pod \"ovn-copy-data\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " pod="openstack/ovn-copy-data" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.831394 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/019b3b66-2805-4684-bf84-50705fbbdaf8-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " pod="openstack/ovn-copy-data" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.933334 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkzf5\" (UniqueName: \"kubernetes.io/projected/019b3b66-2805-4684-bf84-50705fbbdaf8-kube-api-access-mkzf5\") pod \"ovn-copy-data\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " pod="openstack/ovn-copy-data" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.933478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\") pod \"ovn-copy-data\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " pod="openstack/ovn-copy-data" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.933562 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/019b3b66-2805-4684-bf84-50705fbbdaf8-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " pod="openstack/ovn-copy-data" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.938776 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.938823 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\") pod \"ovn-copy-data\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/58749c2ea4e612b2fb3c03ef35cf64ae5d9753c90f99811c842f8136899f8a0b/globalmount\"" pod="openstack/ovn-copy-data" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.942126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/019b3b66-2805-4684-bf84-50705fbbdaf8-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " pod="openstack/ovn-copy-data" Nov 22 09:48:36 crc kubenswrapper[4743]: I1122 09:48:36.963133 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkzf5\" (UniqueName: \"kubernetes.io/projected/019b3b66-2805-4684-bf84-50705fbbdaf8-kube-api-access-mkzf5\") pod \"ovn-copy-data\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " pod="openstack/ovn-copy-data" Nov 22 09:48:37 crc kubenswrapper[4743]: I1122 09:48:37.001640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\") pod \"ovn-copy-data\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " pod="openstack/ovn-copy-data" Nov 22 09:48:37 crc kubenswrapper[4743]: I1122 09:48:37.027420 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Nov 22 09:48:37 crc kubenswrapper[4743]: W1122 09:48:37.576728 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod019b3b66_2805_4684_bf84_50705fbbdaf8.slice/crio-5ccb357b5cb4fdd99de7e29494807ac86ac1760203a634f9c8faba84f4ce2007 WatchSource:0}: Error finding container 5ccb357b5cb4fdd99de7e29494807ac86ac1760203a634f9c8faba84f4ce2007: Status 404 returned error can't find the container with id 5ccb357b5cb4fdd99de7e29494807ac86ac1760203a634f9c8faba84f4ce2007 Nov 22 09:48:37 crc kubenswrapper[4743]: I1122 09:48:37.578531 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Nov 22 09:48:38 crc kubenswrapper[4743]: I1122 09:48:38.081240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"019b3b66-2805-4684-bf84-50705fbbdaf8","Type":"ContainerStarted","Data":"4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c"} Nov 22 09:48:38 crc kubenswrapper[4743]: I1122 09:48:38.081722 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"019b3b66-2805-4684-bf84-50705fbbdaf8","Type":"ContainerStarted","Data":"5ccb357b5cb4fdd99de7e29494807ac86ac1760203a634f9c8faba84f4ce2007"} Nov 22 09:48:38 crc kubenswrapper[4743]: I1122 09:48:38.106520 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.106504917 podStartE2EDuration="3.106504917s" podCreationTimestamp="2025-11-22 09:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:48:38.102880793 +0000 UTC m=+5191.809241845" watchObservedRunningTime="2025-11-22 09:48:38.106504917 +0000 UTC m=+5191.812865969" Nov 22 09:48:40 crc kubenswrapper[4743]: I1122 09:48:40.368720 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:48:40 crc kubenswrapper[4743]: I1122 09:48:40.431565 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2wbsk"] Nov 22 09:48:40 crc kubenswrapper[4743]: I1122 09:48:40.431819 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" podUID="9c859fc4-c111-4a3b-aa17-5af4446f2edf" containerName="dnsmasq-dns" containerID="cri-o://a0f5020c6a50793b4fdd6b8fe473589a2cee580ea6d7cf060900bb51c6b827a1" gracePeriod=10 Nov 22 09:48:40 crc kubenswrapper[4743]: I1122 09:48:40.979996 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:40 crc kubenswrapper[4743]: I1122 09:48:40.980301 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.022698 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.111713 4743 generic.go:334] "Generic (PLEG): container finished" podID="9c859fc4-c111-4a3b-aa17-5af4446f2edf" containerID="a0f5020c6a50793b4fdd6b8fe473589a2cee580ea6d7cf060900bb51c6b827a1" exitCode=0 Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.111786 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" event={"ID":"9c859fc4-c111-4a3b-aa17-5af4446f2edf","Type":"ContainerDied","Data":"a0f5020c6a50793b4fdd6b8fe473589a2cee580ea6d7cf060900bb51c6b827a1"} Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.149557 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.258077 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbjtn"] Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.523525 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.656628 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwdxz\" (UniqueName: \"kubernetes.io/projected/9c859fc4-c111-4a3b-aa17-5af4446f2edf-kube-api-access-vwdxz\") pod \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.656741 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-dns-svc\") pod \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.656911 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-config\") pod \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\" (UID: \"9c859fc4-c111-4a3b-aa17-5af4446f2edf\") " Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.663437 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c859fc4-c111-4a3b-aa17-5af4446f2edf-kube-api-access-vwdxz" (OuterVolumeSpecName: "kube-api-access-vwdxz") pod "9c859fc4-c111-4a3b-aa17-5af4446f2edf" (UID: "9c859fc4-c111-4a3b-aa17-5af4446f2edf"). InnerVolumeSpecName "kube-api-access-vwdxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.700316 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c859fc4-c111-4a3b-aa17-5af4446f2edf" (UID: "9c859fc4-c111-4a3b-aa17-5af4446f2edf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.730833 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-config" (OuterVolumeSpecName: "config") pod "9c859fc4-c111-4a3b-aa17-5af4446f2edf" (UID: "9c859fc4-c111-4a3b-aa17-5af4446f2edf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.759183 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.759223 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwdxz\" (UniqueName: \"kubernetes.io/projected/9c859fc4-c111-4a3b-aa17-5af4446f2edf-kube-api-access-vwdxz\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:41 crc kubenswrapper[4743]: I1122 09:48:41.759238 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c859fc4-c111-4a3b-aa17-5af4446f2edf-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:42 crc kubenswrapper[4743]: I1122 09:48:42.122914 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" event={"ID":"9c859fc4-c111-4a3b-aa17-5af4446f2edf","Type":"ContainerDied","Data":"7338e2b87884e50253422f038ace19903e603f3268b4ef6b56424a636f34c908"} Nov 22 09:48:42 crc kubenswrapper[4743]: I1122 09:48:42.122952 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-2wbsk" Nov 22 09:48:42 crc kubenswrapper[4743]: I1122 09:48:42.122987 4743 scope.go:117] "RemoveContainer" containerID="a0f5020c6a50793b4fdd6b8fe473589a2cee580ea6d7cf060900bb51c6b827a1" Nov 22 09:48:42 crc kubenswrapper[4743]: I1122 09:48:42.142984 4743 scope.go:117] "RemoveContainer" containerID="8b50c90349ce49f9d440fd03409b7ee5970aa905ec157d175e6b9f08e2849888" Nov 22 09:48:42 crc kubenswrapper[4743]: I1122 09:48:42.162972 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2wbsk"] Nov 22 09:48:42 crc kubenswrapper[4743]: I1122 09:48:42.171851 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-2wbsk"] Nov 22 09:48:43 crc kubenswrapper[4743]: I1122 09:48:43.135678 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gbjtn" podUID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerName="registry-server" containerID="cri-o://6ede4f8b45589042f7402033ace8d1ee1cf95424c25a5104882c08d339cc274f" gracePeriod=2 Nov 22 09:48:43 crc kubenswrapper[4743]: I1122 09:48:43.166535 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c859fc4-c111-4a3b-aa17-5af4446f2edf" path="/var/lib/kubelet/pods/9c859fc4-c111-4a3b-aa17-5af4446f2edf/volumes" Nov 22 09:48:44 crc kubenswrapper[4743]: I1122 09:48:44.152290 4743 generic.go:334] "Generic (PLEG): container finished" podID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerID="6ede4f8b45589042f7402033ace8d1ee1cf95424c25a5104882c08d339cc274f" exitCode=0 Nov 22 09:48:44 crc kubenswrapper[4743]: I1122 09:48:44.152336 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbjtn" event={"ID":"4876708f-7c5a-41dc-85f2-55d1980ff8ed","Type":"ContainerDied","Data":"6ede4f8b45589042f7402033ace8d1ee1cf95424c25a5104882c08d339cc274f"} Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.151920 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:48:45 crc kubenswrapper[4743]: E1122 09:48:45.152626 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.512846 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.624181 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f672\" (UniqueName: \"kubernetes.io/projected/4876708f-7c5a-41dc-85f2-55d1980ff8ed-kube-api-access-6f672\") pod \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.624368 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-utilities\") pod \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.624407 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-catalog-content\") pod \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\" (UID: \"4876708f-7c5a-41dc-85f2-55d1980ff8ed\") " Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.625923 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-utilities" (OuterVolumeSpecName: "utilities") pod "4876708f-7c5a-41dc-85f2-55d1980ff8ed" (UID: "4876708f-7c5a-41dc-85f2-55d1980ff8ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.634889 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4876708f-7c5a-41dc-85f2-55d1980ff8ed-kube-api-access-6f672" (OuterVolumeSpecName: "kube-api-access-6f672") pod "4876708f-7c5a-41dc-85f2-55d1980ff8ed" (UID: "4876708f-7c5a-41dc-85f2-55d1980ff8ed"). InnerVolumeSpecName "kube-api-access-6f672". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.651970 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4876708f-7c5a-41dc-85f2-55d1980ff8ed" (UID: "4876708f-7c5a-41dc-85f2-55d1980ff8ed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.727382 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f672\" (UniqueName: \"kubernetes.io/projected/4876708f-7c5a-41dc-85f2-55d1980ff8ed-kube-api-access-6f672\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.727561 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:45 crc kubenswrapper[4743]: I1122 09:48:45.727625 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4876708f-7c5a-41dc-85f2-55d1980ff8ed-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.174202 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbjtn" event={"ID":"4876708f-7c5a-41dc-85f2-55d1980ff8ed","Type":"ContainerDied","Data":"e695e367ee4a800383f1f4ed3c3a8128468ffe6584a0c734009124792c338a18"} Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.174265 4743 scope.go:117] "RemoveContainer" containerID="6ede4f8b45589042f7402033ace8d1ee1cf95424c25a5104882c08d339cc274f" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.174398 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbjtn" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.219127 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbjtn"] Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.224565 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbjtn"] Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.226241 4743 scope.go:117] "RemoveContainer" containerID="b3efe17b74488845bfa6a7b3e745348b8b8839bf6d0985283812194495bb5da9" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.252862 4743 scope.go:117] "RemoveContainer" containerID="ec9a199d36f033a8d787cf0f0a26aab8bddbdee0fdf71316f9d4b1e8bffcfdfd" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.486877 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Nov 22 09:48:46 crc kubenswrapper[4743]: E1122 09:48:46.487643 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerName="extract-content" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.487657 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerName="extract-content" Nov 22 09:48:46 crc kubenswrapper[4743]: E1122 09:48:46.487683 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c859fc4-c111-4a3b-aa17-5af4446f2edf" containerName="dnsmasq-dns" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.487690 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c859fc4-c111-4a3b-aa17-5af4446f2edf" containerName="dnsmasq-dns" Nov 22 09:48:46 crc kubenswrapper[4743]: E1122 09:48:46.487705 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerName="registry-server" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.487713 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerName="registry-server" Nov 22 09:48:46 crc kubenswrapper[4743]: E1122 09:48:46.487732 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerName="extract-utilities" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.487758 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerName="extract-utilities" Nov 22 09:48:46 crc kubenswrapper[4743]: E1122 09:48:46.487785 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c859fc4-c111-4a3b-aa17-5af4446f2edf" containerName="init" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.487791 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c859fc4-c111-4a3b-aa17-5af4446f2edf" containerName="init" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.487989 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c859fc4-c111-4a3b-aa17-5af4446f2edf" containerName="dnsmasq-dns" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.488010 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" containerName="registry-server" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.494911 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.497749 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.497896 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.497979 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-r46bx" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.518971 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.641883 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-config\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.641948 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-scripts\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.641970 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.642216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwx8d\" (UniqueName: \"kubernetes.io/projected/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-kube-api-access-hwx8d\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " 
pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.642277 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.745206 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-config\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.745311 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-scripts\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.745365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.745466 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwx8d\" (UniqueName: \"kubernetes.io/projected/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-kube-api-access-hwx8d\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.745498 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.746557 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.747842 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-config\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.747779 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-scripts\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.753317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0" Nov 22 09:48:46 
crc kubenswrapper[4743]: I1122 09:48:46.763740 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwx8d\" (UniqueName: \"kubernetes.io/projected/dd13203f-a0d7-40f3-8e55-62f38fdc76fe-kube-api-access-hwx8d\") pod \"ovn-northd-0\" (UID: \"dd13203f-a0d7-40f3-8e55-62f38fdc76fe\") " pod="openstack/ovn-northd-0"
Nov 22 09:48:46 crc kubenswrapper[4743]: I1122 09:48:46.826301 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Nov 22 09:48:47 crc kubenswrapper[4743]: I1122 09:48:47.160977 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4876708f-7c5a-41dc-85f2-55d1980ff8ed" path="/var/lib/kubelet/pods/4876708f-7c5a-41dc-85f2-55d1980ff8ed/volumes"
Nov 22 09:48:47 crc kubenswrapper[4743]: I1122 09:48:47.258969 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Nov 22 09:48:47 crc kubenswrapper[4743]: W1122 09:48:47.260213 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd13203f_a0d7_40f3_8e55_62f38fdc76fe.slice/crio-4462fafc8056f2bce7a10f92eedb5581cdb5450b7e7cebd1111d6ba6560e8ae6 WatchSource:0}: Error finding container 4462fafc8056f2bce7a10f92eedb5581cdb5450b7e7cebd1111d6ba6560e8ae6: Status 404 returned error can't find the container with id 4462fafc8056f2bce7a10f92eedb5581cdb5450b7e7cebd1111d6ba6560e8ae6
Nov 22 09:48:48 crc kubenswrapper[4743]: I1122 09:48:48.199440 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dd13203f-a0d7-40f3-8e55-62f38fdc76fe","Type":"ContainerStarted","Data":"d2da87caf3902be1011400ba81abc09724a4db9cf7d99fde763f44cfc345e4b3"}
Nov 22 09:48:48 crc kubenswrapper[4743]: I1122 09:48:48.199871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dd13203f-a0d7-40f3-8e55-62f38fdc76fe","Type":"ContainerStarted","Data":"4462fafc8056f2bce7a10f92eedb5581cdb5450b7e7cebd1111d6ba6560e8ae6"}
Nov 22 09:48:49 crc kubenswrapper[4743]: I1122 09:48:49.212228 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dd13203f-a0d7-40f3-8e55-62f38fdc76fe","Type":"ContainerStarted","Data":"bf3f4f7610be721d0cc37db0fe9b9d400cb3025a554fcf24b24b05d33b289d87"}
Nov 22 09:48:49 crc kubenswrapper[4743]: I1122 09:48:49.212665 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Nov 22 09:48:49 crc kubenswrapper[4743]: I1122 09:48:49.234900 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.234875864 podStartE2EDuration="3.234875864s" podCreationTimestamp="2025-11-22 09:48:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:48:49.232757133 +0000 UTC m=+5202.939118225" watchObservedRunningTime="2025-11-22 09:48:49.234875864 +0000 UTC m=+5202.941236956"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.194538 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-796kq"]
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.196484 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-796kq"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.205266 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-796kq"]
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.244247 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvn5h\" (UniqueName: \"kubernetes.io/projected/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-kube-api-access-bvn5h\") pod \"keystone-db-create-796kq\" (UID: \"329ae2e9-ea30-483f-ab8d-c35659e1fc6d\") " pod="openstack/keystone-db-create-796kq"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.244431 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-operator-scripts\") pod \"keystone-db-create-796kq\" (UID: \"329ae2e9-ea30-483f-ab8d-c35659e1fc6d\") " pod="openstack/keystone-db-create-796kq"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.284356 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2ac5-account-create-9fhb7"]
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.285765 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ac5-account-create-9fhb7"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.288846 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.297671 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2ac5-account-create-9fhb7"]
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.345910 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-565h8\" (UniqueName: \"kubernetes.io/projected/ff060041-1da3-4a68-8ec6-87d2e80387d1-kube-api-access-565h8\") pod \"keystone-2ac5-account-create-9fhb7\" (UID: \"ff060041-1da3-4a68-8ec6-87d2e80387d1\") " pod="openstack/keystone-2ac5-account-create-9fhb7"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.346269 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff060041-1da3-4a68-8ec6-87d2e80387d1-operator-scripts\") pod \"keystone-2ac5-account-create-9fhb7\" (UID: \"ff060041-1da3-4a68-8ec6-87d2e80387d1\") " pod="openstack/keystone-2ac5-account-create-9fhb7"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.346444 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvn5h\" (UniqueName: \"kubernetes.io/projected/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-kube-api-access-bvn5h\") pod \"keystone-db-create-796kq\" (UID: \"329ae2e9-ea30-483f-ab8d-c35659e1fc6d\") " pod="openstack/keystone-db-create-796kq"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.346525 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-operator-scripts\") pod \"keystone-db-create-796kq\" (UID: \"329ae2e9-ea30-483f-ab8d-c35659e1fc6d\") " pod="openstack/keystone-db-create-796kq"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.347281 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-operator-scripts\") pod \"keystone-db-create-796kq\" (UID: \"329ae2e9-ea30-483f-ab8d-c35659e1fc6d\") " pod="openstack/keystone-db-create-796kq"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.365729 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvn5h\" (UniqueName: \"kubernetes.io/projected/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-kube-api-access-bvn5h\") pod \"keystone-db-create-796kq\" (UID: \"329ae2e9-ea30-483f-ab8d-c35659e1fc6d\") " pod="openstack/keystone-db-create-796kq"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.448294 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff060041-1da3-4a68-8ec6-87d2e80387d1-operator-scripts\") pod \"keystone-2ac5-account-create-9fhb7\" (UID: \"ff060041-1da3-4a68-8ec6-87d2e80387d1\") " pod="openstack/keystone-2ac5-account-create-9fhb7"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.448902 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-565h8\" (UniqueName: \"kubernetes.io/projected/ff060041-1da3-4a68-8ec6-87d2e80387d1-kube-api-access-565h8\") pod \"keystone-2ac5-account-create-9fhb7\" (UID: \"ff060041-1da3-4a68-8ec6-87d2e80387d1\") " pod="openstack/keystone-2ac5-account-create-9fhb7"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.450153 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff060041-1da3-4a68-8ec6-87d2e80387d1-operator-scripts\") pod \"keystone-2ac5-account-create-9fhb7\" (UID: \"ff060041-1da3-4a68-8ec6-87d2e80387d1\") " pod="openstack/keystone-2ac5-account-create-9fhb7"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.465938 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-565h8\" (UniqueName: \"kubernetes.io/projected/ff060041-1da3-4a68-8ec6-87d2e80387d1-kube-api-access-565h8\") pod \"keystone-2ac5-account-create-9fhb7\" (UID: \"ff060041-1da3-4a68-8ec6-87d2e80387d1\") " pod="openstack/keystone-2ac5-account-create-9fhb7"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.547024 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-796kq"
Nov 22 09:48:53 crc kubenswrapper[4743]: I1122 09:48:53.611754 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ac5-account-create-9fhb7"
Nov 22 09:48:54 crc kubenswrapper[4743]: I1122 09:48:54.137783 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2ac5-account-create-9fhb7"]
Nov 22 09:48:54 crc kubenswrapper[4743]: I1122 09:48:54.197518 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-796kq"]
Nov 22 09:48:54 crc kubenswrapper[4743]: W1122 09:48:54.198316 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod329ae2e9_ea30_483f_ab8d_c35659e1fc6d.slice/crio-b485b6564fc4a84acacda83b7f33d2d80b80de589f31ee85f8da2a4343519ef1 WatchSource:0}: Error finding container b485b6564fc4a84acacda83b7f33d2d80b80de589f31ee85f8da2a4343519ef1: Status 404 returned error can't find the container with id b485b6564fc4a84acacda83b7f33d2d80b80de589f31ee85f8da2a4343519ef1
Nov 22 09:48:54 crc kubenswrapper[4743]: I1122 09:48:54.264613 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-796kq" event={"ID":"329ae2e9-ea30-483f-ab8d-c35659e1fc6d","Type":"ContainerStarted","Data":"b485b6564fc4a84acacda83b7f33d2d80b80de589f31ee85f8da2a4343519ef1"}
Nov 22 09:48:54 crc kubenswrapper[4743]: I1122 09:48:54.266866 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ac5-account-create-9fhb7" event={"ID":"ff060041-1da3-4a68-8ec6-87d2e80387d1","Type":"ContainerStarted","Data":"d80ae7852b2eb443e3576dc65535fe353589ab690c41c36a8d1e713043de82b0"}
Nov 22 09:48:55 crc kubenswrapper[4743]: I1122 09:48:55.284065 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ac5-account-create-9fhb7" event={"ID":"ff060041-1da3-4a68-8ec6-87d2e80387d1","Type":"ContainerStarted","Data":"700120de0c73ba804db0c2824490ffe15dae21c5db97102a4323d0d0c4eadc59"}
Nov 22 09:48:55 crc kubenswrapper[4743]: I1122 09:48:55.286254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-796kq" event={"ID":"329ae2e9-ea30-483f-ab8d-c35659e1fc6d","Type":"ContainerStarted","Data":"e1d1c9ad481460496e72f548e011da607d3c4ba5806b5842a2e2f1cb2e868cad"}
Nov 22 09:48:55 crc kubenswrapper[4743]: I1122 09:48:55.308413 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-2ac5-account-create-9fhb7" podStartSLOduration=2.308390226 podStartE2EDuration="2.308390226s" podCreationTimestamp="2025-11-22 09:48:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:48:55.305466812 +0000 UTC m=+5209.011827894" watchObservedRunningTime="2025-11-22 09:48:55.308390226 +0000 UTC m=+5209.014751308"
Nov 22 09:48:56 crc kubenswrapper[4743]: I1122 09:48:56.300135 4743 generic.go:334] "Generic (PLEG): container finished" podID="ff060041-1da3-4a68-8ec6-87d2e80387d1" containerID="700120de0c73ba804db0c2824490ffe15dae21c5db97102a4323d0d0c4eadc59" exitCode=0
Nov 22 09:48:56 crc kubenswrapper[4743]: I1122 09:48:56.300207 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ac5-account-create-9fhb7" event={"ID":"ff060041-1da3-4a68-8ec6-87d2e80387d1","Type":"ContainerDied","Data":"700120de0c73ba804db0c2824490ffe15dae21c5db97102a4323d0d0c4eadc59"}
Nov 22 09:48:56 crc kubenswrapper[4743]: I1122 09:48:56.304158 4743 generic.go:334] "Generic (PLEG): container finished" podID="329ae2e9-ea30-483f-ab8d-c35659e1fc6d" containerID="e1d1c9ad481460496e72f548e011da607d3c4ba5806b5842a2e2f1cb2e868cad" exitCode=0
Nov 22 09:48:56 crc kubenswrapper[4743]: I1122 09:48:56.304379 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-796kq" event={"ID":"329ae2e9-ea30-483f-ab8d-c35659e1fc6d","Type":"ContainerDied","Data":"e1d1c9ad481460496e72f548e011da607d3c4ba5806b5842a2e2f1cb2e868cad"}
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.747103 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-796kq"
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.835725 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvn5h\" (UniqueName: \"kubernetes.io/projected/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-kube-api-access-bvn5h\") pod \"329ae2e9-ea30-483f-ab8d-c35659e1fc6d\" (UID: \"329ae2e9-ea30-483f-ab8d-c35659e1fc6d\") "
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.835826 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-operator-scripts\") pod \"329ae2e9-ea30-483f-ab8d-c35659e1fc6d\" (UID: \"329ae2e9-ea30-483f-ab8d-c35659e1fc6d\") "
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.836503 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "329ae2e9-ea30-483f-ab8d-c35659e1fc6d" (UID: "329ae2e9-ea30-483f-ab8d-c35659e1fc6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.840925 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-kube-api-access-bvn5h" (OuterVolumeSpecName: "kube-api-access-bvn5h") pod "329ae2e9-ea30-483f-ab8d-c35659e1fc6d" (UID: "329ae2e9-ea30-483f-ab8d-c35659e1fc6d"). InnerVolumeSpecName "kube-api-access-bvn5h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.887074 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ac5-account-create-9fhb7"
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.937688 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff060041-1da3-4a68-8ec6-87d2e80387d1-operator-scripts\") pod \"ff060041-1da3-4a68-8ec6-87d2e80387d1\" (UID: \"ff060041-1da3-4a68-8ec6-87d2e80387d1\") "
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.937904 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-565h8\" (UniqueName: \"kubernetes.io/projected/ff060041-1da3-4a68-8ec6-87d2e80387d1-kube-api-access-565h8\") pod \"ff060041-1da3-4a68-8ec6-87d2e80387d1\" (UID: \"ff060041-1da3-4a68-8ec6-87d2e80387d1\") "
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.938143 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff060041-1da3-4a68-8ec6-87d2e80387d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff060041-1da3-4a68-8ec6-87d2e80387d1" (UID: "ff060041-1da3-4a68-8ec6-87d2e80387d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.938360 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.938383 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvn5h\" (UniqueName: \"kubernetes.io/projected/329ae2e9-ea30-483f-ab8d-c35659e1fc6d-kube-api-access-bvn5h\") on node \"crc\" DevicePath \"\""
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.938394 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff060041-1da3-4a68-8ec6-87d2e80387d1-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:48:57 crc kubenswrapper[4743]: I1122 09:48:57.940767 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff060041-1da3-4a68-8ec6-87d2e80387d1-kube-api-access-565h8" (OuterVolumeSpecName: "kube-api-access-565h8") pod "ff060041-1da3-4a68-8ec6-87d2e80387d1" (UID: "ff060041-1da3-4a68-8ec6-87d2e80387d1"). InnerVolumeSpecName "kube-api-access-565h8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:48:58 crc kubenswrapper[4743]: I1122 09:48:58.040054 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-565h8\" (UniqueName: \"kubernetes.io/projected/ff060041-1da3-4a68-8ec6-87d2e80387d1-kube-api-access-565h8\") on node \"crc\" DevicePath \"\""
Nov 22 09:48:58 crc kubenswrapper[4743]: I1122 09:48:58.151625 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56"
Nov 22 09:48:58 crc kubenswrapper[4743]: E1122 09:48:58.151946 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:48:58 crc kubenswrapper[4743]: I1122 09:48:58.346856 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-796kq" event={"ID":"329ae2e9-ea30-483f-ab8d-c35659e1fc6d","Type":"ContainerDied","Data":"b485b6564fc4a84acacda83b7f33d2d80b80de589f31ee85f8da2a4343519ef1"}
Nov 22 09:48:58 crc kubenswrapper[4743]: I1122 09:48:58.346920 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b485b6564fc4a84acacda83b7f33d2d80b80de589f31ee85f8da2a4343519ef1"
Nov 22 09:48:58 crc kubenswrapper[4743]: I1122 09:48:58.347011 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-796kq"
Nov 22 09:48:58 crc kubenswrapper[4743]: I1122 09:48:58.349270 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2ac5-account-create-9fhb7" event={"ID":"ff060041-1da3-4a68-8ec6-87d2e80387d1","Type":"ContainerDied","Data":"d80ae7852b2eb443e3576dc65535fe353589ab690c41c36a8d1e713043de82b0"}
Nov 22 09:48:58 crc kubenswrapper[4743]: I1122 09:48:58.349317 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d80ae7852b2eb443e3576dc65535fe353589ab690c41c36a8d1e713043de82b0"
Nov 22 09:48:58 crc kubenswrapper[4743]: I1122 09:48:58.349347 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2ac5-account-create-9fhb7"
Nov 22 09:49:01 crc kubenswrapper[4743]: I1122 09:49:01.940899 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.817526 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-zcjpk"]
Nov 22 09:49:03 crc kubenswrapper[4743]: E1122 09:49:03.818263 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff060041-1da3-4a68-8ec6-87d2e80387d1" containerName="mariadb-account-create"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.818280 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff060041-1da3-4a68-8ec6-87d2e80387d1" containerName="mariadb-account-create"
Nov 22 09:49:03 crc kubenswrapper[4743]: E1122 09:49:03.818300 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="329ae2e9-ea30-483f-ab8d-c35659e1fc6d" containerName="mariadb-database-create"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.818308 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="329ae2e9-ea30-483f-ab8d-c35659e1fc6d" containerName="mariadb-database-create"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.818519 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff060041-1da3-4a68-8ec6-87d2e80387d1" containerName="mariadb-account-create"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.818534 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="329ae2e9-ea30-483f-ab8d-c35659e1fc6d" containerName="mariadb-database-create"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.819303 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.828713 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.828973 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.829176 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.829337 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6kb4h"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.840052 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zcjpk"]
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.844587 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-combined-ca-bundle\") pod \"keystone-db-sync-zcjpk\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") " pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.844634 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psd29\" (UniqueName: \"kubernetes.io/projected/5267038d-28f8-49a0-b908-677a4f11e493-kube-api-access-psd29\") pod \"keystone-db-sync-zcjpk\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") " pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.844665 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-config-data\") pod \"keystone-db-sync-zcjpk\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") " pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.946261 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-combined-ca-bundle\") pod \"keystone-db-sync-zcjpk\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") " pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.946311 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psd29\" (UniqueName: \"kubernetes.io/projected/5267038d-28f8-49a0-b908-677a4f11e493-kube-api-access-psd29\") pod \"keystone-db-sync-zcjpk\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") " pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.946344 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-config-data\") pod \"keystone-db-sync-zcjpk\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") " pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.955212 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-config-data\") pod \"keystone-db-sync-zcjpk\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") " pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.957867 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-combined-ca-bundle\") pod \"keystone-db-sync-zcjpk\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") " pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:03 crc kubenswrapper[4743]: I1122 09:49:03.965409 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psd29\" (UniqueName: \"kubernetes.io/projected/5267038d-28f8-49a0-b908-677a4f11e493-kube-api-access-psd29\") pod \"keystone-db-sync-zcjpk\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") " pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:04 crc kubenswrapper[4743]: I1122 09:49:04.146401 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:04 crc kubenswrapper[4743]: I1122 09:49:04.582165 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zcjpk"]
Nov 22 09:49:05 crc kubenswrapper[4743]: I1122 09:49:05.399783 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zcjpk" event={"ID":"5267038d-28f8-49a0-b908-677a4f11e493","Type":"ContainerStarted","Data":"9840829eda763bffcc00181d2397c321389a4a4e85764eb6bdd4164c5141ef04"}
Nov 22 09:49:05 crc kubenswrapper[4743]: I1122 09:49:05.400098 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zcjpk" event={"ID":"5267038d-28f8-49a0-b908-677a4f11e493","Type":"ContainerStarted","Data":"6d6261c0e9c9f68b19880f537f1a758840afe06130ff76bf2c3429353d5aa2d8"}
Nov 22 09:49:05 crc kubenswrapper[4743]: I1122 09:49:05.423808 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-zcjpk" podStartSLOduration=2.423789389 podStartE2EDuration="2.423789389s" podCreationTimestamp="2025-11-22 09:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:49:05.41999557 +0000 UTC m=+5219.126356612" watchObservedRunningTime="2025-11-22 09:49:05.423789389 +0000 UTC m=+5219.130150441"
Nov 22 09:49:07 crc kubenswrapper[4743]: I1122 09:49:07.415557 4743 generic.go:334] "Generic (PLEG): container finished" podID="5267038d-28f8-49a0-b908-677a4f11e493" containerID="9840829eda763bffcc00181d2397c321389a4a4e85764eb6bdd4164c5141ef04" exitCode=0
Nov 22 09:49:07 crc kubenswrapper[4743]: I1122 09:49:07.415677 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zcjpk" event={"ID":"5267038d-28f8-49a0-b908-677a4f11e493","Type":"ContainerDied","Data":"9840829eda763bffcc00181d2397c321389a4a4e85764eb6bdd4164c5141ef04"}
Nov 22 09:49:08 crc kubenswrapper[4743]: I1122 09:49:08.740630 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:08 crc kubenswrapper[4743]: I1122 09:49:08.832885 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psd29\" (UniqueName: \"kubernetes.io/projected/5267038d-28f8-49a0-b908-677a4f11e493-kube-api-access-psd29\") pod \"5267038d-28f8-49a0-b908-677a4f11e493\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") "
Nov 22 09:49:08 crc kubenswrapper[4743]: I1122 09:49:08.833221 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-config-data\") pod \"5267038d-28f8-49a0-b908-677a4f11e493\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") "
Nov 22 09:49:08 crc kubenswrapper[4743]: I1122 09:49:08.833253 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-combined-ca-bundle\") pod \"5267038d-28f8-49a0-b908-677a4f11e493\" (UID: \"5267038d-28f8-49a0-b908-677a4f11e493\") "
Nov 22 09:49:08 crc kubenswrapper[4743]: I1122 09:49:08.839099 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5267038d-28f8-49a0-b908-677a4f11e493-kube-api-access-psd29" (OuterVolumeSpecName: "kube-api-access-psd29") pod "5267038d-28f8-49a0-b908-677a4f11e493" (UID: "5267038d-28f8-49a0-b908-677a4f11e493"). InnerVolumeSpecName "kube-api-access-psd29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:49:08 crc kubenswrapper[4743]: I1122 09:49:08.857385 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5267038d-28f8-49a0-b908-677a4f11e493" (UID: "5267038d-28f8-49a0-b908-677a4f11e493"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:49:08 crc kubenswrapper[4743]: I1122 09:49:08.874134 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-config-data" (OuterVolumeSpecName: "config-data") pod "5267038d-28f8-49a0-b908-677a4f11e493" (UID: "5267038d-28f8-49a0-b908-677a4f11e493"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:49:08 crc kubenswrapper[4743]: I1122 09:49:08.935520 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psd29\" (UniqueName: \"kubernetes.io/projected/5267038d-28f8-49a0-b908-677a4f11e493-kube-api-access-psd29\") on node \"crc\" DevicePath \"\""
Nov 22 09:49:08 crc kubenswrapper[4743]: I1122 09:49:08.935547 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:49:08 crc kubenswrapper[4743]: I1122 09:49:08.935570 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5267038d-28f8-49a0-b908-677a4f11e493-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:49:09 crc kubenswrapper[4743]: I1122 09:49:09.434739 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zcjpk" event={"ID":"5267038d-28f8-49a0-b908-677a4f11e493","Type":"ContainerDied","Data":"6d6261c0e9c9f68b19880f537f1a758840afe06130ff76bf2c3429353d5aa2d8"}
Nov 22 09:49:09 crc kubenswrapper[4743]: I1122 09:49:09.434816 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d6261c0e9c9f68b19880f537f1a758840afe06130ff76bf2c3429353d5aa2d8"
Nov 22 09:49:09 crc kubenswrapper[4743]: I1122 09:49:09.434872 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zcjpk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.005972 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58865cd75-276bp"]
Nov 22 09:49:10 crc kubenswrapper[4743]: E1122 09:49:10.006305 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5267038d-28f8-49a0-b908-677a4f11e493" containerName="keystone-db-sync"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.006318 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5267038d-28f8-49a0-b908-677a4f11e493" containerName="keystone-db-sync"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.006514 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5267038d-28f8-49a0-b908-677a4f11e493" containerName="keystone-db-sync"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.007405 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.029894 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58865cd75-276bp"]
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.056214 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-config\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.056284 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plcxn\" (UniqueName: \"kubernetes.io/projected/f802cf69-1653-49b2-aa50-855b1ee4847f-kube-api-access-plcxn\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.056359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-dns-svc\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.056384 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-nb\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.056405 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-sb\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.063265 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zbwrk"]
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.064696 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.067636 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.067754 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6kb4h"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.067791 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.067996 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.068077 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.068562 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zbwrk"]
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.157663 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-config-data\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.157974 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-nb\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.158005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-sb\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.158025 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-combined-ca-bundle\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.158044 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-credential-keys\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.158070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-config\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.158085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p9x6\" (UniqueName: \"kubernetes.io/projected/7070a95f-1af5-4cad-b910-daa476b1b928-kube-api-access-6p9x6\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.158119 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-fernet-keys\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.158150 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plcxn\" (UniqueName: \"kubernetes.io/projected/f802cf69-1653-49b2-aa50-855b1ee4847f-kube-api-access-plcxn\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.158181 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-scripts\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.158241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-dns-svc\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.159200 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-dns-svc\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.159317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-config\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.159324 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-sb\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.159954 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-nb\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.176656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plcxn\" (UniqueName: \"kubernetes.io/projected/f802cf69-1653-49b2-aa50-855b1ee4847f-kube-api-access-plcxn\") pod \"dnsmasq-dns-58865cd75-276bp\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.259934 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-config-data\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.259998 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-combined-ca-bundle\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.260016 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-credential-keys\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.260051 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p9x6\" (UniqueName: \"kubernetes.io/projected/7070a95f-1af5-4cad-b910-daa476b1b928-kube-api-access-6p9x6\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.260082 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-fernet-keys\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.260160 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-scripts\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.265111 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-fernet-keys\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.265305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-credential-keys\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.268319 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-combined-ca-bundle\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.268489 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-config-data\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.268936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-scripts\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.277110 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p9x6\" (UniqueName: \"kubernetes.io/projected/7070a95f-1af5-4cad-b910-daa476b1b928-kube-api-access-6p9x6\") pod \"keystone-bootstrap-zbwrk\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") " pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.326954 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.385710 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.675869 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58865cd75-276bp"]
Nov 22 09:49:10 crc kubenswrapper[4743]: W1122 09:49:10.687289 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf802cf69_1653_49b2_aa50_855b1ee4847f.slice/crio-75b89c9e83766e4e625a1b36b8fe49e0e9b87225dc8ea37dd9138e1e3eb6dc86 WatchSource:0}: Error finding container 75b89c9e83766e4e625a1b36b8fe49e0e9b87225dc8ea37dd9138e1e3eb6dc86: Status 404 returned error can't find the container with id 75b89c9e83766e4e625a1b36b8fe49e0e9b87225dc8ea37dd9138e1e3eb6dc86
Nov 22 09:49:10 crc kubenswrapper[4743]: W1122 09:49:10.937397 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7070a95f_1af5_4cad_b910_daa476b1b928.slice/crio-c848051ca803d0402244cf671a18ffb0604c70825af3c5a816efb2efb85aa52d WatchSource:0}: Error finding container c848051ca803d0402244cf671a18ffb0604c70825af3c5a816efb2efb85aa52d: Status 404 returned error can't find the container with id c848051ca803d0402244cf671a18ffb0604c70825af3c5a816efb2efb85aa52d
Nov 22 09:49:10 crc kubenswrapper[4743]: I1122 09:49:10.940381 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zbwrk"]
Nov 22 09:49:11 crc kubenswrapper[4743]: I1122 09:49:11.459968 4743 generic.go:334] "Generic (PLEG): container finished" podID="f802cf69-1653-49b2-aa50-855b1ee4847f" containerID="8f05eba6204c1f5981bbc4327599753b3f52a12bfa2330da8b9b581b5b919d28" exitCode=0
Nov 22 09:49:11 crc kubenswrapper[4743]: I1122 09:49:11.460676 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58865cd75-276bp" event={"ID":"f802cf69-1653-49b2-aa50-855b1ee4847f","Type":"ContainerDied","Data":"8f05eba6204c1f5981bbc4327599753b3f52a12bfa2330da8b9b581b5b919d28"}
Nov 22 09:49:11 crc kubenswrapper[4743]: I1122 09:49:11.463344 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58865cd75-276bp" event={"ID":"f802cf69-1653-49b2-aa50-855b1ee4847f","Type":"ContainerStarted","Data":"75b89c9e83766e4e625a1b36b8fe49e0e9b87225dc8ea37dd9138e1e3eb6dc86"}
Nov 22 09:49:11 crc kubenswrapper[4743]: I1122 09:49:11.467859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zbwrk" event={"ID":"7070a95f-1af5-4cad-b910-daa476b1b928","Type":"ContainerStarted","Data":"33e393f310301fb9ee065eff959f6b18430a6ef659cf7dae231730b614e33f68"}
Nov 22 09:49:11 crc kubenswrapper[4743]: I1122 09:49:11.467910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zbwrk" event={"ID":"7070a95f-1af5-4cad-b910-daa476b1b928","Type":"ContainerStarted","Data":"c848051ca803d0402244cf671a18ffb0604c70825af3c5a816efb2efb85aa52d"}
Nov 22 09:49:11 crc kubenswrapper[4743]: I1122 09:49:11.521329 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zbwrk" podStartSLOduration=1.5213064809999999 podStartE2EDuration="1.521306481s" podCreationTimestamp="2025-11-22 09:49:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:49:11.519655064 +0000 UTC m=+5225.226016126" watchObservedRunningTime="2025-11-22 09:49:11.521306481 +0000 UTC m=+5225.227667533"
Nov 22 09:49:12 crc kubenswrapper[4743]: I1122 09:49:12.497825 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58865cd75-276bp" event={"ID":"f802cf69-1653-49b2-aa50-855b1ee4847f","Type":"ContainerStarted","Data":"0337b0a5fa07ca3c3f3dc694684e86ea5099757c21316f140a922cd54aac0f42"}
Nov 22 09:49:12 crc kubenswrapper[4743]: I1122 09:49:12.524352 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58865cd75-276bp" podStartSLOduration=3.52432453 podStartE2EDuration="3.52432453s" podCreationTimestamp="2025-11-22 09:49:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:49:12.521506409 +0000 UTC m=+5226.227867471" watchObservedRunningTime="2025-11-22 09:49:12.52432453 +0000 UTC m=+5226.230685612"
Nov 22 09:49:13 crc kubenswrapper[4743]: I1122 09:49:13.151534 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56"
Nov 22 09:49:13 crc kubenswrapper[4743]: E1122 09:49:13.151968 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 09:49:13 crc kubenswrapper[4743]: I1122 09:49:13.507361 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:14 crc kubenswrapper[4743]: I1122 09:49:14.516322 4743 generic.go:334] "Generic (PLEG): container finished" podID="7070a95f-1af5-4cad-b910-daa476b1b928" containerID="33e393f310301fb9ee065eff959f6b18430a6ef659cf7dae231730b614e33f68" exitCode=0
Nov 22 09:49:14 crc kubenswrapper[4743]: I1122 09:49:14.516720 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zbwrk" event={"ID":"7070a95f-1af5-4cad-b910-daa476b1b928","Type":"ContainerDied","Data":"33e393f310301fb9ee065eff959f6b18430a6ef659cf7dae231730b614e33f68"}
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.834314 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.888148 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-scripts\") pod \"7070a95f-1af5-4cad-b910-daa476b1b928\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") "
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.888318 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-combined-ca-bundle\") pod \"7070a95f-1af5-4cad-b910-daa476b1b928\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") "
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.888340 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p9x6\" (UniqueName: \"kubernetes.io/projected/7070a95f-1af5-4cad-b910-daa476b1b928-kube-api-access-6p9x6\") pod \"7070a95f-1af5-4cad-b910-daa476b1b928\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") "
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.888391 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-fernet-keys\") pod \"7070a95f-1af5-4cad-b910-daa476b1b928\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") "
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.888420 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-credential-keys\") pod \"7070a95f-1af5-4cad-b910-daa476b1b928\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") "
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.888480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-config-data\") pod \"7070a95f-1af5-4cad-b910-daa476b1b928\" (UID: \"7070a95f-1af5-4cad-b910-daa476b1b928\") "
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.894548 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7070a95f-1af5-4cad-b910-daa476b1b928" (UID: "7070a95f-1af5-4cad-b910-daa476b1b928"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.895178 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-scripts" (OuterVolumeSpecName: "scripts") pod "7070a95f-1af5-4cad-b910-daa476b1b928" (UID: "7070a95f-1af5-4cad-b910-daa476b1b928"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.895128 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7070a95f-1af5-4cad-b910-daa476b1b928" (UID: "7070a95f-1af5-4cad-b910-daa476b1b928"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.897600 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7070a95f-1af5-4cad-b910-daa476b1b928-kube-api-access-6p9x6" (OuterVolumeSpecName: "kube-api-access-6p9x6") pod "7070a95f-1af5-4cad-b910-daa476b1b928" (UID: "7070a95f-1af5-4cad-b910-daa476b1b928"). InnerVolumeSpecName "kube-api-access-6p9x6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.924307 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-config-data" (OuterVolumeSpecName: "config-data") pod "7070a95f-1af5-4cad-b910-daa476b1b928" (UID: "7070a95f-1af5-4cad-b910-daa476b1b928"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.925647 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7070a95f-1af5-4cad-b910-daa476b1b928" (UID: "7070a95f-1af5-4cad-b910-daa476b1b928"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.991146 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-fernet-keys\") on node \"crc\" DevicePath \"\""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.991195 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-credential-keys\") on node \"crc\" DevicePath \"\""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.991210 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.991222 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.991239 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7070a95f-1af5-4cad-b910-daa476b1b928-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:49:15 crc kubenswrapper[4743]: I1122 09:49:15.991253 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p9x6\" (UniqueName: \"kubernetes.io/projected/7070a95f-1af5-4cad-b910-daa476b1b928-kube-api-access-6p9x6\") on node \"crc\" DevicePath \"\""
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.535263 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zbwrk" event={"ID":"7070a95f-1af5-4cad-b910-daa476b1b928","Type":"ContainerDied","Data":"c848051ca803d0402244cf671a18ffb0604c70825af3c5a816efb2efb85aa52d"}
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.535336 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c848051ca803d0402244cf671a18ffb0604c70825af3c5a816efb2efb85aa52d"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.535426 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zbwrk"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.626434 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zbwrk"]
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.639182 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zbwrk"]
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.714865 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vqf7n"]
Nov 22 09:49:16 crc kubenswrapper[4743]: E1122 09:49:16.715299 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7070a95f-1af5-4cad-b910-daa476b1b928" containerName="keystone-bootstrap"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.715318 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7070a95f-1af5-4cad-b910-daa476b1b928" containerName="keystone-bootstrap"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.715515 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7070a95f-1af5-4cad-b910-daa476b1b928" containerName="keystone-bootstrap"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.716206 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.720012 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.720039 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.720199 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.720389 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.720553 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6kb4h"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.727210 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vqf7n"]
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.806633 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-combined-ca-bundle\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.806727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-scripts\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.806916 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fgxp\" (UniqueName: \"kubernetes.io/projected/478564b9-9f44-489e-bacd-c9bb128f8e28-kube-api-access-6fgxp\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.807103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-credential-keys\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.807223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-fernet-keys\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.807439 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-config-data\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.910005 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-config-data\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.911336 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-combined-ca-bundle\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.911505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-scripts\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.911650 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fgxp\" (UniqueName: \"kubernetes.io/projected/478564b9-9f44-489e-bacd-c9bb128f8e28-kube-api-access-6fgxp\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.911798 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-credential-keys\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.911937 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-fernet-keys\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.915073 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-combined-ca-bundle\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.915258 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-config-data\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.923955 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-scripts\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.924091 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-credential-keys\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.930866 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-fernet-keys\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:16 crc kubenswrapper[4743]: I1122 09:49:16.935853 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fgxp\" (UniqueName: \"kubernetes.io/projected/478564b9-9f44-489e-bacd-c9bb128f8e28-kube-api-access-6fgxp\") pod \"keystone-bootstrap-vqf7n\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:17 crc kubenswrapper[4743]: I1122 09:49:17.033049 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vqf7n"
Nov 22 09:49:17 crc kubenswrapper[4743]: I1122 09:49:17.166385 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7070a95f-1af5-4cad-b910-daa476b1b928" path="/var/lib/kubelet/pods/7070a95f-1af5-4cad-b910-daa476b1b928/volumes"
Nov 22 09:49:17 crc kubenswrapper[4743]: I1122 09:49:17.483877 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vqf7n"]
Nov 22 09:49:17 crc kubenswrapper[4743]: I1122 09:49:17.551001 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vqf7n" event={"ID":"478564b9-9f44-489e-bacd-c9bb128f8e28","Type":"ContainerStarted","Data":"6064c0f9ba0b900692504e669a3dc15534126eb5167e6adab2fe4fdb50dd6f9c"}
Nov 22 09:49:18 crc kubenswrapper[4743]: I1122 09:49:18.560610 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vqf7n" event={"ID":"478564b9-9f44-489e-bacd-c9bb128f8e28","Type":"ContainerStarted","Data":"139fe17e6226ce28ea3b7c40c59f999dd29e99fb60f95ed5dfbd1359b0fd74b5"}
Nov 22 09:49:18 crc kubenswrapper[4743]: I1122 09:49:18.598241 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vqf7n" podStartSLOduration=2.598208842 podStartE2EDuration="2.598208842s" podCreationTimestamp="2025-11-22 09:49:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:49:18.590692026 +0000 UTC m=+5232.297053078" watchObservedRunningTime="2025-11-22 09:49:18.598208842 +0000 UTC m=+5232.304569964"
Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.327801 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58865cd75-276bp"
Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.403124 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864bc46885-j5859"]
Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.403356 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864bc46885-j5859" podUID="035ba0cf-ceee-4cf3-8628-652a1bf5975f" containerName="dnsmasq-dns" containerID="cri-o://3ca9dfd9bbf7236a679c41a503118c33a01eed7e7d8f001202e04701cc88569c" gracePeriod=10
Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.582971 4743 generic.go:334] "Generic (PLEG): container finished" podID="035ba0cf-ceee-4cf3-8628-652a1bf5975f"
containerID="3ca9dfd9bbf7236a679c41a503118c33a01eed7e7d8f001202e04701cc88569c" exitCode=0 Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.583047 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864bc46885-j5859" event={"ID":"035ba0cf-ceee-4cf3-8628-652a1bf5975f","Type":"ContainerDied","Data":"3ca9dfd9bbf7236a679c41a503118c33a01eed7e7d8f001202e04701cc88569c"} Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.585083 4743 generic.go:334] "Generic (PLEG): container finished" podID="478564b9-9f44-489e-bacd-c9bb128f8e28" containerID="139fe17e6226ce28ea3b7c40c59f999dd29e99fb60f95ed5dfbd1359b0fd74b5" exitCode=0 Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.585132 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vqf7n" event={"ID":"478564b9-9f44-489e-bacd-c9bb128f8e28","Type":"ContainerDied","Data":"139fe17e6226ce28ea3b7c40c59f999dd29e99fb60f95ed5dfbd1359b0fd74b5"} Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.827461 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.892742 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-nb\") pod \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.893469 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-dns-svc\") pod \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.893673 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-sb\") pod \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.893700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzkxp\" (UniqueName: \"kubernetes.io/projected/035ba0cf-ceee-4cf3-8628-652a1bf5975f-kube-api-access-wzkxp\") pod \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.893728 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-config\") pod \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\" (UID: \"035ba0cf-ceee-4cf3-8628-652a1bf5975f\") " Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.898748 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035ba0cf-ceee-4cf3-8628-652a1bf5975f-kube-api-access-wzkxp" (OuterVolumeSpecName: "kube-api-access-wzkxp") pod "035ba0cf-ceee-4cf3-8628-652a1bf5975f" (UID: "035ba0cf-ceee-4cf3-8628-652a1bf5975f"). InnerVolumeSpecName "kube-api-access-wzkxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.929305 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "035ba0cf-ceee-4cf3-8628-652a1bf5975f" (UID: "035ba0cf-ceee-4cf3-8628-652a1bf5975f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.936310 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-config" (OuterVolumeSpecName: "config") pod "035ba0cf-ceee-4cf3-8628-652a1bf5975f" (UID: "035ba0cf-ceee-4cf3-8628-652a1bf5975f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.936517 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "035ba0cf-ceee-4cf3-8628-652a1bf5975f" (UID: "035ba0cf-ceee-4cf3-8628-652a1bf5975f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.942340 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "035ba0cf-ceee-4cf3-8628-652a1bf5975f" (UID: "035ba0cf-ceee-4cf3-8628-652a1bf5975f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.996019 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.996056 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzkxp\" (UniqueName: \"kubernetes.io/projected/035ba0cf-ceee-4cf3-8628-652a1bf5975f-kube-api-access-wzkxp\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.996069 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.996079 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:20 crc kubenswrapper[4743]: I1122 09:49:20.996088 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/035ba0cf-ceee-4cf3-8628-652a1bf5975f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:21 crc kubenswrapper[4743]: I1122 09:49:21.595424 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864bc46885-j5859" event={"ID":"035ba0cf-ceee-4cf3-8628-652a1bf5975f","Type":"ContainerDied","Data":"1ddacb0c224f6ce10e8f764a9ff9740094f8d63558d402b52367caf8a488ccfe"} Nov 22 09:49:21 crc kubenswrapper[4743]: I1122 09:49:21.595490 4743 scope.go:117] "RemoveContainer" 
containerID="3ca9dfd9bbf7236a679c41a503118c33a01eed7e7d8f001202e04701cc88569c" Nov 22 09:49:21 crc kubenswrapper[4743]: I1122 09:49:21.595448 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-864bc46885-j5859" Nov 22 09:49:21 crc kubenswrapper[4743]: I1122 09:49:21.631233 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864bc46885-j5859"] Nov 22 09:49:21 crc kubenswrapper[4743]: I1122 09:49:21.634044 4743 scope.go:117] "RemoveContainer" containerID="b097dd51036660b0e306e9b435c8b0d3a8b6fa2623c942e3f8d62426107b4e45" Nov 22 09:49:21 crc kubenswrapper[4743]: I1122 09:49:21.641147 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864bc46885-j5859"] Nov 22 09:49:21 crc kubenswrapper[4743]: I1122 09:49:21.896280 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vqf7n" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.022541 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-fernet-keys\") pod \"478564b9-9f44-489e-bacd-c9bb128f8e28\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.023674 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-combined-ca-bundle\") pod \"478564b9-9f44-489e-bacd-c9bb128f8e28\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.023717 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-scripts\") pod \"478564b9-9f44-489e-bacd-c9bb128f8e28\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.023738 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fgxp\" (UniqueName: \"kubernetes.io/projected/478564b9-9f44-489e-bacd-c9bb128f8e28-kube-api-access-6fgxp\") pod \"478564b9-9f44-489e-bacd-c9bb128f8e28\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.023803 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-credential-keys\") pod \"478564b9-9f44-489e-bacd-c9bb128f8e28\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.023904 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-config-data\") pod \"478564b9-9f44-489e-bacd-c9bb128f8e28\" (UID: \"478564b9-9f44-489e-bacd-c9bb128f8e28\") " Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.028681 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "478564b9-9f44-489e-bacd-c9bb128f8e28" (UID: "478564b9-9f44-489e-bacd-c9bb128f8e28"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.028944 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-scripts" (OuterVolumeSpecName: "scripts") pod "478564b9-9f44-489e-bacd-c9bb128f8e28" (UID: "478564b9-9f44-489e-bacd-c9bb128f8e28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.028983 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/478564b9-9f44-489e-bacd-c9bb128f8e28-kube-api-access-6fgxp" (OuterVolumeSpecName: "kube-api-access-6fgxp") pod "478564b9-9f44-489e-bacd-c9bb128f8e28" (UID: "478564b9-9f44-489e-bacd-c9bb128f8e28"). InnerVolumeSpecName "kube-api-access-6fgxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.029896 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "478564b9-9f44-489e-bacd-c9bb128f8e28" (UID: "478564b9-9f44-489e-bacd-c9bb128f8e28"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.047742 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "478564b9-9f44-489e-bacd-c9bb128f8e28" (UID: "478564b9-9f44-489e-bacd-c9bb128f8e28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.047763 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-config-data" (OuterVolumeSpecName: "config-data") pod "478564b9-9f44-489e-bacd-c9bb128f8e28" (UID: "478564b9-9f44-489e-bacd-c9bb128f8e28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.126511 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.126545 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.126554 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.126566 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.126586 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fgxp\" (UniqueName: \"kubernetes.io/projected/478564b9-9f44-489e-bacd-c9bb128f8e28-kube-api-access-6fgxp\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.126595 4743 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/478564b9-9f44-489e-bacd-c9bb128f8e28-credential-keys\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.609097 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vqf7n" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.609118 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vqf7n" event={"ID":"478564b9-9f44-489e-bacd-c9bb128f8e28","Type":"ContainerDied","Data":"6064c0f9ba0b900692504e669a3dc15534126eb5167e6adab2fe4fdb50dd6f9c"} Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.609170 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6064c0f9ba0b900692504e669a3dc15534126eb5167e6adab2fe4fdb50dd6f9c" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.687726 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b8795ddf-b2cbj"] Nov 22 09:49:22 crc kubenswrapper[4743]: E1122 09:49:22.688041 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="478564b9-9f44-489e-bacd-c9bb128f8e28" containerName="keystone-bootstrap" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.688061 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="478564b9-9f44-489e-bacd-c9bb128f8e28" containerName="keystone-bootstrap" Nov 22 09:49:22 crc kubenswrapper[4743]: E1122 09:49:22.688071 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035ba0cf-ceee-4cf3-8628-652a1bf5975f" containerName="init" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.688077 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="035ba0cf-ceee-4cf3-8628-652a1bf5975f" containerName="init" Nov 22 09:49:22 crc kubenswrapper[4743]: E1122 09:49:22.688094 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035ba0cf-ceee-4cf3-8628-652a1bf5975f" containerName="dnsmasq-dns" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.688101 
4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="035ba0cf-ceee-4cf3-8628-652a1bf5975f" containerName="dnsmasq-dns" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.688255 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="035ba0cf-ceee-4cf3-8628-652a1bf5975f" containerName="dnsmasq-dns" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.688273 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="478564b9-9f44-489e-bacd-c9bb128f8e28" containerName="keystone-bootstrap" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.688831 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.691334 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.691458 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-6kb4h" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.691619 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.693398 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.707618 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b8795ddf-b2cbj"] Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.735599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-combined-ca-bundle\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.735648 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-config-data\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.735831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-fernet-keys\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.735882 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swgjw\" (UniqueName: \"kubernetes.io/projected/dcc88110-2290-4a35-99da-2ec2d74d262a-kube-api-access-swgjw\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.736127 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-scripts\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 
09:49:22.736247 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-credential-keys\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.838327 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-credential-keys\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.838375 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-combined-ca-bundle\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.838406 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-config-data\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.838453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-fernet-keys\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.838470 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swgjw\" (UniqueName: \"kubernetes.io/projected/dcc88110-2290-4a35-99da-2ec2d74d262a-kube-api-access-swgjw\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.838541 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-scripts\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.841782 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-credential-keys\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.841960 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-combined-ca-bundle\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.842020 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-config-data\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.842304 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-scripts\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.842998 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dcc88110-2290-4a35-99da-2ec2d74d262a-fernet-keys\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:22 crc kubenswrapper[4743]: I1122 09:49:22.858998 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swgjw\" (UniqueName: \"kubernetes.io/projected/dcc88110-2290-4a35-99da-2ec2d74d262a-kube-api-access-swgjw\") pod \"keystone-b8795ddf-b2cbj\" (UID: \"dcc88110-2290-4a35-99da-2ec2d74d262a\") " pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:23 crc kubenswrapper[4743]: I1122 09:49:23.006253 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:23 crc kubenswrapper[4743]: I1122 09:49:23.169436 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035ba0cf-ceee-4cf3-8628-652a1bf5975f" path="/var/lib/kubelet/pods/035ba0cf-ceee-4cf3-8628-652a1bf5975f/volumes" Nov 22 09:49:23 crc kubenswrapper[4743]: I1122 09:49:23.444718 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b8795ddf-b2cbj"] Nov 22 09:49:23 crc kubenswrapper[4743]: I1122 09:49:23.621373 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b8795ddf-b2cbj" event={"ID":"dcc88110-2290-4a35-99da-2ec2d74d262a","Type":"ContainerStarted","Data":"d7da93fa771309b24884264fb0ba5b72e879186a980165aa2fe7c7b537df48f0"} Nov 22 09:49:24 crc kubenswrapper[4743]: I1122 09:49:24.632767 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b8795ddf-b2cbj" event={"ID":"dcc88110-2290-4a35-99da-2ec2d74d262a","Type":"ContainerStarted","Data":"95a0f4806ecfb56d8a068c1983fa154498a24360a7ad476856df2f8052a01147"} Nov 22 09:49:24 crc kubenswrapper[4743]: I1122 09:49:24.633143 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:24 crc kubenswrapper[4743]: I1122 09:49:24.659872 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-b8795ddf-b2cbj" podStartSLOduration=2.6598515430000003 podStartE2EDuration="2.659851543s" podCreationTimestamp="2025-11-22 09:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:49:24.655552709 +0000 UTC m=+5238.361913781" watchObservedRunningTime="2025-11-22 09:49:24.659851543 +0000 UTC m=+5238.366212595" Nov 22 09:49:25 crc kubenswrapper[4743]: I1122 09:49:25.151747 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:49:25 crc kubenswrapper[4743]: E1122 09:49:25.152031 4743 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:49:32 crc kubenswrapper[4743]: I1122 09:49:32.124988 4743 scope.go:117] "RemoveContainer" containerID="da30802932f11d5acb6469776270b33a83e33b97c4805d6b42ab438ee6550422" Nov 22 09:49:32 crc kubenswrapper[4743]: I1122 09:49:32.151407 4743 scope.go:117] "RemoveContainer" containerID="3825cced0ab5f9704258432dc4c4a3c1728c79f5c0d74e56fe2b292bff9185a9" Nov 22 09:49:32 crc kubenswrapper[4743]: I1122 09:49:32.211180 4743 scope.go:117] "RemoveContainer" containerID="5f052d4dd791b113a47dc232d2ce4a7ff66a08fe0ce8fa891354c8cc4631668a" Nov 22 09:49:32 crc kubenswrapper[4743]: I1122 09:49:32.259312 4743 scope.go:117] "RemoveContainer" containerID="ea6f8664d83708a8020da7b04b588f603641b2056f660e6943a7e246085f6c78" Nov 22 09:49:32 crc kubenswrapper[4743]: I1122 09:49:32.322654 4743 scope.go:117] "RemoveContainer" containerID="a5e0510666351e4e061fdf752a6ac8aba616018430171a77a6142944a2d00a1e" Nov 22 09:49:39 crc kubenswrapper[4743]: I1122 09:49:39.152549 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:49:39 crc kubenswrapper[4743]: E1122 09:49:39.153885 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:49:50 crc kubenswrapper[4743]: I1122 09:49:50.151880 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:49:50 crc kubenswrapper[4743]: E1122 09:49:50.152614 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.223428 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5wwmp"] Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.226738 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.241043 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wwmp"] Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.296515 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-utilities\") pod \"community-operators-5wwmp\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.296560 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m4hg\" (UniqueName: \"kubernetes.io/projected/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-kube-api-access-9m4hg\") pod \"community-operators-5wwmp\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.296657 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-catalog-content\") pod \"community-operators-5wwmp\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.399239 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-catalog-content\") pod \"community-operators-5wwmp\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.400225 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-utilities\") pod \"community-operators-5wwmp\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.400806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m4hg\" (UniqueName: \"kubernetes.io/projected/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-kube-api-access-9m4hg\") pod \"community-operators-5wwmp\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.400759 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-utilities\") pod \"community-operators-5wwmp\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.400027 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-catalog-content\") pod \"community-operators-5wwmp\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.425234 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9m4hg\" (UniqueName: \"kubernetes.io/projected/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-kube-api-access-9m4hg\") pod \"community-operators-5wwmp\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:53 crc kubenswrapper[4743]: I1122 09:49:53.557239 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:49:54 crc kubenswrapper[4743]: I1122 09:49:54.138680 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5wwmp"] Nov 22 09:49:54 crc kubenswrapper[4743]: I1122 09:49:54.718829 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-b8795ddf-b2cbj" Nov 22 09:49:54 crc kubenswrapper[4743]: I1122 09:49:54.918085 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerID="21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b" exitCode=0 Nov 22 09:49:54 crc kubenswrapper[4743]: I1122 09:49:54.918150 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wwmp" event={"ID":"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76","Type":"ContainerDied","Data":"21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b"} Nov 22 09:49:54 crc kubenswrapper[4743]: I1122 09:49:54.918179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wwmp" event={"ID":"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76","Type":"ContainerStarted","Data":"6e543e8f93ed3acee4fac4054802fc1488dad58e706f3a1bf2815cac0d1a146d"} Nov 22 09:49:56 crc kubenswrapper[4743]: I1122 09:49:56.945637 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerID="75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b" exitCode=0 Nov 22 09:49:56 crc kubenswrapper[4743]: I1122 09:49:56.945879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wwmp" event={"ID":"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76","Type":"ContainerDied","Data":"75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b"} Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.366531 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.368082 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.370706 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.370835 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.370863 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-q9ps6" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.379424 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.392645 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 22 09:49:58 crc kubenswrapper[4743]: E1122 09:49:58.393277 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-xk2r9 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-xk2r9 openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="974c044b-1ad2-45f9-8a2f-674a2830a762" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.396068 4743 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"974c044b-1ad2-45f9-8a2f-674a2830a762\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:49:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:49:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:49:58Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-11-22T09:49:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xk2r9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-11-22T09:49:58Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.400453 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.421728 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.432106 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.440992 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.455566 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="974c044b-1ad2-45f9-8a2f-674a2830a762" podUID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.487964 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk2r9\" (UniqueName: \"kubernetes.io/projected/974c044b-1ad2-45f9-8a2f-674a2830a762-kube-api-access-xk2r9\") pod \"openstackclient\" (UID: \"974c044b-1ad2-45f9-8a2f-674a2830a762\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.488712 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config-secret\") pod \"openstackclient\" (UID: \"974c044b-1ad2-45f9-8a2f-674a2830a762\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.488899 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config\") pod \"openstackclient\" (UID: \"974c044b-1ad2-45f9-8a2f-674a2830a762\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.590228 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config-secret\") pod \"openstackclient\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.590297 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk2r9\" (UniqueName: \"kubernetes.io/projected/974c044b-1ad2-45f9-8a2f-674a2830a762-kube-api-access-xk2r9\") pod \"openstackclient\" (UID: \"974c044b-1ad2-45f9-8a2f-674a2830a762\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.590315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config-secret\") pod \"openstackclient\" (UID: \"974c044b-1ad2-45f9-8a2f-674a2830a762\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.590351 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config\") pod \"openstackclient\" (UID: \"974c044b-1ad2-45f9-8a2f-674a2830a762\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.590388 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config\") pod \"openstackclient\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.590416 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vfv\" (UniqueName: \"kubernetes.io/projected/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-kube-api-access-h2vfv\") pod \"openstackclient\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.591559 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config\") pod \"openstackclient\" (UID: \"974c044b-1ad2-45f9-8a2f-674a2830a762\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: E1122 09:49:58.592448 4743 projected.go:194] Error preparing data for projected volume kube-api-access-xk2r9 for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (974c044b-1ad2-45f9-8a2f-674a2830a762) does not match the UID in record. The object might have been deleted and then recreated Nov 22 09:49:58 crc kubenswrapper[4743]: E1122 09:49:58.592524 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/974c044b-1ad2-45f9-8a2f-674a2830a762-kube-api-access-xk2r9 podName:974c044b-1ad2-45f9-8a2f-674a2830a762 nodeName:}" failed. No retries permitted until 2025-11-22 09:49:59.092503542 +0000 UTC m=+5272.798864594 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xk2r9" (UniqueName: "kubernetes.io/projected/974c044b-1ad2-45f9-8a2f-674a2830a762-kube-api-access-xk2r9") pod "openstackclient" (UID: "974c044b-1ad2-45f9-8a2f-674a2830a762") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (974c044b-1ad2-45f9-8a2f-674a2830a762) does not match the UID in record. The object might have been deleted and then recreated Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.600176 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config-secret\") pod \"openstackclient\" (UID: \"974c044b-1ad2-45f9-8a2f-674a2830a762\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.693073 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config-secret\") pod \"openstackclient\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.693377 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config\") pod \"openstackclient\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.693446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2vfv\" (UniqueName: \"kubernetes.io/projected/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-kube-api-access-h2vfv\") pod \"openstackclient\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.694627 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config\") pod \"openstackclient\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.697125 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config-secret\") pod \"openstackclient\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.718406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vfv\" (UniqueName: \"kubernetes.io/projected/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-kube-api-access-h2vfv\") pod \"openstackclient\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.752118 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.969254 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.969249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wwmp" event={"ID":"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76","Type":"ContainerStarted","Data":"2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87"} Nov 22 09:49:58 crc kubenswrapper[4743]: I1122 09:49:58.988824 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.024373 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="974c044b-1ad2-45f9-8a2f-674a2830a762" podUID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.100484 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config-secret\") pod \"974c044b-1ad2-45f9-8a2f-674a2830a762\" (UID: \"974c044b-1ad2-45f9-8a2f-674a2830a762\") " Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.100738 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config\") pod \"974c044b-1ad2-45f9-8a2f-674a2830a762\" (UID: \"974c044b-1ad2-45f9-8a2f-674a2830a762\") " Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.101178 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk2r9\" (UniqueName: \"kubernetes.io/projected/974c044b-1ad2-45f9-8a2f-674a2830a762-kube-api-access-xk2r9\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.101347 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "974c044b-1ad2-45f9-8a2f-674a2830a762" (UID: "974c044b-1ad2-45f9-8a2f-674a2830a762"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.106726 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "974c044b-1ad2-45f9-8a2f-674a2830a762" (UID: "974c044b-1ad2-45f9-8a2f-674a2830a762"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.160259 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="974c044b-1ad2-45f9-8a2f-674a2830a762" path="/var/lib/kubelet/pods/974c044b-1ad2-45f9-8a2f-674a2830a762/volumes" Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.202525 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5wwmp" podStartSLOduration=3.022882398 podStartE2EDuration="6.202505525s" podCreationTimestamp="2025-11-22 09:49:53 +0000 UTC" firstStartedPulling="2025-11-22 09:49:54.920357989 +0000 UTC m=+5268.626719041" lastFinishedPulling="2025-11-22 09:49:58.099981116 +0000 UTC m=+5271.806342168" observedRunningTime="2025-11-22 09:49:59.005075931 +0000 UTC m=+5272.711437003" watchObservedRunningTime="2025-11-22 09:49:59.202505525 +0000 UTC m=+5272.908866597" Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.206984 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.207032 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/974c044b-1ad2-45f9-8a2f-674a2830a762-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.207379 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 09:49:59 crc kubenswrapper[4743]: W1122 09:49:59.208060 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod740fdd50_f1ff_4415_b473_a5e4a86f2e5a.slice/crio-6bf309ca005fd46fdcc5850af2530624075d909489be68cf71013bbd46ffbcaf WatchSource:0}: Error finding container 6bf309ca005fd46fdcc5850af2530624075d909489be68cf71013bbd46ffbcaf: Status 404 returned error can't find the container with id 6bf309ca005fd46fdcc5850af2530624075d909489be68cf71013bbd46ffbcaf Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.979823 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"740fdd50-f1ff-4415-b473-a5e4a86f2e5a","Type":"ContainerStarted","Data":"dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337"} Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.979851 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.979887 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"740fdd50-f1ff-4415-b473-a5e4a86f2e5a","Type":"ContainerStarted","Data":"6bf309ca005fd46fdcc5850af2530624075d909489be68cf71013bbd46ffbcaf"} Nov 22 09:49:59 crc kubenswrapper[4743]: I1122 09:49:59.984879 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="974c044b-1ad2-45f9-8a2f-674a2830a762" podUID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" Nov 22 09:50:00 crc kubenswrapper[4743]: I1122 09:50:00.002478 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.002450496 podStartE2EDuration="2.002450496s" podCreationTimestamp="2025-11-22 09:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:49:59.997557145 +0000 UTC m=+5273.703918197" watchObservedRunningTime="2025-11-22 09:50:00.002450496 +0000 UTC m=+5273.708811548" Nov 22 09:50:03 crc kubenswrapper[4743]: I1122 09:50:03.558400 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:50:03 crc kubenswrapper[4743]: I1122 09:50:03.559031 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:50:03 crc kubenswrapper[4743]: I1122 09:50:03.638072 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:50:04 crc kubenswrapper[4743]: I1122 09:50:04.066482 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:50:04 crc kubenswrapper[4743]: I1122 09:50:04.116470 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5wwmp"] Nov 22 09:50:05 crc kubenswrapper[4743]: I1122 09:50:05.152130 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:50:05 crc kubenswrapper[4743]: E1122 09:50:05.152695 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.035978 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5wwmp" podUID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerName="registry-server" containerID="cri-o://2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87" gracePeriod=2 Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.522049 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.567959 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m4hg\" (UniqueName: \"kubernetes.io/projected/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-kube-api-access-9m4hg\") pod \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.568074 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-catalog-content\") pod \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.568259 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-utilities\") pod \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\" (UID: \"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76\") " Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.571283 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-utilities" (OuterVolumeSpecName: "utilities") pod "7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" (UID: "7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.590675 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-kube-api-access-9m4hg" (OuterVolumeSpecName: "kube-api-access-9m4hg") pod "7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" (UID: "7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76"). InnerVolumeSpecName "kube-api-access-9m4hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.644895 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" (UID: "7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.671484 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.671618 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m4hg\" (UniqueName: \"kubernetes.io/projected/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-kube-api-access-9m4hg\") on node \"crc\" DevicePath \"\"" Nov 22 09:50:06 crc kubenswrapper[4743]: I1122 09:50:06.671774 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.051739 4743 generic.go:334] "Generic (PLEG): container finished" podID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerID="2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87" exitCode=0 Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.051798 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wwmp" event={"ID":"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76","Type":"ContainerDied","Data":"2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87"} Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.051850 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5wwmp" event={"ID":"7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76","Type":"ContainerDied","Data":"6e543e8f93ed3acee4fac4054802fc1488dad58e706f3a1bf2815cac0d1a146d"} Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.051852 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5wwmp" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.051869 4743 scope.go:117] "RemoveContainer" containerID="2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.085762 4743 scope.go:117] "RemoveContainer" containerID="75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.099213 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5wwmp"] Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.113481 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5wwmp"] Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.131626 4743 scope.go:117] "RemoveContainer" containerID="21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.159631 4743 scope.go:117] "RemoveContainer" containerID="2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87" Nov 22 09:50:07 crc kubenswrapper[4743]: E1122 09:50:07.160091 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87\": container with ID starting with 2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87 not found: ID does not exist" containerID="2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.160146 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87"} err="failed to get container status \"2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87\": rpc error: code = NotFound desc = could not find container \"2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87\": container with ID starting with 2eee43d11f4df2af598a9a660afbeb05d4f65fd3912eecb90294173c4f3d8f87 not found: ID does not exist" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.160184 4743 scope.go:117] "RemoveContainer" containerID="75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b" Nov 22 09:50:07 crc kubenswrapper[4743]: E1122 09:50:07.160719 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b\": container with ID starting with 75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b not found: ID does not exist" containerID="75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.160757 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b"} err="failed to get container status \"75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b\": rpc error: code = NotFound desc = could not find container \"75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b\": container with ID starting with 75b457fe5112aa22f577cfc508ae0aa0b026c877fe7534954682d3f5e538375b not found: ID does not exist" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.160780 4743 scope.go:117] "RemoveContainer" 
containerID="21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b" Nov 22 09:50:07 crc kubenswrapper[4743]: E1122 09:50:07.162761 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b\": container with ID starting with 21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b not found: ID does not exist" containerID="21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.162795 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b"} err="failed to get container status \"21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b\": rpc error: code = NotFound desc = could not find container \"21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b\": container with ID starting with 21ad8b64b8c6c5563ee670b3959cd84b6aac519b855058d341616abc211d0f2b not found: ID does not exist" Nov 22 09:50:07 crc kubenswrapper[4743]: I1122 09:50:07.164987 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" path="/var/lib/kubelet/pods/7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76/volumes" Nov 22 09:50:20 crc kubenswrapper[4743]: I1122 09:50:20.152132 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:50:20 crc kubenswrapper[4743]: E1122 09:50:20.153423 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:50:33 crc kubenswrapper[4743]: I1122 09:50:33.151495 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:50:33 crc kubenswrapper[4743]: E1122 09:50:33.152361 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:50:48 crc kubenswrapper[4743]: I1122 09:50:48.151839 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:50:48 crc kubenswrapper[4743]: E1122 09:50:48.152767 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:51:00 crc kubenswrapper[4743]: I1122 09:51:00.154142 4743 scope.go:117] "RemoveContainer" 
containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:51:00 crc kubenswrapper[4743]: E1122 09:51:00.155623 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:51:14 crc kubenswrapper[4743]: I1122 09:51:14.152470 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:51:14 crc kubenswrapper[4743]: E1122 09:51:14.153727 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:51:26 crc kubenswrapper[4743]: I1122 09:51:26.151989 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:51:26 crc kubenswrapper[4743]: E1122 09:51:26.153789 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.156688 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:51:37 crc kubenswrapper[4743]: E1122 09:51:37.157445 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.310431 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-h9md7"] Nov 22 09:51:37 crc kubenswrapper[4743]: E1122 09:51:37.310918 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerName="registry-server" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.310937 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerName="registry-server" Nov 22 09:51:37 crc kubenswrapper[4743]: E1122 09:51:37.311003 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerName="extract-utilities" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.311012 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerName="extract-utilities" Nov 22 
09:51:37 crc kubenswrapper[4743]: E1122 09:51:37.311026 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerName="extract-content" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.311035 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerName="extract-content" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.311191 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9ef9d7-cc5b-4c93-a77f-954b93d0eb76" containerName="registry-server" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.311745 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h9md7" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.322984 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h9md7"] Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.375666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnj6h\" (UniqueName: \"kubernetes.io/projected/db038e67-3aec-4bc7-be68-5f8e3cea3a83-kube-api-access-wnj6h\") pod \"barbican-db-create-h9md7\" (UID: \"db038e67-3aec-4bc7-be68-5f8e3cea3a83\") " pod="openstack/barbican-db-create-h9md7" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.375785 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db038e67-3aec-4bc7-be68-5f8e3cea3a83-operator-scripts\") pod \"barbican-db-create-h9md7\" (UID: \"db038e67-3aec-4bc7-be68-5f8e3cea3a83\") " pod="openstack/barbican-db-create-h9md7" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.409975 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-738f-account-create-42z4p"] Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.411179 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-738f-account-create-42z4p" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.414523 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.419439 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-738f-account-create-42z4p"] Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.477506 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db038e67-3aec-4bc7-be68-5f8e3cea3a83-operator-scripts\") pod \"barbican-db-create-h9md7\" (UID: \"db038e67-3aec-4bc7-be68-5f8e3cea3a83\") " pod="openstack/barbican-db-create-h9md7" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.477562 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7vhz\" (UniqueName: \"kubernetes.io/projected/bee26b83-925c-4a05-9064-dda33c5dc513-kube-api-access-z7vhz\") pod \"barbican-738f-account-create-42z4p\" (UID: \"bee26b83-925c-4a05-9064-dda33c5dc513\") " pod="openstack/barbican-738f-account-create-42z4p" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.477635 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bee26b83-925c-4a05-9064-dda33c5dc513-operator-scripts\") pod \"barbican-738f-account-create-42z4p\" (UID: \"bee26b83-925c-4a05-9064-dda33c5dc513\") " pod="openstack/barbican-738f-account-create-42z4p" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.477679 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnj6h\" (UniqueName: \"kubernetes.io/projected/db038e67-3aec-4bc7-be68-5f8e3cea3a83-kube-api-access-wnj6h\") pod \"barbican-db-create-h9md7\" (UID: \"db038e67-3aec-4bc7-be68-5f8e3cea3a83\") " pod="openstack/barbican-db-create-h9md7" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.478656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db038e67-3aec-4bc7-be68-5f8e3cea3a83-operator-scripts\") pod \"barbican-db-create-h9md7\" (UID: \"db038e67-3aec-4bc7-be68-5f8e3cea3a83\") " pod="openstack/barbican-db-create-h9md7" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.499177 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnj6h\" (UniqueName: \"kubernetes.io/projected/db038e67-3aec-4bc7-be68-5f8e3cea3a83-kube-api-access-wnj6h\") pod \"barbican-db-create-h9md7\" (UID: \"db038e67-3aec-4bc7-be68-5f8e3cea3a83\") " pod="openstack/barbican-db-create-h9md7" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.579364 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7vhz\" (UniqueName: \"kubernetes.io/projected/bee26b83-925c-4a05-9064-dda33c5dc513-kube-api-access-z7vhz\") pod \"barbican-738f-account-create-42z4p\" (UID: \"bee26b83-925c-4a05-9064-dda33c5dc513\") " pod="openstack/barbican-738f-account-create-42z4p" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.579451 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bee26b83-925c-4a05-9064-dda33c5dc513-operator-scripts\") pod \"barbican-738f-account-create-42z4p\" (UID: 
\"bee26b83-925c-4a05-9064-dda33c5dc513\") " pod="openstack/barbican-738f-account-create-42z4p" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.580788 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bee26b83-925c-4a05-9064-dda33c5dc513-operator-scripts\") pod \"barbican-738f-account-create-42z4p\" (UID: \"bee26b83-925c-4a05-9064-dda33c5dc513\") " pod="openstack/barbican-738f-account-create-42z4p" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.611515 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7vhz\" (UniqueName: \"kubernetes.io/projected/bee26b83-925c-4a05-9064-dda33c5dc513-kube-api-access-z7vhz\") pod \"barbican-738f-account-create-42z4p\" (UID: \"bee26b83-925c-4a05-9064-dda33c5dc513\") " pod="openstack/barbican-738f-account-create-42z4p" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.630998 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-h9md7" Nov 22 09:51:37 crc kubenswrapper[4743]: I1122 09:51:37.725802 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-738f-account-create-42z4p" Nov 22 09:51:38 crc kubenswrapper[4743]: I1122 09:51:38.141012 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-h9md7"] Nov 22 09:51:38 crc kubenswrapper[4743]: I1122 09:51:38.198971 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-738f-account-create-42z4p"] Nov 22 09:51:38 crc kubenswrapper[4743]: I1122 09:51:38.888757 4743 generic.go:334] "Generic (PLEG): container finished" podID="bee26b83-925c-4a05-9064-dda33c5dc513" containerID="c99457232b7d3d01c35d1d2f4096f94b5647e1890b186c840eeda8276f9f7619" exitCode=0 Nov 22 09:51:38 crc kubenswrapper[4743]: I1122 09:51:38.888959 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-738f-account-create-42z4p" event={"ID":"bee26b83-925c-4a05-9064-dda33c5dc513","Type":"ContainerDied","Data":"c99457232b7d3d01c35d1d2f4096f94b5647e1890b186c840eeda8276f9f7619"} Nov 22 09:51:38 crc kubenswrapper[4743]: I1122 09:51:38.889692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-738f-account-create-42z4p" event={"ID":"bee26b83-925c-4a05-9064-dda33c5dc513","Type":"ContainerStarted","Data":"7c9231665b907f095f5b811abfd4ca2999bbde87332d0c8dd750ebd13f078c67"} Nov 22 09:51:38 crc kubenswrapper[4743]: I1122 09:51:38.891659 4743 generic.go:334] "Generic (PLEG): container finished" podID="db038e67-3aec-4bc7-be68-5f8e3cea3a83" containerID="15f06d505d36c1a5d99f67be71b358f32acd46652b4c6977d23e7a83508314df" exitCode=0 Nov 22 09:51:38 crc kubenswrapper[4743]: I1122 09:51:38.891862 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h9md7" event={"ID":"db038e67-3aec-4bc7-be68-5f8e3cea3a83","Type":"ContainerDied","Data":"15f06d505d36c1a5d99f67be71b358f32acd46652b4c6977d23e7a83508314df"} Nov 22 09:51:38 crc kubenswrapper[4743]: I1122 09:51:38.892030 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h9md7" event={"ID":"db038e67-3aec-4bc7-be68-5f8e3cea3a83","Type":"ContainerStarted","Data":"6478c950dcf8692de18e6199b89d276e3d4fdee4a9e7138e15c1ccf3172b0349"} Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.235683 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-h9md7" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.244441 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-738f-account-create-42z4p" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.327349 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnj6h\" (UniqueName: \"kubernetes.io/projected/db038e67-3aec-4bc7-be68-5f8e3cea3a83-kube-api-access-wnj6h\") pod \"db038e67-3aec-4bc7-be68-5f8e3cea3a83\" (UID: \"db038e67-3aec-4bc7-be68-5f8e3cea3a83\") " Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.327700 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bee26b83-925c-4a05-9064-dda33c5dc513-operator-scripts\") pod \"bee26b83-925c-4a05-9064-dda33c5dc513\" (UID: \"bee26b83-925c-4a05-9064-dda33c5dc513\") " Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.327761 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db038e67-3aec-4bc7-be68-5f8e3cea3a83-operator-scripts\") pod \"db038e67-3aec-4bc7-be68-5f8e3cea3a83\" (UID: \"db038e67-3aec-4bc7-be68-5f8e3cea3a83\") " Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.327913 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7vhz\" (UniqueName: \"kubernetes.io/projected/bee26b83-925c-4a05-9064-dda33c5dc513-kube-api-access-z7vhz\") pod \"bee26b83-925c-4a05-9064-dda33c5dc513\" (UID: \"bee26b83-925c-4a05-9064-dda33c5dc513\") " Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.328900 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db038e67-3aec-4bc7-be68-5f8e3cea3a83-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db038e67-3aec-4bc7-be68-5f8e3cea3a83" (UID: "db038e67-3aec-4bc7-be68-5f8e3cea3a83"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.328986 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bee26b83-925c-4a05-9064-dda33c5dc513-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bee26b83-925c-4a05-9064-dda33c5dc513" (UID: "bee26b83-925c-4a05-9064-dda33c5dc513"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.333160 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db038e67-3aec-4bc7-be68-5f8e3cea3a83-kube-api-access-wnj6h" (OuterVolumeSpecName: "kube-api-access-wnj6h") pod "db038e67-3aec-4bc7-be68-5f8e3cea3a83" (UID: "db038e67-3aec-4bc7-be68-5f8e3cea3a83"). InnerVolumeSpecName "kube-api-access-wnj6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.333263 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee26b83-925c-4a05-9064-dda33c5dc513-kube-api-access-z7vhz" (OuterVolumeSpecName: "kube-api-access-z7vhz") pod "bee26b83-925c-4a05-9064-dda33c5dc513" (UID: "bee26b83-925c-4a05-9064-dda33c5dc513"). InnerVolumeSpecName "kube-api-access-z7vhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.429598 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnj6h\" (UniqueName: \"kubernetes.io/projected/db038e67-3aec-4bc7-be68-5f8e3cea3a83-kube-api-access-wnj6h\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.429630 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bee26b83-925c-4a05-9064-dda33c5dc513-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.429642 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db038e67-3aec-4bc7-be68-5f8e3cea3a83-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.429652 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7vhz\" (UniqueName: \"kubernetes.io/projected/bee26b83-925c-4a05-9064-dda33c5dc513-kube-api-access-z7vhz\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.908808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-738f-account-create-42z4p" event={"ID":"bee26b83-925c-4a05-9064-dda33c5dc513","Type":"ContainerDied","Data":"7c9231665b907f095f5b811abfd4ca2999bbde87332d0c8dd750ebd13f078c67"} Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.908843 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c9231665b907f095f5b811abfd4ca2999bbde87332d0c8dd750ebd13f078c67" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.908957 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-738f-account-create-42z4p" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.911153 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-h9md7" event={"ID":"db038e67-3aec-4bc7-be68-5f8e3cea3a83","Type":"ContainerDied","Data":"6478c950dcf8692de18e6199b89d276e3d4fdee4a9e7138e15c1ccf3172b0349"} Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.911174 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6478c950dcf8692de18e6199b89d276e3d4fdee4a9e7138e15c1ccf3172b0349" Nov 22 09:51:40 crc kubenswrapper[4743]: I1122 09:51:40.911249 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-h9md7" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.657274 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ssk2b"] Nov 22 09:51:42 crc kubenswrapper[4743]: E1122 09:51:42.658059 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee26b83-925c-4a05-9064-dda33c5dc513" containerName="mariadb-account-create" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.658075 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee26b83-925c-4a05-9064-dda33c5dc513" containerName="mariadb-account-create" Nov 22 09:51:42 crc kubenswrapper[4743]: E1122 09:51:42.658112 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db038e67-3aec-4bc7-be68-5f8e3cea3a83" containerName="mariadb-database-create" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.658121 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="db038e67-3aec-4bc7-be68-5f8e3cea3a83" containerName="mariadb-database-create" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.658358 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="db038e67-3aec-4bc7-be68-5f8e3cea3a83" containerName="mariadb-database-create" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.658380 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee26b83-925c-4a05-9064-dda33c5dc513" containerName="mariadb-account-create" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.659126 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.661693 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.662039 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9vrx2" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.666469 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ssk2b"] Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.768157 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwg6c\" (UniqueName: \"kubernetes.io/projected/452d5af5-a474-4459-b454-a1600d09fba8-kube-api-access-mwg6c\") pod \"barbican-db-sync-ssk2b\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.768248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-combined-ca-bundle\") pod \"barbican-db-sync-ssk2b\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.768273 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-db-sync-config-data\") pod \"barbican-db-sync-ssk2b\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.870374 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-combined-ca-bundle\") pod \"barbican-db-sync-ssk2b\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.870420 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-db-sync-config-data\") pod \"barbican-db-sync-ssk2b\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.870546 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwg6c\" (UniqueName: \"kubernetes.io/projected/452d5af5-a474-4459-b454-a1600d09fba8-kube-api-access-mwg6c\") pod \"barbican-db-sync-ssk2b\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.885921 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-db-sync-config-data\") pod \"barbican-db-sync-ssk2b\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.886179 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-combined-ca-bundle\") pod \"barbican-db-sync-ssk2b\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.886296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwg6c\" (UniqueName: \"kubernetes.io/projected/452d5af5-a474-4459-b454-a1600d09fba8-kube-api-access-mwg6c\") pod \"barbican-db-sync-ssk2b\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:42 crc kubenswrapper[4743]: I1122 09:51:42.988268 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:43 crc kubenswrapper[4743]: I1122 09:51:43.473729 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ssk2b"] Nov 22 09:51:43 crc kubenswrapper[4743]: I1122 09:51:43.937004 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ssk2b" event={"ID":"452d5af5-a474-4459-b454-a1600d09fba8","Type":"ContainerStarted","Data":"1957840df6055a399eb2bf8a4dc0e2351b7963713df5df6abeb1cbb28ca04eea"} Nov 22 09:51:43 crc kubenswrapper[4743]: I1122 09:51:43.937265 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ssk2b" event={"ID":"452d5af5-a474-4459-b454-a1600d09fba8","Type":"ContainerStarted","Data":"bca7bc4a258db85384a6317207f8a8dc4a9eb4b1a83d1a948e1bb9aa30f14937"} Nov 22 09:51:43 crc kubenswrapper[4743]: I1122 09:51:43.958364 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ssk2b" podStartSLOduration=1.958345895 podStartE2EDuration="1.958345895s" podCreationTimestamp="2025-11-22 09:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:51:43.954034631 +0000 UTC m=+5377.660395693" watchObservedRunningTime="2025-11-22 09:51:43.958345895 +0000 UTC m=+5377.664706947" Nov 22 09:51:44 crc kubenswrapper[4743]: I1122 09:51:44.947058 4743 generic.go:334] "Generic (PLEG): container finished" podID="452d5af5-a474-4459-b454-a1600d09fba8" containerID="1957840df6055a399eb2bf8a4dc0e2351b7963713df5df6abeb1cbb28ca04eea" exitCode=0 Nov 22 09:51:44 crc kubenswrapper[4743]: I1122 09:51:44.947113 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ssk2b" event={"ID":"452d5af5-a474-4459-b454-a1600d09fba8","Type":"ContainerDied","Data":"1957840df6055a399eb2bf8a4dc0e2351b7963713df5df6abeb1cbb28ca04eea"}
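The pod_startup_latency_tracker records are the most structured data in this log: a single "Observed pod startup duration" record carries the SLO and end-to-end durations, the creation timestamp, the image-pull window, and the observed running time as key=value fields. The zeroed pull timestamps for barbican-db-sync-ssk2b above ("0001-01-01 00:00:00 +0000 UTC") simply mean no image pull was needed. A minimal sketch, same kubelet.log assumption, for pulling those fields out:

```python
import re

# Parse the key=value fields of every "Observed pod startup duration"
# record. Values are either bare tokens (podStartSLOduration=1.958345895)
# or quoted strings that may contain spaces (the timestamps).
HEADER = re.compile(r'"Observed pod startup duration" ')
FIELD = re.compile(r'(\w+)=("[^"]*"|\S+)')
BOUNDARY = re.compile(r"(?=Nov 22 \d{2}:\d{2}:\d{2} crc )")

with open("kubelet.log") as f:  # assumed path of the saved excerpt
    for rec in BOUNDARY.split(f.read()):
        m = HEADER.search(rec)
        if not m:
            continue
        fields = {k: v.strip('"') for k, v in FIELD.findall(rec[m.end():])}
        print(fields["pod"], "SLO:", fields["podStartSLOduration"],
              "e2e:", fields["podStartE2EDuration"])
```

Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.263965 4743 util.go:48] "No ready sandbox for pod can be found.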
Need to start a new one" pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.327613 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-db-sync-config-data\") pod \"452d5af5-a474-4459-b454-a1600d09fba8\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.327858 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-combined-ca-bundle\") pod \"452d5af5-a474-4459-b454-a1600d09fba8\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.327911 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwg6c\" (UniqueName: \"kubernetes.io/projected/452d5af5-a474-4459-b454-a1600d09fba8-kube-api-access-mwg6c\") pod \"452d5af5-a474-4459-b454-a1600d09fba8\" (UID: \"452d5af5-a474-4459-b454-a1600d09fba8\") " Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.333187 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "452d5af5-a474-4459-b454-a1600d09fba8" (UID: "452d5af5-a474-4459-b454-a1600d09fba8"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.334701 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452d5af5-a474-4459-b454-a1600d09fba8-kube-api-access-mwg6c" (OuterVolumeSpecName: "kube-api-access-mwg6c") pod "452d5af5-a474-4459-b454-a1600d09fba8" (UID: "452d5af5-a474-4459-b454-a1600d09fba8"). InnerVolumeSpecName "kube-api-access-mwg6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.350185 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "452d5af5-a474-4459-b454-a1600d09fba8" (UID: "452d5af5-a474-4459-b454-a1600d09fba8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.430238 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.430269 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwg6c\" (UniqueName: \"kubernetes.io/projected/452d5af5-a474-4459-b454-a1600d09fba8-kube-api-access-mwg6c\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.430278 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/452d5af5-a474-4459-b454-a1600d09fba8-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.964305 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ssk2b" event={"ID":"452d5af5-a474-4459-b454-a1600d09fba8","Type":"ContainerDied","Data":"bca7bc4a258db85384a6317207f8a8dc4a9eb4b1a83d1a948e1bb9aa30f14937"} Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.964719 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bca7bc4a258db85384a6317207f8a8dc4a9eb4b1a83d1a948e1bb9aa30f14937" Nov 22 09:51:46 crc kubenswrapper[4743]: I1122 09:51:46.964343 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ssk2b" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.251008 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-75df9d877b-gr58l"] Nov 22 09:51:47 crc kubenswrapper[4743]: E1122 09:51:47.252001 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452d5af5-a474-4459-b454-a1600d09fba8" containerName="barbican-db-sync" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.252028 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="452d5af5-a474-4459-b454-a1600d09fba8" containerName="barbican-db-sync" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.252300 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="452d5af5-a474-4459-b454-a1600d09fba8" containerName="barbican-db-sync" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.253567 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.260499 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-9vrx2" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.261053 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.261214 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.277701 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56b7f9968f-tlnnl"] Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.280246 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.295027 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.304404 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56b7f9968f-tlnnl"] Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.321613 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75df9d877b-gr58l"] Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.360598 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-l2rxz"] Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.362000 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.366178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-config-data-custom\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.366227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kw9p\" (UniqueName: \"kubernetes.io/projected/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-kube-api-access-4kw9p\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.366268 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-logs\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.366359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wzvl\" (UniqueName: \"kubernetes.io/projected/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-kube-api-access-4wzvl\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.366448 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-config-data-custom\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.366468 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-combined-ca-bundle\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc 
kubenswrapper[4743]: I1122 09:51:47.366525 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-logs\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.366555 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-config-data\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.366615 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-combined-ca-bundle\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.366653 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-config-data\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.371588 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-l2rxz"] Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.380993 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6bcdbc7bc8-2tcrc"] Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.382305 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.385248 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.389315 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6bcdbc7bc8-2tcrc"] Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.468411 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-logs\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.468474 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-config-data\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.468513 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-combined-ca-bundle\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.468547 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-dns-svc\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.468595 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-config-data\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.468636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thcwv\" (UniqueName: \"kubernetes.io/projected/6e25372c-5d60-43bb-94e2-bb2dbe50da35-kube-api-access-thcwv\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.468724 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-config-data-custom\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.468900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-nb\") pod 
\"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.468972 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e25372c-5d60-43bb-94e2-bb2dbe50da35-config-data\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469003 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kw9p\" (UniqueName: \"kubernetes.io/projected/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-kube-api-access-4kw9p\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469037 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-logs\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-logs\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469064 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e25372c-5d60-43bb-94e2-bb2dbe50da35-combined-ca-bundle\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469109 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e25372c-5d60-43bb-94e2-bb2dbe50da35-config-data-custom\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wzvl\" (UniqueName: \"kubernetes.io/projected/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-kube-api-access-4wzvl\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469284 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-config-data-custom\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469302 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-combined-ca-bundle\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469321 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hjrd\" (UniqueName: \"kubernetes.io/projected/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-kube-api-access-4hjrd\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-config\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469393 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-sb\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469503 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e25372c-5d60-43bb-94e2-bb2dbe50da35-logs\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.469657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-logs\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.474112 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-config-data-custom\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.474218 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-config-data-custom\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.474751 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-config-data\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.474835 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-combined-ca-bundle\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.480219 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-combined-ca-bundle\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.481942 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-config-data\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.486555 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kw9p\" (UniqueName: \"kubernetes.io/projected/a3b90d81-ea60-48b8-911b-ba9cfefd71e8-kube-api-access-4kw9p\") pod \"barbican-worker-56b7f9968f-tlnnl\" (UID: \"a3b90d81-ea60-48b8-911b-ba9cfefd71e8\") " pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.487926 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wzvl\" (UniqueName: \"kubernetes.io/projected/e75ef71d-a2f3-4bf0-9b91-9116d4ebedce-kube-api-access-4wzvl\") pod \"barbican-keystone-listener-75df9d877b-gr58l\" (UID: \"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce\") " pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.570825 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-sb\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.570884 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-config\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.570911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e25372c-5d60-43bb-94e2-bb2dbe50da35-logs\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.570977 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-dns-svc\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.571018 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-thcwv\" (UniqueName: \"kubernetes.io/projected/6e25372c-5d60-43bb-94e2-bb2dbe50da35-kube-api-access-thcwv\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.571045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-nb\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.571065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e25372c-5d60-43bb-94e2-bb2dbe50da35-config-data\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.571096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e25372c-5d60-43bb-94e2-bb2dbe50da35-combined-ca-bundle\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.571131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e25372c-5d60-43bb-94e2-bb2dbe50da35-config-data-custom\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.571199 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hjrd\" (UniqueName: \"kubernetes.io/projected/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-kube-api-access-4hjrd\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.571887 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e25372c-5d60-43bb-94e2-bb2dbe50da35-logs\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.572201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-config\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.572551 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-dns-svc\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.575293 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-sb\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.575995 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-nb\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.584981 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e25372c-5d60-43bb-94e2-bb2dbe50da35-config-data\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.585493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e25372c-5d60-43bb-94e2-bb2dbe50da35-config-data-custom\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.585773 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e25372c-5d60-43bb-94e2-bb2dbe50da35-combined-ca-bundle\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.600102 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hjrd\" (UniqueName: \"kubernetes.io/projected/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-kube-api-access-4hjrd\") pod \"dnsmasq-dns-596df78cd9-l2rxz\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.611164 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.617515 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thcwv\" (UniqueName: \"kubernetes.io/projected/6e25372c-5d60-43bb-94e2-bb2dbe50da35-kube-api-access-thcwv\") pod \"barbican-api-6bcdbc7bc8-2tcrc\" (UID: \"6e25372c-5d60-43bb-94e2-bb2dbe50da35\") " pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.620323 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56b7f9968f-tlnnl" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.691083 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:47 crc kubenswrapper[4743]: I1122 09:51:47.704293 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.121953 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56b7f9968f-tlnnl"] Nov 22 09:51:48 crc kubenswrapper[4743]: W1122 09:51:48.124024 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3b90d81_ea60_48b8_911b_ba9cfefd71e8.slice/crio-e1efca91f35433d627a834a357f5c9f6bfff8c0e621af870f0e9acafae8d0458 WatchSource:0}: Error finding container e1efca91f35433d627a834a357f5c9f6bfff8c0e621af870f0e9acafae8d0458: Status 404 returned error can't find the container with id e1efca91f35433d627a834a357f5c9f6bfff8c0e621af870f0e9acafae8d0458 Nov 22 09:51:48 crc kubenswrapper[4743]: W1122 09:51:48.132354 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode75ef71d_a2f3_4bf0_9b91_9116d4ebedce.slice/crio-1ca281903c1f82f289f755e1a814106e62657991818d4163cdf6088f4d6f1b31 WatchSource:0}: Error finding container 1ca281903c1f82f289f755e1a814106e62657991818d4163cdf6088f4d6f1b31: Status 404 returned error can't find the container with id 1ca281903c1f82f289f755e1a814106e62657991818d4163cdf6088f4d6f1b31 Nov 22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.138712 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-75df9d877b-gr58l"] Nov 22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.283162 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6bcdbc7bc8-2tcrc"] Nov 22 09:51:48 crc kubenswrapper[4743]: W1122 09:51:48.289487 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e25372c_5d60_43bb_94e2_bb2dbe50da35.slice/crio-e3ec810a9552ca2a7096b7f687deb0b5158534fcfbf8f46266cf29117f04803b WatchSource:0}: Error finding container e3ec810a9552ca2a7096b7f687deb0b5158534fcfbf8f46266cf29117f04803b: Status 404 returned error can't find the container with id e3ec810a9552ca2a7096b7f687deb0b5158534fcfbf8f46266cf29117f04803b Nov 22 09:51:48 crc kubenswrapper[4743]: W1122 09:51:48.291780 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6cebe63_b5cf_4151_a6ab_7612a9de8fb0.slice/crio-9b8cf03901424a547c8613ede482024a770943e5f030b1c79679fb73679fcdd6 WatchSource:0}: Error finding container 9b8cf03901424a547c8613ede482024a770943e5f030b1c79679fb73679fcdd6: Status 404 returned error can't find the container with id 9b8cf03901424a547c8613ede482024a770943e5f030b1c79679fb73679fcdd6 Nov 22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.292537 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-l2rxz"] Nov 22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.992594 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56b7f9968f-tlnnl" event={"ID":"a3b90d81-ea60-48b8-911b-ba9cfefd71e8","Type":"ContainerStarted","Data":"4901f11a398ad5e14e7ce3e92265536c5fb93a5350c172cef51968a90efea6cf"} Nov 22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.993906 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56b7f9968f-tlnnl" event={"ID":"a3b90d81-ea60-48b8-911b-ba9cfefd71e8","Type":"ContainerStarted","Data":"ed918bb8e710f35185b3d08dfdc30a825abc0cbab1b24b22f3ac85338dbf7c42"} Nov 
22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.993924 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56b7f9968f-tlnnl" event={"ID":"a3b90d81-ea60-48b8-911b-ba9cfefd71e8","Type":"ContainerStarted","Data":"e1efca91f35433d627a834a357f5c9f6bfff8c0e621af870f0e9acafae8d0458"} Nov 22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.997524 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" event={"ID":"6e25372c-5d60-43bb-94e2-bb2dbe50da35","Type":"ContainerStarted","Data":"52197f3ca8f56c75b5a78cc8738a0f621ef5ba27a98da136cfebeff2764598fd"} Nov 22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.997564 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" event={"ID":"6e25372c-5d60-43bb-94e2-bb2dbe50da35","Type":"ContainerStarted","Data":"b82d7fa40deceda0e414acf5adb17c09da6189847a2ec0fb31010bd429b78b16"} Nov 22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.997593 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" event={"ID":"6e25372c-5d60-43bb-94e2-bb2dbe50da35","Type":"ContainerStarted","Data":"e3ec810a9552ca2a7096b7f687deb0b5158534fcfbf8f46266cf29117f04803b"} Nov 22 09:51:48 crc kubenswrapper[4743]: I1122 09:51:48.997795 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:49 crc kubenswrapper[4743]: I1122 09:51:48.999981 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" event={"ID":"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce","Type":"ContainerStarted","Data":"ab830dac18adb15554f7f21649554515463050fd7a877a0eeebe79b78a1b246d"} Nov 22 09:51:49 crc kubenswrapper[4743]: I1122 09:51:49.000345 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" event={"ID":"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce","Type":"ContainerStarted","Data":"47b462538ac52e44eb21ea57cc95db7ce2605fc31367a9628fd9e97e8cb6638b"} Nov 22 09:51:49 crc kubenswrapper[4743]: I1122 09:51:49.000360 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" event={"ID":"e75ef71d-a2f3-4bf0-9b91-9116d4ebedce","Type":"ContainerStarted","Data":"1ca281903c1f82f289f755e1a814106e62657991818d4163cdf6088f4d6f1b31"} Nov 22 09:51:49 crc kubenswrapper[4743]: I1122 09:51:49.001272 4743 generic.go:334] "Generic (PLEG): container finished" podID="d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" containerID="ed97ce854d92582af27c1b3e2ffe2b8eb8705d0ebf12440301e2472e4387556b" exitCode=0 Nov 22 09:51:49 crc kubenswrapper[4743]: I1122 09:51:49.001321 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" event={"ID":"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0","Type":"ContainerDied","Data":"ed97ce854d92582af27c1b3e2ffe2b8eb8705d0ebf12440301e2472e4387556b"} Nov 22 09:51:49 crc kubenswrapper[4743]: I1122 09:51:49.001469 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" event={"ID":"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0","Type":"ContainerStarted","Data":"9b8cf03901424a547c8613ede482024a770943e5f030b1c79679fb73679fcdd6"} Nov 22 09:51:49 crc kubenswrapper[4743]: I1122 09:51:49.013023 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56b7f9968f-tlnnl" podStartSLOduration=2.013005502 
podStartE2EDuration="2.013005502s" podCreationTimestamp="2025-11-22 09:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:51:49.01154709 +0000 UTC m=+5382.717908142" watchObservedRunningTime="2025-11-22 09:51:49.013005502 +0000 UTC m=+5382.719366554" Nov 22 09:51:49 crc kubenswrapper[4743]: I1122 09:51:49.055615 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" podStartSLOduration=2.055596265 podStartE2EDuration="2.055596265s" podCreationTimestamp="2025-11-22 09:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:51:49.050978963 +0000 UTC m=+5382.757340015" watchObservedRunningTime="2025-11-22 09:51:49.055596265 +0000 UTC m=+5382.761957317" Nov 22 09:51:49 crc kubenswrapper[4743]: I1122 09:51:49.077794 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-75df9d877b-gr58l" podStartSLOduration=2.077771912 podStartE2EDuration="2.077771912s" podCreationTimestamp="2025-11-22 09:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:51:49.075162247 +0000 UTC m=+5382.781523309" watchObservedRunningTime="2025-11-22 09:51:49.077771912 +0000 UTC m=+5382.784132964" Nov 22 09:51:50 crc kubenswrapper[4743]: I1122 09:51:50.013683 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" event={"ID":"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0","Type":"ContainerStarted","Data":"2eff16eae3a891890e1e241b2be6f042998ae7e50ad1dec4989ef5f774f1f169"} Nov 22 09:51:50 crc kubenswrapper[4743]: I1122 09:51:50.013992 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:50 crc kubenswrapper[4743]: I1122 09:51:50.043921 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" podStartSLOduration=3.04388231 podStartE2EDuration="3.04388231s" podCreationTimestamp="2025-11-22 09:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:51:50.037361373 +0000 UTC m=+5383.743722425" watchObservedRunningTime="2025-11-22 09:51:50.04388231 +0000 UTC m=+5383.750243372" Nov 22 09:51:51 crc kubenswrapper[4743]: I1122 09:51:51.023996 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:52 crc kubenswrapper[4743]: I1122 09:51:52.152250 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:51:52 crc kubenswrapper[4743]: E1122 09:51:52.152740 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:51:57 crc kubenswrapper[4743]: I1122 09:51:57.692528 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:51:57 crc kubenswrapper[4743]: I1122 09:51:57.780401 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58865cd75-276bp"] Nov 22 09:51:57 crc kubenswrapper[4743]: I1122 09:51:57.780684 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58865cd75-276bp" podUID="f802cf69-1653-49b2-aa50-855b1ee4847f" containerName="dnsmasq-dns" containerID="cri-o://0337b0a5fa07ca3c3f3dc694684e86ea5099757c21316f140a922cd54aac0f42" gracePeriod=10 Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.085756 4743 generic.go:334] "Generic (PLEG): container finished" podID="f802cf69-1653-49b2-aa50-855b1ee4847f" containerID="0337b0a5fa07ca3c3f3dc694684e86ea5099757c21316f140a922cd54aac0f42" exitCode=0 Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.085838 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58865cd75-276bp" event={"ID":"f802cf69-1653-49b2-aa50-855b1ee4847f","Type":"ContainerDied","Data":"0337b0a5fa07ca3c3f3dc694684e86ea5099757c21316f140a922cd54aac0f42"} Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.794759 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58865cd75-276bp" Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.873902 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-dns-svc\") pod \"f802cf69-1653-49b2-aa50-855b1ee4847f\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.874027 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-nb\") pod \"f802cf69-1653-49b2-aa50-855b1ee4847f\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.874092 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plcxn\" (UniqueName: \"kubernetes.io/projected/f802cf69-1653-49b2-aa50-855b1ee4847f-kube-api-access-plcxn\") pod \"f802cf69-1653-49b2-aa50-855b1ee4847f\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.874116 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-config\") pod \"f802cf69-1653-49b2-aa50-855b1ee4847f\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.874135 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-sb\") pod \"f802cf69-1653-49b2-aa50-855b1ee4847f\" (UID: \"f802cf69-1653-49b2-aa50-855b1ee4847f\") " Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.886695 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f802cf69-1653-49b2-aa50-855b1ee4847f-kube-api-access-plcxn" (OuterVolumeSpecName: "kube-api-access-plcxn") pod "f802cf69-1653-49b2-aa50-855b1ee4847f" (UID: "f802cf69-1653-49b2-aa50-855b1ee4847f"). InnerVolumeSpecName "kube-api-access-plcxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.917711 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f802cf69-1653-49b2-aa50-855b1ee4847f" (UID: "f802cf69-1653-49b2-aa50-855b1ee4847f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.920023 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f802cf69-1653-49b2-aa50-855b1ee4847f" (UID: "f802cf69-1653-49b2-aa50-855b1ee4847f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.921871 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f802cf69-1653-49b2-aa50-855b1ee4847f" (UID: "f802cf69-1653-49b2-aa50-855b1ee4847f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.923372 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-config" (OuterVolumeSpecName: "config") pod "f802cf69-1653-49b2-aa50-855b1ee4847f" (UID: "f802cf69-1653-49b2-aa50-855b1ee4847f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.976149 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plcxn\" (UniqueName: \"kubernetes.io/projected/f802cf69-1653-49b2-aa50-855b1ee4847f-kube-api-access-plcxn\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.976204 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.976214 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.976223 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:58 crc kubenswrapper[4743]: I1122 09:51:58.976232 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f802cf69-1653-49b2-aa50-855b1ee4847f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:51:59 crc kubenswrapper[4743]: I1122 09:51:59.098106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58865cd75-276bp" event={"ID":"f802cf69-1653-49b2-aa50-855b1ee4847f","Type":"ContainerDied","Data":"75b89c9e83766e4e625a1b36b8fe49e0e9b87225dc8ea37dd9138e1e3eb6dc86"} Nov 22 09:51:59 crc kubenswrapper[4743]: I1122 09:51:59.098167 4743 scope.go:117] "RemoveContainer" 
containerID="0337b0a5fa07ca3c3f3dc694684e86ea5099757c21316f140a922cd54aac0f42" Nov 22 09:51:59 crc kubenswrapper[4743]: I1122 09:51:59.098324 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58865cd75-276bp" Nov 22 09:51:59 crc kubenswrapper[4743]: I1122 09:51:59.105311 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:59 crc kubenswrapper[4743]: I1122 09:51:59.137926 4743 scope.go:117] "RemoveContainer" containerID="8f05eba6204c1f5981bbc4327599753b3f52a12bfa2330da8b9b581b5b919d28" Nov 22 09:51:59 crc kubenswrapper[4743]: I1122 09:51:59.146653 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6bcdbc7bc8-2tcrc" Nov 22 09:51:59 crc kubenswrapper[4743]: I1122 09:51:59.208590 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58865cd75-276bp"] Nov 22 09:51:59 crc kubenswrapper[4743]: I1122 09:51:59.238985 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58865cd75-276bp"] Nov 22 09:52:01 crc kubenswrapper[4743]: I1122 09:52:01.163716 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f802cf69-1653-49b2-aa50-855b1ee4847f" path="/var/lib/kubelet/pods/f802cf69-1653-49b2-aa50-855b1ee4847f/volumes" Nov 22 09:52:04 crc kubenswrapper[4743]: I1122 09:52:04.151913 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:52:05 crc kubenswrapper[4743]: I1122 09:52:05.149526 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"3eb3f31edfdd1f055cdfe5d03bc7d720df5251d68525b808fed42d018024ae04"} Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.218530 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-vdqlr"] Nov 22 09:52:11 crc kubenswrapper[4743]: E1122 09:52:11.219265 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f802cf69-1653-49b2-aa50-855b1ee4847f" containerName="init" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.219276 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f802cf69-1653-49b2-aa50-855b1ee4847f" containerName="init" Nov 22 09:52:11 crc kubenswrapper[4743]: E1122 09:52:11.219289 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f802cf69-1653-49b2-aa50-855b1ee4847f" containerName="dnsmasq-dns" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.219295 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f802cf69-1653-49b2-aa50-855b1ee4847f" containerName="dnsmasq-dns" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.219479 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f802cf69-1653-49b2-aa50-855b1ee4847f" containerName="dnsmasq-dns" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.220040 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-vdqlr" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.270879 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vdqlr"] Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.324436 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-e01a-account-create-rs4p5"] Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.325869 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e01a-account-create-rs4p5" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.332161 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e01a-account-create-rs4p5"] Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.333769 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.404206 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdjq8\" (UniqueName: \"kubernetes.io/projected/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-kube-api-access-kdjq8\") pod \"neutron-db-create-vdqlr\" (UID: \"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3\") " pod="openstack/neutron-db-create-vdqlr" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.404289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-operator-scripts\") pod \"neutron-db-create-vdqlr\" (UID: \"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3\") " pod="openstack/neutron-db-create-vdqlr" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.506498 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdjq8\" (UniqueName: \"kubernetes.io/projected/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-kube-api-access-kdjq8\") pod \"neutron-db-create-vdqlr\" (UID: \"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3\") " pod="openstack/neutron-db-create-vdqlr" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.506916 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5907bd9-c6b9-4676-8884-fcbc49d5986a-operator-scripts\") pod \"neutron-e01a-account-create-rs4p5\" (UID: \"c5907bd9-c6b9-4676-8884-fcbc49d5986a\") " pod="openstack/neutron-e01a-account-create-rs4p5" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.507053 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-operator-scripts\") pod \"neutron-db-create-vdqlr\" (UID: \"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3\") " pod="openstack/neutron-db-create-vdqlr" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.507189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw2cl\" (UniqueName: \"kubernetes.io/projected/c5907bd9-c6b9-4676-8884-fcbc49d5986a-kube-api-access-rw2cl\") pod \"neutron-e01a-account-create-rs4p5\" (UID: \"c5907bd9-c6b9-4676-8884-fcbc49d5986a\") " pod="openstack/neutron-e01a-account-create-rs4p5" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.507858 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-operator-scripts\") pod \"neutron-db-create-vdqlr\" (UID: \"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3\") " pod="openstack/neutron-db-create-vdqlr" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.527140 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdjq8\" (UniqueName: \"kubernetes.io/projected/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-kube-api-access-kdjq8\") pod \"neutron-db-create-vdqlr\" (UID: \"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3\") " pod="openstack/neutron-db-create-vdqlr" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.541102 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-vdqlr" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.608555 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5907bd9-c6b9-4676-8884-fcbc49d5986a-operator-scripts\") pod \"neutron-e01a-account-create-rs4p5\" (UID: \"c5907bd9-c6b9-4676-8884-fcbc49d5986a\") " pod="openstack/neutron-e01a-account-create-rs4p5" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.608675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw2cl\" (UniqueName: \"kubernetes.io/projected/c5907bd9-c6b9-4676-8884-fcbc49d5986a-kube-api-access-rw2cl\") pod \"neutron-e01a-account-create-rs4p5\" (UID: \"c5907bd9-c6b9-4676-8884-fcbc49d5986a\") " pod="openstack/neutron-e01a-account-create-rs4p5" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.609259 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5907bd9-c6b9-4676-8884-fcbc49d5986a-operator-scripts\") pod \"neutron-e01a-account-create-rs4p5\" (UID: \"c5907bd9-c6b9-4676-8884-fcbc49d5986a\") " pod="openstack/neutron-e01a-account-create-rs4p5" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.626789 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw2cl\" (UniqueName: \"kubernetes.io/projected/c5907bd9-c6b9-4676-8884-fcbc49d5986a-kube-api-access-rw2cl\") pod \"neutron-e01a-account-create-rs4p5\" (UID: \"c5907bd9-c6b9-4676-8884-fcbc49d5986a\") " pod="openstack/neutron-e01a-account-create-rs4p5" Nov 22 09:52:11 crc kubenswrapper[4743]: I1122 09:52:11.646300 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-e01a-account-create-rs4p5" Nov 22 09:52:12 crc kubenswrapper[4743]: I1122 09:52:12.045894 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-vdqlr"] Nov 22 09:52:12 crc kubenswrapper[4743]: W1122 09:52:12.052243 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c8fcd80_31e1_4905_91f1_21e3cee7cbf3.slice/crio-fcdaf65753b8f180f0c8eeb913af526f88ad519a9c703ce9c9acba71b30f2b95 WatchSource:0}: Error finding container fcdaf65753b8f180f0c8eeb913af526f88ad519a9c703ce9c9acba71b30f2b95: Status 404 returned error can't find the container with id fcdaf65753b8f180f0c8eeb913af526f88ad519a9c703ce9c9acba71b30f2b95 Nov 22 09:52:12 crc kubenswrapper[4743]: I1122 09:52:12.120747 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-e01a-account-create-rs4p5"] Nov 22 09:52:12 crc kubenswrapper[4743]: W1122 09:52:12.123913 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5907bd9_c6b9_4676_8884_fcbc49d5986a.slice/crio-65b1bab978afb31830b9353122223d21f55148376f91e42a34889bdcab7dd205 WatchSource:0}: Error finding container 65b1bab978afb31830b9353122223d21f55148376f91e42a34889bdcab7dd205: Status 404 returned error can't find the container with id 65b1bab978afb31830b9353122223d21f55148376f91e42a34889bdcab7dd205 Nov 22 09:52:12 crc kubenswrapper[4743]: I1122 09:52:12.203046 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vdqlr" event={"ID":"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3","Type":"ContainerStarted","Data":"fcdaf65753b8f180f0c8eeb913af526f88ad519a9c703ce9c9acba71b30f2b95"} Nov 22 09:52:12 crc kubenswrapper[4743]: I1122 09:52:12.204404 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e01a-account-create-rs4p5" event={"ID":"c5907bd9-c6b9-4676-8884-fcbc49d5986a","Type":"ContainerStarted","Data":"65b1bab978afb31830b9353122223d21f55148376f91e42a34889bdcab7dd205"} Nov 22 09:52:13 crc kubenswrapper[4743]: I1122 09:52:13.213067 4743 generic.go:334] "Generic (PLEG): container finished" podID="c5907bd9-c6b9-4676-8884-fcbc49d5986a" containerID="a89fdc4a1c1c277300b8d7e936b419bff40daf5844ce29720b519778468df321" exitCode=0 Nov 22 09:52:13 crc kubenswrapper[4743]: I1122 09:52:13.213177 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e01a-account-create-rs4p5" event={"ID":"c5907bd9-c6b9-4676-8884-fcbc49d5986a","Type":"ContainerDied","Data":"a89fdc4a1c1c277300b8d7e936b419bff40daf5844ce29720b519778468df321"} Nov 22 09:52:13 crc kubenswrapper[4743]: I1122 09:52:13.214965 4743 generic.go:334] "Generic (PLEG): container finished" podID="4c8fcd80-31e1-4905-91f1-21e3cee7cbf3" containerID="37dae494fde993647e438f51403a24847e5950cbbadfb5dbfed3ed6faa4c0925" exitCode=0 Nov 22 09:52:13 crc kubenswrapper[4743]: I1122 09:52:13.215059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vdqlr" event={"ID":"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3","Type":"ContainerDied","Data":"37dae494fde993647e438f51403a24847e5950cbbadfb5dbfed3ed6faa4c0925"} Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.623125 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-vdqlr" Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.629860 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e01a-account-create-rs4p5" Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.663086 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5907bd9-c6b9-4676-8884-fcbc49d5986a-operator-scripts\") pod \"c5907bd9-c6b9-4676-8884-fcbc49d5986a\" (UID: \"c5907bd9-c6b9-4676-8884-fcbc49d5986a\") " Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.663136 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw2cl\" (UniqueName: \"kubernetes.io/projected/c5907bd9-c6b9-4676-8884-fcbc49d5986a-kube-api-access-rw2cl\") pod \"c5907bd9-c6b9-4676-8884-fcbc49d5986a\" (UID: \"c5907bd9-c6b9-4676-8884-fcbc49d5986a\") " Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.663266 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-operator-scripts\") pod \"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3\" (UID: \"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3\") " Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.663293 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdjq8\" (UniqueName: \"kubernetes.io/projected/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-kube-api-access-kdjq8\") pod \"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3\" (UID: \"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3\") " Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.664351 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4c8fcd80-31e1-4905-91f1-21e3cee7cbf3" (UID: "4c8fcd80-31e1-4905-91f1-21e3cee7cbf3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.664424 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5907bd9-c6b9-4676-8884-fcbc49d5986a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5907bd9-c6b9-4676-8884-fcbc49d5986a" (UID: "c5907bd9-c6b9-4676-8884-fcbc49d5986a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.670436 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-kube-api-access-kdjq8" (OuterVolumeSpecName: "kube-api-access-kdjq8") pod "4c8fcd80-31e1-4905-91f1-21e3cee7cbf3" (UID: "4c8fcd80-31e1-4905-91f1-21e3cee7cbf3"). InnerVolumeSpecName "kube-api-access-kdjq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.670765 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5907bd9-c6b9-4676-8884-fcbc49d5986a-kube-api-access-rw2cl" (OuterVolumeSpecName: "kube-api-access-rw2cl") pod "c5907bd9-c6b9-4676-8884-fcbc49d5986a" (UID: "c5907bd9-c6b9-4676-8884-fcbc49d5986a"). InnerVolumeSpecName "kube-api-access-rw2cl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.766168 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.766498 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdjq8\" (UniqueName: \"kubernetes.io/projected/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3-kube-api-access-kdjq8\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.766514 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5907bd9-c6b9-4676-8884-fcbc49d5986a-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:14 crc kubenswrapper[4743]: I1122 09:52:14.766526 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw2cl\" (UniqueName: \"kubernetes.io/projected/c5907bd9-c6b9-4676-8884-fcbc49d5986a-kube-api-access-rw2cl\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:15 crc kubenswrapper[4743]: I1122 09:52:15.230713 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-e01a-account-create-rs4p5" event={"ID":"c5907bd9-c6b9-4676-8884-fcbc49d5986a","Type":"ContainerDied","Data":"65b1bab978afb31830b9353122223d21f55148376f91e42a34889bdcab7dd205"} Nov 22 09:52:15 crc kubenswrapper[4743]: I1122 09:52:15.230757 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65b1bab978afb31830b9353122223d21f55148376f91e42a34889bdcab7dd205" Nov 22 09:52:15 crc kubenswrapper[4743]: I1122 09:52:15.230765 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-e01a-account-create-rs4p5" Nov 22 09:52:15 crc kubenswrapper[4743]: I1122 09:52:15.232393 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-vdqlr" event={"ID":"4c8fcd80-31e1-4905-91f1-21e3cee7cbf3","Type":"ContainerDied","Data":"fcdaf65753b8f180f0c8eeb913af526f88ad519a9c703ce9c9acba71b30f2b95"} Nov 22 09:52:15 crc kubenswrapper[4743]: I1122 09:52:15.232415 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcdaf65753b8f180f0c8eeb913af526f88ad519a9c703ce9c9acba71b30f2b95" Nov 22 09:52:15 crc kubenswrapper[4743]: I1122 09:52:15.232459 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-vdqlr" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.543647 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-fmrsm"] Nov 22 09:52:16 crc kubenswrapper[4743]: E1122 09:52:16.543980 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c8fcd80-31e1-4905-91f1-21e3cee7cbf3" containerName="mariadb-database-create" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.543996 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c8fcd80-31e1-4905-91f1-21e3cee7cbf3" containerName="mariadb-database-create" Nov 22 09:52:16 crc kubenswrapper[4743]: E1122 09:52:16.544028 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5907bd9-c6b9-4676-8884-fcbc49d5986a" containerName="mariadb-account-create" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.544034 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5907bd9-c6b9-4676-8884-fcbc49d5986a" containerName="mariadb-account-create" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.544193 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5907bd9-c6b9-4676-8884-fcbc49d5986a" containerName="mariadb-account-create" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.544207 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c8fcd80-31e1-4905-91f1-21e3cee7cbf3" containerName="mariadb-database-create" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.544782 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.547120 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.547624 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.547637 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wgsqk" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.558183 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fmrsm"] Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.595288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-config\") pod \"neutron-db-sync-fmrsm\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.595475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4gp6\" (UniqueName: \"kubernetes.io/projected/aeda2833-ef57-4a51-a3fa-e0b82f667688-kube-api-access-x4gp6\") pod \"neutron-db-sync-fmrsm\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.595659 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-combined-ca-bundle\") pod \"neutron-db-sync-fmrsm\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.696797 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-combined-ca-bundle\") pod \"neutron-db-sync-fmrsm\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.696935 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-config\") pod \"neutron-db-sync-fmrsm\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.697001 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4gp6\" (UniqueName: \"kubernetes.io/projected/aeda2833-ef57-4a51-a3fa-e0b82f667688-kube-api-access-x4gp6\") pod \"neutron-db-sync-fmrsm\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.705351 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-config\") pod \"neutron-db-sync-fmrsm\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.708239 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-combined-ca-bundle\") pod \"neutron-db-sync-fmrsm\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.715988 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4gp6\" (UniqueName: \"kubernetes.io/projected/aeda2833-ef57-4a51-a3fa-e0b82f667688-kube-api-access-x4gp6\") pod \"neutron-db-sync-fmrsm\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:16 crc kubenswrapper[4743]: I1122 09:52:16.867868 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:17 crc kubenswrapper[4743]: I1122 09:52:17.328490 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-fmrsm"] Nov 22 09:52:17 crc kubenswrapper[4743]: W1122 09:52:17.341963 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeda2833_ef57_4a51_a3fa_e0b82f667688.slice/crio-10a1ebf0ab53568262245ef6331fafcb05a707ab03f4409ddeb8a2ae28d2c517 WatchSource:0}: Error finding container 10a1ebf0ab53568262245ef6331fafcb05a707ab03f4409ddeb8a2ae28d2c517: Status 404 returned error can't find the container with id 10a1ebf0ab53568262245ef6331fafcb05a707ab03f4409ddeb8a2ae28d2c517 Nov 22 09:52:18 crc kubenswrapper[4743]: I1122 09:52:18.271253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fmrsm" event={"ID":"aeda2833-ef57-4a51-a3fa-e0b82f667688","Type":"ContainerStarted","Data":"454cd8c9877a28e42f0be77333c2ef8b29bc591614ef43bcbdb41dcf78388076"} Nov 22 09:52:18 crc kubenswrapper[4743]: I1122 09:52:18.271821 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fmrsm" event={"ID":"aeda2833-ef57-4a51-a3fa-e0b82f667688","Type":"ContainerStarted","Data":"10a1ebf0ab53568262245ef6331fafcb05a707ab03f4409ddeb8a2ae28d2c517"} Nov 22 09:52:18 crc kubenswrapper[4743]: I1122 09:52:18.295766 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-fmrsm" podStartSLOduration=2.295747311 podStartE2EDuration="2.295747311s" podCreationTimestamp="2025-11-22 09:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:52:18.293326452 +0000 UTC m=+5411.999687504" watchObservedRunningTime="2025-11-22 09:52:18.295747311 +0000 UTC m=+5412.002108363" Nov 22 09:52:29 crc kubenswrapper[4743]: I1122 09:52:29.389417 4743 generic.go:334] "Generic (PLEG): container finished" podID="aeda2833-ef57-4a51-a3fa-e0b82f667688" containerID="454cd8c9877a28e42f0be77333c2ef8b29bc591614ef43bcbdb41dcf78388076" exitCode=0 Nov 22 09:52:29 crc kubenswrapper[4743]: I1122 09:52:29.389497 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fmrsm" event={"ID":"aeda2833-ef57-4a51-a3fa-e0b82f667688","Type":"ContainerDied","Data":"454cd8c9877a28e42f0be77333c2ef8b29bc591614ef43bcbdb41dcf78388076"} Nov 22 09:52:30 crc kubenswrapper[4743]: I1122 09:52:30.745419 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:30 crc kubenswrapper[4743]: I1122 09:52:30.829126 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-combined-ca-bundle\") pod \"aeda2833-ef57-4a51-a3fa-e0b82f667688\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " Nov 22 09:52:30 crc kubenswrapper[4743]: I1122 09:52:30.829316 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4gp6\" (UniqueName: \"kubernetes.io/projected/aeda2833-ef57-4a51-a3fa-e0b82f667688-kube-api-access-x4gp6\") pod \"aeda2833-ef57-4a51-a3fa-e0b82f667688\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " Nov 22 09:52:30 crc kubenswrapper[4743]: I1122 09:52:30.829428 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-config\") pod \"aeda2833-ef57-4a51-a3fa-e0b82f667688\" (UID: \"aeda2833-ef57-4a51-a3fa-e0b82f667688\") " Nov 22 09:52:30 crc kubenswrapper[4743]: I1122 09:52:30.840353 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeda2833-ef57-4a51-a3fa-e0b82f667688-kube-api-access-x4gp6" (OuterVolumeSpecName: "kube-api-access-x4gp6") pod "aeda2833-ef57-4a51-a3fa-e0b82f667688" (UID: "aeda2833-ef57-4a51-a3fa-e0b82f667688"). InnerVolumeSpecName "kube-api-access-x4gp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:52:30 crc kubenswrapper[4743]: I1122 09:52:30.859615 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aeda2833-ef57-4a51-a3fa-e0b82f667688" (UID: "aeda2833-ef57-4a51-a3fa-e0b82f667688"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:52:30 crc kubenswrapper[4743]: I1122 09:52:30.859819 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-config" (OuterVolumeSpecName: "config") pod "aeda2833-ef57-4a51-a3fa-e0b82f667688" (UID: "aeda2833-ef57-4a51-a3fa-e0b82f667688"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:52:30 crc kubenswrapper[4743]: I1122 09:52:30.931547 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:30 crc kubenswrapper[4743]: I1122 09:52:30.931599 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4gp6\" (UniqueName: \"kubernetes.io/projected/aeda2833-ef57-4a51-a3fa-e0b82f667688-kube-api-access-x4gp6\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:30 crc kubenswrapper[4743]: I1122 09:52:30.931614 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/aeda2833-ef57-4a51-a3fa-e0b82f667688-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.411296 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-fmrsm" event={"ID":"aeda2833-ef57-4a51-a3fa-e0b82f667688","Type":"ContainerDied","Data":"10a1ebf0ab53568262245ef6331fafcb05a707ab03f4409ddeb8a2ae28d2c517"} Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.411353 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10a1ebf0ab53568262245ef6331fafcb05a707ab03f4409ddeb8a2ae28d2c517" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.411415 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-fmrsm" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.689123 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-mk4s6"] Nov 22 09:52:31 crc kubenswrapper[4743]: E1122 09:52:31.689680 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeda2833-ef57-4a51-a3fa-e0b82f667688" containerName="neutron-db-sync" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.689704 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeda2833-ef57-4a51-a3fa-e0b82f667688" containerName="neutron-db-sync" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.689969 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeda2833-ef57-4a51-a3fa-e0b82f667688" containerName="neutron-db-sync" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.694533 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.707746 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-mk4s6"] Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.819461 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7647bcffd5-9jhp5"] Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.822058 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.826027 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.826245 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-wgsqk" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.826314 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.838283 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7647bcffd5-9jhp5"] Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.854717 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.854797 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-dns-svc\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.854862 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.854900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-config\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.854936 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7d6r\" (UniqueName: \"kubernetes.io/projected/c7bbf544-d418-4152-9f0f-fc8bf87be889-kube-api-access-f7d6r\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.956512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7d6r\" (UniqueName: \"kubernetes.io/projected/c7bbf544-d418-4152-9f0f-fc8bf87be889-kube-api-access-f7d6r\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.956630 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/25112c84-a50c-424f-8f5b-6b815720eaa7-httpd-config\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 
09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.956666 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25112c84-a50c-424f-8f5b-6b815720eaa7-combined-ca-bundle\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.956688 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf6mn\" (UniqueName: \"kubernetes.io/projected/25112c84-a50c-424f-8f5b-6b815720eaa7-kube-api-access-mf6mn\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.956712 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.956735 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-dns-svc\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.956930 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.957059 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-config\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.957101 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25112c84-a50c-424f-8f5b-6b815720eaa7-config\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.957674 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-dns-svc\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.958049 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-config\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.958041 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.958670 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:31 crc kubenswrapper[4743]: I1122 09:52:31.978225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7d6r\" (UniqueName: \"kubernetes.io/projected/c7bbf544-d418-4152-9f0f-fc8bf87be889-kube-api-access-f7d6r\") pod \"dnsmasq-dns-5f59d59797-mk4s6\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") " pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.033257 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.058889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25112c84-a50c-424f-8f5b-6b815720eaa7-config\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.058991 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/25112c84-a50c-424f-8f5b-6b815720eaa7-httpd-config\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.059017 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25112c84-a50c-424f-8f5b-6b815720eaa7-combined-ca-bundle\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.059043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf6mn\" (UniqueName: \"kubernetes.io/projected/25112c84-a50c-424f-8f5b-6b815720eaa7-kube-api-access-mf6mn\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.067217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/25112c84-a50c-424f-8f5b-6b815720eaa7-config\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.067829 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/25112c84-a50c-424f-8f5b-6b815720eaa7-httpd-config\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.068044 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25112c84-a50c-424f-8f5b-6b815720eaa7-combined-ca-bundle\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.080421 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf6mn\" (UniqueName: \"kubernetes.io/projected/25112c84-a50c-424f-8f5b-6b815720eaa7-kube-api-access-mf6mn\") pod \"neutron-7647bcffd5-9jhp5\" (UID: \"25112c84-a50c-424f-8f5b-6b815720eaa7\") " pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.142618 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.531257 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-mk4s6"] Nov 22 09:52:32 crc kubenswrapper[4743]: W1122 09:52:32.539950 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7bbf544_d418_4152_9f0f_fc8bf87be889.slice/crio-1af85587c225b90c9d81e63270c7a8b04b357310cd940cdada83516bb464b223 WatchSource:0}: Error finding container 1af85587c225b90c9d81e63270c7a8b04b357310cd940cdada83516bb464b223: Status 404 returned error can't find the container with id 1af85587c225b90c9d81e63270c7a8b04b357310cd940cdada83516bb464b223 Nov 22 09:52:32 crc kubenswrapper[4743]: I1122 09:52:32.729984 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7647bcffd5-9jhp5"] Nov 22 09:52:33 crc kubenswrapper[4743]: I1122 09:52:33.433010 4743 generic.go:334] "Generic (PLEG): container finished" podID="c7bbf544-d418-4152-9f0f-fc8bf87be889" containerID="b53735807008b3a6112124998265dc25935ba1c2fb538e87a81a8d62dc86e61c" exitCode=0 Nov 22 09:52:33 crc kubenswrapper[4743]: I1122 09:52:33.433221 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" event={"ID":"c7bbf544-d418-4152-9f0f-fc8bf87be889","Type":"ContainerDied","Data":"b53735807008b3a6112124998265dc25935ba1c2fb538e87a81a8d62dc86e61c"} Nov 22 09:52:33 crc kubenswrapper[4743]: I1122 09:52:33.433791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" event={"ID":"c7bbf544-d418-4152-9f0f-fc8bf87be889","Type":"ContainerStarted","Data":"1af85587c225b90c9d81e63270c7a8b04b357310cd940cdada83516bb464b223"} Nov 22 09:52:33 crc kubenswrapper[4743]: I1122 09:52:33.436538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7647bcffd5-9jhp5" event={"ID":"25112c84-a50c-424f-8f5b-6b815720eaa7","Type":"ContainerStarted","Data":"92d586f98d291b29f816995b77f6bf30665904e29e47fec23dcf9a49b4c9a26a"} Nov 22 09:52:33 crc kubenswrapper[4743]: I1122 09:52:33.436569 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7647bcffd5-9jhp5" event={"ID":"25112c84-a50c-424f-8f5b-6b815720eaa7","Type":"ContainerStarted","Data":"2597785d5ee6a682a7c435a8b71556783e9007afc0fc15f096aafcf909367640"} Nov 22 09:52:33 crc kubenswrapper[4743]: I1122 09:52:33.436598 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7647bcffd5-9jhp5" 
event={"ID":"25112c84-a50c-424f-8f5b-6b815720eaa7","Type":"ContainerStarted","Data":"065391ac949c87ffcf63a1566c7babd2d9bb5a9f593b43296ae1a307c412917e"} Nov 22 09:52:33 crc kubenswrapper[4743]: I1122 09:52:33.437276 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7647bcffd5-9jhp5" Nov 22 09:52:33 crc kubenswrapper[4743]: I1122 09:52:33.488570 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7647bcffd5-9jhp5" podStartSLOduration=2.488544438 podStartE2EDuration="2.488544438s" podCreationTimestamp="2025-11-22 09:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:52:33.487685624 +0000 UTC m=+5427.194046676" watchObservedRunningTime="2025-11-22 09:52:33.488544438 +0000 UTC m=+5427.194905490" Nov 22 09:52:34 crc kubenswrapper[4743]: I1122 09:52:34.455546 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" event={"ID":"c7bbf544-d418-4152-9f0f-fc8bf87be889","Type":"ContainerStarted","Data":"cd59b3919b6d1cb11ac0a360e03e5852ae743a0563a0d47e5cd47d7f08a80ba5"} Nov 22 09:52:34 crc kubenswrapper[4743]: I1122 09:52:34.456170 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:34 crc kubenswrapper[4743]: I1122 09:52:34.483213 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" podStartSLOduration=3.483186456 podStartE2EDuration="3.483186456s" podCreationTimestamp="2025-11-22 09:52:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:52:34.479261793 +0000 UTC m=+5428.185622855" watchObservedRunningTime="2025-11-22 09:52:34.483186456 +0000 UTC m=+5428.189547518" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.609319 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kk2kr"] Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.613503 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.628100 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kk2kr"] Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.669945 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwjb9\" (UniqueName: \"kubernetes.io/projected/8c8b40a0-cb8b-47e4-a88d-b6771514b875-kube-api-access-pwjb9\") pod \"certified-operators-kk2kr\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.670050 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-utilities\") pod \"certified-operators-kk2kr\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.670298 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-catalog-content\") pod \"certified-operators-kk2kr\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.772784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-catalog-content\") pod \"certified-operators-kk2kr\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.772924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwjb9\" (UniqueName: \"kubernetes.io/projected/8c8b40a0-cb8b-47e4-a88d-b6771514b875-kube-api-access-pwjb9\") pod \"certified-operators-kk2kr\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.772990 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-utilities\") pod \"certified-operators-kk2kr\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.773529 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-catalog-content\") pod \"certified-operators-kk2kr\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.773537 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-utilities\") pod \"certified-operators-kk2kr\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.798775 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-pwjb9\" (UniqueName: \"kubernetes.io/projected/8c8b40a0-cb8b-47e4-a88d-b6771514b875-kube-api-access-pwjb9\") pod \"certified-operators-kk2kr\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:40 crc kubenswrapper[4743]: I1122 09:52:40.967227 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:41 crc kubenswrapper[4743]: I1122 09:52:41.457645 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kk2kr"] Nov 22 09:52:41 crc kubenswrapper[4743]: I1122 09:52:41.514096 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kk2kr" event={"ID":"8c8b40a0-cb8b-47e4-a88d-b6771514b875","Type":"ContainerStarted","Data":"0b47cdb0d0d47fc04ddbca3e1ca76a00c85e99e4f27b5da898f5802e3a2759dc"} Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.035659 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.100418 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-l2rxz"] Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.101276 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" podUID="d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" containerName="dnsmasq-dns" containerID="cri-o://2eff16eae3a891890e1e241b2be6f042998ae7e50ad1dec4989ef5f774f1f169" gracePeriod=10 Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.531006 4743 generic.go:334] "Generic (PLEG): container finished" podID="d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" containerID="2eff16eae3a891890e1e241b2be6f042998ae7e50ad1dec4989ef5f774f1f169" exitCode=0 Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.531136 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" event={"ID":"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0","Type":"ContainerDied","Data":"2eff16eae3a891890e1e241b2be6f042998ae7e50ad1dec4989ef5f774f1f169"} Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.535868 4743 generic.go:334] "Generic (PLEG): container finished" podID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" containerID="e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9" exitCode=0 Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.535929 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kk2kr" event={"ID":"8c8b40a0-cb8b-47e4-a88d-b6771514b875","Type":"ContainerDied","Data":"e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9"} Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.634866 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.714585 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-nb\") pod \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.714673 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-dns-svc\") pod \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.715993 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-config\") pod \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.716034 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-sb\") pod \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.716213 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hjrd\" (UniqueName: \"kubernetes.io/projected/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-kube-api-access-4hjrd\") pod \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\" (UID: \"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0\") " Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.730795 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-kube-api-access-4hjrd" (OuterVolumeSpecName: "kube-api-access-4hjrd") pod "d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" (UID: "d6cebe63-b5cf-4151-a6ab-7612a9de8fb0"). InnerVolumeSpecName "kube-api-access-4hjrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.774521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" (UID: "d6cebe63-b5cf-4151-a6ab-7612a9de8fb0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.787133 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-config" (OuterVolumeSpecName: "config") pod "d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" (UID: "d6cebe63-b5cf-4151-a6ab-7612a9de8fb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.795058 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" (UID: "d6cebe63-b5cf-4151-a6ab-7612a9de8fb0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.800523 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" (UID: "d6cebe63-b5cf-4151-a6ab-7612a9de8fb0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.818375 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.818408 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.818419 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.818428 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:42 crc kubenswrapper[4743]: I1122 09:52:42.818438 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hjrd\" (UniqueName: \"kubernetes.io/projected/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0-kube-api-access-4hjrd\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:43 crc kubenswrapper[4743]: I1122 09:52:43.548760 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kk2kr" event={"ID":"8c8b40a0-cb8b-47e4-a88d-b6771514b875","Type":"ContainerStarted","Data":"ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13"} Nov 22 09:52:43 crc kubenswrapper[4743]: I1122 09:52:43.553810 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" event={"ID":"d6cebe63-b5cf-4151-a6ab-7612a9de8fb0","Type":"ContainerDied","Data":"9b8cf03901424a547c8613ede482024a770943e5f030b1c79679fb73679fcdd6"} Nov 22 09:52:43 crc kubenswrapper[4743]: I1122 09:52:43.553866 4743 scope.go:117] "RemoveContainer" containerID="2eff16eae3a891890e1e241b2be6f042998ae7e50ad1dec4989ef5f774f1f169" Nov 22 09:52:43 crc kubenswrapper[4743]: I1122 09:52:43.554003 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-596df78cd9-l2rxz" Nov 22 09:52:43 crc kubenswrapper[4743]: I1122 09:52:43.572819 4743 scope.go:117] "RemoveContainer" containerID="ed97ce854d92582af27c1b3e2ffe2b8eb8705d0ebf12440301e2472e4387556b" Nov 22 09:52:43 crc kubenswrapper[4743]: I1122 09:52:43.599208 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-l2rxz"] Nov 22 09:52:43 crc kubenswrapper[4743]: I1122 09:52:43.604232 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-596df78cd9-l2rxz"] Nov 22 09:52:44 crc kubenswrapper[4743]: I1122 09:52:44.563758 4743 generic.go:334] "Generic (PLEG): container finished" podID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" containerID="ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13" exitCode=0 Nov 22 09:52:44 crc kubenswrapper[4743]: I1122 09:52:44.563839 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kk2kr" event={"ID":"8c8b40a0-cb8b-47e4-a88d-b6771514b875","Type":"ContainerDied","Data":"ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13"} Nov 22 09:52:45 crc kubenswrapper[4743]: I1122 09:52:45.169177 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" path="/var/lib/kubelet/pods/d6cebe63-b5cf-4151-a6ab-7612a9de8fb0/volumes" Nov 22 09:52:45 crc kubenswrapper[4743]: I1122 09:52:45.577562 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kk2kr" event={"ID":"8c8b40a0-cb8b-47e4-a88d-b6771514b875","Type":"ContainerStarted","Data":"51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d"} Nov 22 09:52:45 crc kubenswrapper[4743]: I1122 09:52:45.614044 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kk2kr" podStartSLOduration=3.196284805 podStartE2EDuration="5.614001498s" podCreationTimestamp="2025-11-22 09:52:40 +0000 UTC" firstStartedPulling="2025-11-22 09:52:42.539663349 +0000 UTC m=+5436.246024401" lastFinishedPulling="2025-11-22 09:52:44.957380042 +0000 UTC m=+5438.663741094" observedRunningTime="2025-11-22 09:52:45.603318921 +0000 UTC m=+5439.309680003" watchObservedRunningTime="2025-11-22 09:52:45.614001498 +0000 UTC m=+5439.320362580" Nov 22 09:52:50 crc kubenswrapper[4743]: I1122 09:52:50.968730 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:50 crc kubenswrapper[4743]: I1122 09:52:50.969683 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:51 crc kubenswrapper[4743]: I1122 09:52:51.041317 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:51 crc kubenswrapper[4743]: I1122 09:52:51.677216 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:51 crc kubenswrapper[4743]: I1122 09:52:51.722876 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kk2kr"] Nov 22 09:52:53 crc kubenswrapper[4743]: I1122 09:52:53.646585 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kk2kr" podUID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" 
containerName="registry-server" containerID="cri-o://51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d" gracePeriod=2 Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.101669 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.218969 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-utilities\") pod \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.219010 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-catalog-content\") pod \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.219187 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwjb9\" (UniqueName: \"kubernetes.io/projected/8c8b40a0-cb8b-47e4-a88d-b6771514b875-kube-api-access-pwjb9\") pod \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\" (UID: \"8c8b40a0-cb8b-47e4-a88d-b6771514b875\") " Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.219944 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-utilities" (OuterVolumeSpecName: "utilities") pod "8c8b40a0-cb8b-47e4-a88d-b6771514b875" (UID: "8c8b40a0-cb8b-47e4-a88d-b6771514b875"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.231738 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8b40a0-cb8b-47e4-a88d-b6771514b875-kube-api-access-pwjb9" (OuterVolumeSpecName: "kube-api-access-pwjb9") pod "8c8b40a0-cb8b-47e4-a88d-b6771514b875" (UID: "8c8b40a0-cb8b-47e4-a88d-b6771514b875"). InnerVolumeSpecName "kube-api-access-pwjb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.321532 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwjb9\" (UniqueName: \"kubernetes.io/projected/8c8b40a0-cb8b-47e4-a88d-b6771514b875-kube-api-access-pwjb9\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.321578 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.658448 4743 generic.go:334] "Generic (PLEG): container finished" podID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" containerID="51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d" exitCode=0 Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.658501 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kk2kr" event={"ID":"8c8b40a0-cb8b-47e4-a88d-b6771514b875","Type":"ContainerDied","Data":"51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d"} Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.658808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kk2kr" event={"ID":"8c8b40a0-cb8b-47e4-a88d-b6771514b875","Type":"ContainerDied","Data":"0b47cdb0d0d47fc04ddbca3e1ca76a00c85e99e4f27b5da898f5802e3a2759dc"} Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.658829 4743 scope.go:117] "RemoveContainer" containerID="51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.658508 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kk2kr" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.677221 4743 scope.go:117] "RemoveContainer" containerID="ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.728132 4743 scope.go:117] "RemoveContainer" containerID="e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.765947 4743 scope.go:117] "RemoveContainer" containerID="51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d" Nov 22 09:52:54 crc kubenswrapper[4743]: E1122 09:52:54.766448 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d\": container with ID starting with 51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d not found: ID does not exist" containerID="51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.766488 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d"} err="failed to get container status \"51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d\": rpc error: code = NotFound desc = could not find container \"51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d\": container with ID starting with 51f4dd2e3bb6a14aed519dfce9420ec25097b31a789e5d3c158354746e0dc02d not found: ID does not exist" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.766512 4743 scope.go:117] "RemoveContainer" containerID="ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13" Nov 22 09:52:54 crc kubenswrapper[4743]: E1122 09:52:54.766817 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13\": container with ID starting with ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13 not found: ID does not exist" containerID="ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.766843 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13"} err="failed to get container status \"ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13\": rpc error: code = NotFound desc = could not find container \"ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13\": container with ID starting with ad0f83a312ee119d982d7a8e5b80a1f418a1233865f25e8a169e6be061afbc13 not found: ID does not exist" Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.766864 4743 scope.go:117] "RemoveContainer" containerID="e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9" Nov 22 09:52:54 crc kubenswrapper[4743]: E1122 09:52:54.767189 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9\": container with ID starting with e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9 not found: ID does not exist" containerID="e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9" 
Nov 22 09:52:54 crc kubenswrapper[4743]: I1122 09:52:54.767215 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9"} err="failed to get container status \"e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9\": rpc error: code = NotFound desc = could not find container \"e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9\": container with ID starting with e0a4bd8d9ee960f5a5164b8baad1765ebfaf67b99b4b00f2f7d591a951d449b9 not found: ID does not exist"
Nov 22 09:52:55 crc kubenswrapper[4743]: I1122 09:52:55.369354 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c8b40a0-cb8b-47e4-a88d-b6771514b875" (UID: "8c8b40a0-cb8b-47e4-a88d-b6771514b875"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:52:55 crc kubenswrapper[4743]: I1122 09:52:55.437906 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c8b40a0-cb8b-47e4-a88d-b6771514b875-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 09:52:55 crc kubenswrapper[4743]: I1122 09:52:55.597068 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kk2kr"]
Nov 22 09:52:55 crc kubenswrapper[4743]: I1122 09:52:55.604636 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kk2kr"]
Nov 22 09:52:57 crc kubenswrapper[4743]: I1122 09:52:57.168698 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" path="/var/lib/kubelet/pods/8c8b40a0-cb8b-47e4-a88d-b6771514b875/volumes"
Nov 22 09:53:02 crc kubenswrapper[4743]: I1122 09:53:02.152065 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7647bcffd5-9jhp5"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.149783 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-rvqd9"]
Nov 22 09:53:09 crc kubenswrapper[4743]: E1122 09:53:09.150461 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" containerName="extract-utilities"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.150473 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" containerName="extract-utilities"
Nov 22 09:53:09 crc kubenswrapper[4743]: E1122 09:53:09.150499 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" containerName="registry-server"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.150505 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" containerName="registry-server"
Nov 22 09:53:09 crc kubenswrapper[4743]: E1122 09:53:09.150513 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" containerName="dnsmasq-dns"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.150531 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" containerName="dnsmasq-dns"
Nov 22 09:53:09 crc kubenswrapper[4743]: E1122 09:53:09.150541 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" containerName="extract-content"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.150547 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" containerName="extract-content"
Nov 22 09:53:09 crc kubenswrapper[4743]: E1122 09:53:09.150555 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" containerName="init"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.150561 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" containerName="init"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.150733 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6cebe63-b5cf-4151-a6ab-7612a9de8fb0" containerName="dnsmasq-dns"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.150753 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8b40a0-cb8b-47e4-a88d-b6771514b875" containerName="registry-server"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.151246 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rvqd9"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.173878 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rvqd9"]
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.249819 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bc66-account-create-7ph85"]
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.250845 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bc66-account-create-7ph85"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.253243 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.262538 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bc66-account-create-7ph85"]
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.288825 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3684de8-7edc-43f3-92f1-2012be187d12-operator-scripts\") pod \"glance-db-create-rvqd9\" (UID: \"f3684de8-7edc-43f3-92f1-2012be187d12\") " pod="openstack/glance-db-create-rvqd9"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.288944 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d97v8\" (UniqueName: \"kubernetes.io/projected/f3684de8-7edc-43f3-92f1-2012be187d12-kube-api-access-d97v8\") pod \"glance-db-create-rvqd9\" (UID: \"f3684de8-7edc-43f3-92f1-2012be187d12\") " pod="openstack/glance-db-create-rvqd9"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.391442 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3684de8-7edc-43f3-92f1-2012be187d12-operator-scripts\") pod \"glance-db-create-rvqd9\" (UID: \"f3684de8-7edc-43f3-92f1-2012be187d12\") " pod="openstack/glance-db-create-rvqd9"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.391528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc98g\" (UniqueName: \"kubernetes.io/projected/67515e96-6010-4d7b-9d07-82a661b67ef0-kube-api-access-jc98g\") pod \"glance-bc66-account-create-7ph85\" (UID: \"67515e96-6010-4d7b-9d07-82a661b67ef0\") " pod="openstack/glance-bc66-account-create-7ph85"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.391630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d97v8\" (UniqueName: \"kubernetes.io/projected/f3684de8-7edc-43f3-92f1-2012be187d12-kube-api-access-d97v8\") pod \"glance-db-create-rvqd9\" (UID: \"f3684de8-7edc-43f3-92f1-2012be187d12\") " pod="openstack/glance-db-create-rvqd9"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.391831 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67515e96-6010-4d7b-9d07-82a661b67ef0-operator-scripts\") pod \"glance-bc66-account-create-7ph85\" (UID: \"67515e96-6010-4d7b-9d07-82a661b67ef0\") " pod="openstack/glance-bc66-account-create-7ph85"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.392215 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3684de8-7edc-43f3-92f1-2012be187d12-operator-scripts\") pod \"glance-db-create-rvqd9\" (UID: \"f3684de8-7edc-43f3-92f1-2012be187d12\") " pod="openstack/glance-db-create-rvqd9"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.412732 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d97v8\" (UniqueName: \"kubernetes.io/projected/f3684de8-7edc-43f3-92f1-2012be187d12-kube-api-access-d97v8\") pod \"glance-db-create-rvqd9\" (UID: \"f3684de8-7edc-43f3-92f1-2012be187d12\") " pod="openstack/glance-db-create-rvqd9"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.472076 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rvqd9"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.493504 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67515e96-6010-4d7b-9d07-82a661b67ef0-operator-scripts\") pod \"glance-bc66-account-create-7ph85\" (UID: \"67515e96-6010-4d7b-9d07-82a661b67ef0\") " pod="openstack/glance-bc66-account-create-7ph85"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.493717 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc98g\" (UniqueName: \"kubernetes.io/projected/67515e96-6010-4d7b-9d07-82a661b67ef0-kube-api-access-jc98g\") pod \"glance-bc66-account-create-7ph85\" (UID: \"67515e96-6010-4d7b-9d07-82a661b67ef0\") " pod="openstack/glance-bc66-account-create-7ph85"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.494469 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67515e96-6010-4d7b-9d07-82a661b67ef0-operator-scripts\") pod \"glance-bc66-account-create-7ph85\" (UID: \"67515e96-6010-4d7b-9d07-82a661b67ef0\") " pod="openstack/glance-bc66-account-create-7ph85"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.518308 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc98g\" (UniqueName: \"kubernetes.io/projected/67515e96-6010-4d7b-9d07-82a661b67ef0-kube-api-access-jc98g\") pod \"glance-bc66-account-create-7ph85\" (UID: \"67515e96-6010-4d7b-9d07-82a661b67ef0\") " pod="openstack/glance-bc66-account-create-7ph85"
Nov 22 09:53:09 crc kubenswrapper[4743]: I1122 09:53:09.566342 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bc66-account-create-7ph85"
Need to start a new one" pod="openstack/glance-bc66-account-create-7ph85" Nov 22 09:53:10 crc kubenswrapper[4743]: I1122 09:53:10.019961 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-rvqd9"] Nov 22 09:53:10 crc kubenswrapper[4743]: W1122 09:53:10.022297 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3684de8_7edc_43f3_92f1_2012be187d12.slice/crio-9457f8e37a75cd79942f57d51f9d7692316f6fc8781e911ae596e21fb02c328d WatchSource:0}: Error finding container 9457f8e37a75cd79942f57d51f9d7692316f6fc8781e911ae596e21fb02c328d: Status 404 returned error can't find the container with id 9457f8e37a75cd79942f57d51f9d7692316f6fc8781e911ae596e21fb02c328d Nov 22 09:53:10 crc kubenswrapper[4743]: I1122 09:53:10.091954 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bc66-account-create-7ph85"] Nov 22 09:53:10 crc kubenswrapper[4743]: W1122 09:53:10.099087 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67515e96_6010_4d7b_9d07_82a661b67ef0.slice/crio-46e071c0936ddd25b6dd86b4f25e7029cfe0d83dc9e8e0aec0b354cc4573c225 WatchSource:0}: Error finding container 46e071c0936ddd25b6dd86b4f25e7029cfe0d83dc9e8e0aec0b354cc4573c225: Status 404 returned error can't find the container with id 46e071c0936ddd25b6dd86b4f25e7029cfe0d83dc9e8e0aec0b354cc4573c225 Nov 22 09:53:10 crc kubenswrapper[4743]: I1122 09:53:10.803898 4743 generic.go:334] "Generic (PLEG): container finished" podID="f3684de8-7edc-43f3-92f1-2012be187d12" containerID="49704b2f3378d514fd719677ec7c09b2d7dd9f4608cffbd94fd354a3a0cd8af3" exitCode=0 Nov 22 09:53:10 crc kubenswrapper[4743]: I1122 09:53:10.804023 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rvqd9" event={"ID":"f3684de8-7edc-43f3-92f1-2012be187d12","Type":"ContainerDied","Data":"49704b2f3378d514fd719677ec7c09b2d7dd9f4608cffbd94fd354a3a0cd8af3"} Nov 22 09:53:10 crc kubenswrapper[4743]: I1122 09:53:10.804072 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rvqd9" event={"ID":"f3684de8-7edc-43f3-92f1-2012be187d12","Type":"ContainerStarted","Data":"9457f8e37a75cd79942f57d51f9d7692316f6fc8781e911ae596e21fb02c328d"} Nov 22 09:53:10 crc kubenswrapper[4743]: I1122 09:53:10.807069 4743 generic.go:334] "Generic (PLEG): container finished" podID="67515e96-6010-4d7b-9d07-82a661b67ef0" containerID="f46bd808cf70e5094d48f92e49fe3b73ce6824c13c3b0cd3f32bd9e99b8719a9" exitCode=0 Nov 22 09:53:10 crc kubenswrapper[4743]: I1122 09:53:10.807131 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc66-account-create-7ph85" event={"ID":"67515e96-6010-4d7b-9d07-82a661b67ef0","Type":"ContainerDied","Data":"f46bd808cf70e5094d48f92e49fe3b73ce6824c13c3b0cd3f32bd9e99b8719a9"} Nov 22 09:53:10 crc kubenswrapper[4743]: I1122 09:53:10.807193 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc66-account-create-7ph85" event={"ID":"67515e96-6010-4d7b-9d07-82a661b67ef0","Type":"ContainerStarted","Data":"46e071c0936ddd25b6dd86b4f25e7029cfe0d83dc9e8e0aec0b354cc4573c225"} Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.204053 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bc66-account-create-7ph85" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.210550 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-rvqd9" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.359352 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d97v8\" (UniqueName: \"kubernetes.io/projected/f3684de8-7edc-43f3-92f1-2012be187d12-kube-api-access-d97v8\") pod \"f3684de8-7edc-43f3-92f1-2012be187d12\" (UID: \"f3684de8-7edc-43f3-92f1-2012be187d12\") " Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.359418 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3684de8-7edc-43f3-92f1-2012be187d12-operator-scripts\") pod \"f3684de8-7edc-43f3-92f1-2012be187d12\" (UID: \"f3684de8-7edc-43f3-92f1-2012be187d12\") " Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.359482 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc98g\" (UniqueName: \"kubernetes.io/projected/67515e96-6010-4d7b-9d07-82a661b67ef0-kube-api-access-jc98g\") pod \"67515e96-6010-4d7b-9d07-82a661b67ef0\" (UID: \"67515e96-6010-4d7b-9d07-82a661b67ef0\") " Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.359649 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67515e96-6010-4d7b-9d07-82a661b67ef0-operator-scripts\") pod \"67515e96-6010-4d7b-9d07-82a661b67ef0\" (UID: \"67515e96-6010-4d7b-9d07-82a661b67ef0\") " Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.360193 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3684de8-7edc-43f3-92f1-2012be187d12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f3684de8-7edc-43f3-92f1-2012be187d12" (UID: "f3684de8-7edc-43f3-92f1-2012be187d12"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.360193 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67515e96-6010-4d7b-9d07-82a661b67ef0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67515e96-6010-4d7b-9d07-82a661b67ef0" (UID: "67515e96-6010-4d7b-9d07-82a661b67ef0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.364253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3684de8-7edc-43f3-92f1-2012be187d12-kube-api-access-d97v8" (OuterVolumeSpecName: "kube-api-access-d97v8") pod "f3684de8-7edc-43f3-92f1-2012be187d12" (UID: "f3684de8-7edc-43f3-92f1-2012be187d12"). InnerVolumeSpecName "kube-api-access-d97v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.364262 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67515e96-6010-4d7b-9d07-82a661b67ef0-kube-api-access-jc98g" (OuterVolumeSpecName: "kube-api-access-jc98g") pod "67515e96-6010-4d7b-9d07-82a661b67ef0" (UID: "67515e96-6010-4d7b-9d07-82a661b67ef0"). InnerVolumeSpecName "kube-api-access-jc98g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.461962 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67515e96-6010-4d7b-9d07-82a661b67ef0-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.462008 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d97v8\" (UniqueName: \"kubernetes.io/projected/f3684de8-7edc-43f3-92f1-2012be187d12-kube-api-access-d97v8\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.462025 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f3684de8-7edc-43f3-92f1-2012be187d12-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.462040 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc98g\" (UniqueName: \"kubernetes.io/projected/67515e96-6010-4d7b-9d07-82a661b67ef0-kube-api-access-jc98g\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.822649 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc66-account-create-7ph85" event={"ID":"67515e96-6010-4d7b-9d07-82a661b67ef0","Type":"ContainerDied","Data":"46e071c0936ddd25b6dd86b4f25e7029cfe0d83dc9e8e0aec0b354cc4573c225"} Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.822687 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46e071c0936ddd25b6dd86b4f25e7029cfe0d83dc9e8e0aec0b354cc4573c225" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.822702 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bc66-account-create-7ph85" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.824791 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-rvqd9" event={"ID":"f3684de8-7edc-43f3-92f1-2012be187d12","Type":"ContainerDied","Data":"9457f8e37a75cd79942f57d51f9d7692316f6fc8781e911ae596e21fb02c328d"} Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.824827 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9457f8e37a75cd79942f57d51f9d7692316f6fc8781e911ae596e21fb02c328d" Nov 22 09:53:12 crc kubenswrapper[4743]: I1122 09:53:12.824856 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-rvqd9" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.386970 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-wq9jx"] Nov 22 09:53:14 crc kubenswrapper[4743]: E1122 09:53:14.387806 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67515e96-6010-4d7b-9d07-82a661b67ef0" containerName="mariadb-account-create" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.387824 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="67515e96-6010-4d7b-9d07-82a661b67ef0" containerName="mariadb-account-create" Nov 22 09:53:14 crc kubenswrapper[4743]: E1122 09:53:14.387850 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3684de8-7edc-43f3-92f1-2012be187d12" containerName="mariadb-database-create" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.387859 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3684de8-7edc-43f3-92f1-2012be187d12" containerName="mariadb-database-create" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.388092 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3684de8-7edc-43f3-92f1-2012be187d12" containerName="mariadb-database-create" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.388126 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="67515e96-6010-4d7b-9d07-82a661b67ef0" containerName="mariadb-account-create" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.388874 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.390945 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.391164 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vq74t" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.394693 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wq9jx"] Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.496692 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-combined-ca-bundle\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.496928 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-config-data\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.497010 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrd2d\" (UniqueName: \"kubernetes.io/projected/ed0ded17-e7cd-474f-9aab-50bd38924ae1-kube-api-access-jrd2d\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.497054 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-db-sync-config-data\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.598123 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-config-data\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.598167 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrd2d\" (UniqueName: \"kubernetes.io/projected/ed0ded17-e7cd-474f-9aab-50bd38924ae1-kube-api-access-jrd2d\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.598195 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-db-sync-config-data\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.598241 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-combined-ca-bundle\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.604984 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-combined-ca-bundle\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.615596 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-db-sync-config-data\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.615878 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-config-data\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.619353 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrd2d\" (UniqueName: \"kubernetes.io/projected/ed0ded17-e7cd-474f-9aab-50bd38924ae1-kube-api-access-jrd2d\") pod \"glance-db-sync-wq9jx\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:14 crc kubenswrapper[4743]: I1122 09:53:14.745272 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:15 crc kubenswrapper[4743]: I1122 09:53:15.287473 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-wq9jx"] Nov 22 09:53:15 crc kubenswrapper[4743]: I1122 09:53:15.855831 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wq9jx" event={"ID":"ed0ded17-e7cd-474f-9aab-50bd38924ae1","Type":"ContainerStarted","Data":"eeb73d4da37af9026faa316ea2672ae27ab2c956c883577768937e69e5add235"} Nov 22 09:53:15 crc kubenswrapper[4743]: I1122 09:53:15.856152 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wq9jx" event={"ID":"ed0ded17-e7cd-474f-9aab-50bd38924ae1","Type":"ContainerStarted","Data":"2d3231f87b3f7a816899a43869fd03f8d6745883be44b66ebe0de4c41384b1a1"} Nov 22 09:53:15 crc kubenswrapper[4743]: I1122 09:53:15.871389 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-wq9jx" podStartSLOduration=1.87136936 podStartE2EDuration="1.87136936s" podCreationTimestamp="2025-11-22 09:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:53:15.870718601 +0000 UTC m=+5469.577079653" watchObservedRunningTime="2025-11-22 09:53:15.87136936 +0000 UTC m=+5469.577730412" Nov 22 09:53:18 crc kubenswrapper[4743]: I1122 09:53:18.890615 4743 generic.go:334] "Generic (PLEG): container finished" podID="ed0ded17-e7cd-474f-9aab-50bd38924ae1" containerID="eeb73d4da37af9026faa316ea2672ae27ab2c956c883577768937e69e5add235" exitCode=0 Nov 22 09:53:18 crc kubenswrapper[4743]: I1122 09:53:18.890751 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wq9jx" event={"ID":"ed0ded17-e7cd-474f-9aab-50bd38924ae1","Type":"ContainerDied","Data":"eeb73d4da37af9026faa316ea2672ae27ab2c956c883577768937e69e5add235"} Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.384344 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.509020 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-combined-ca-bundle\") pod \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.509113 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-config-data\") pod \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.509216 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrd2d\" (UniqueName: \"kubernetes.io/projected/ed0ded17-e7cd-474f-9aab-50bd38924ae1-kube-api-access-jrd2d\") pod \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.509426 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-db-sync-config-data\") pod \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\" (UID: \"ed0ded17-e7cd-474f-9aab-50bd38924ae1\") " Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.515738 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ed0ded17-e7cd-474f-9aab-50bd38924ae1" (UID: "ed0ded17-e7cd-474f-9aab-50bd38924ae1"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.535253 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0ded17-e7cd-474f-9aab-50bd38924ae1-kube-api-access-jrd2d" (OuterVolumeSpecName: "kube-api-access-jrd2d") pod "ed0ded17-e7cd-474f-9aab-50bd38924ae1" (UID: "ed0ded17-e7cd-474f-9aab-50bd38924ae1"). InnerVolumeSpecName "kube-api-access-jrd2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.559092 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed0ded17-e7cd-474f-9aab-50bd38924ae1" (UID: "ed0ded17-e7cd-474f-9aab-50bd38924ae1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.562156 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-config-data" (OuterVolumeSpecName: "config-data") pod "ed0ded17-e7cd-474f-9aab-50bd38924ae1" (UID: "ed0ded17-e7cd-474f-9aab-50bd38924ae1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.612684 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.612832 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.612854 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrd2d\" (UniqueName: \"kubernetes.io/projected/ed0ded17-e7cd-474f-9aab-50bd38924ae1-kube-api-access-jrd2d\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.612870 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed0ded17-e7cd-474f-9aab-50bd38924ae1-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.917350 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-wq9jx" event={"ID":"ed0ded17-e7cd-474f-9aab-50bd38924ae1","Type":"ContainerDied","Data":"2d3231f87b3f7a816899a43869fd03f8d6745883be44b66ebe0de4c41384b1a1"} Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.917393 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d3231f87b3f7a816899a43869fd03f8d6745883be44b66ebe0de4c41384b1a1" Nov 22 09:53:20 crc kubenswrapper[4743]: I1122 09:53:20.917450 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-wq9jx" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.302310 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:53:21 crc kubenswrapper[4743]: E1122 09:53:21.303175 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0ded17-e7cd-474f-9aab-50bd38924ae1" containerName="glance-db-sync" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.303202 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0ded17-e7cd-474f-9aab-50bd38924ae1" containerName="glance-db-sync" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.310198 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0ded17-e7cd-474f-9aab-50bd38924ae1" containerName="glance-db-sync" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.311400 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.314914 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.315261 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.315395 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vq74t" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.319966 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.320199 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.342721 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-45jh4"] Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.344549 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.353009 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-45jh4"] Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.402709 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.404167 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.406149 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.412965 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429103 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-dns-svc\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429196 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-config\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429324 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-logs\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429388 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-ceph\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429429 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-scripts\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429449 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlqp2\" (UniqueName: \"kubernetes.io/projected/2e1bf181-2934-48be-b073-1d97e76aa814-kube-api-access-nlqp2\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429469 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-config-data\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429491 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429522 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.429555 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hs6x\" (UniqueName: \"kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-kube-api-access-7hs6x\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531438 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-config\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531490 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531507 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-logs\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531539 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531603 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531624 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-ceph\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531647 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-scripts\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531728 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-config-data\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531797 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlqp2\" (UniqueName: 
\"kubernetes.io/projected/2e1bf181-2934-48be-b073-1d97e76aa814-kube-api-access-nlqp2\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.531959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.532017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.532065 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.532097 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.532171 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hs6x\" (UniqueName: \"kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-kube-api-access-7hs6x\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.532197 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-dns-svc\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.532228 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.532325 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.532493 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5jnsk\" (UniqueName: \"kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-kube-api-access-5jnsk\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.532987 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-dns-svc\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.533197 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-logs\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.533214 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.533411 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.535413 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-config\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.535624 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.537598 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-ceph\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.538207 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-scripts\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.538251 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: 
\"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.543890 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-config-data\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.551129 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlqp2\" (UniqueName: \"kubernetes.io/projected/2e1bf181-2934-48be-b073-1d97e76aa814-kube-api-access-nlqp2\") pod \"dnsmasq-dns-6b9b57f477-45jh4\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") " pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.560410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hs6x\" (UniqueName: \"kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-kube-api-access-7hs6x\") pod \"glance-default-external-api-0\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.632769 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.634111 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnsk\" (UniqueName: \"kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-kube-api-access-5jnsk\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.634804 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.634847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.634873 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.634904 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.634924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.634954 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.635468 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.635811 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.639049 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.641026 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.642220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.642322 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.655471 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnsk\" (UniqueName: \"kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-kube-api-access-5jnsk\") pod \"glance-default-internal-api-0\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.670753 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:21 crc kubenswrapper[4743]: I1122 09:53:21.720808 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:53:22 crc kubenswrapper[4743]: I1122 09:53:22.248900 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-45jh4"] Nov 22 09:53:22 crc kubenswrapper[4743]: I1122 09:53:22.263215 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:53:22 crc kubenswrapper[4743]: I1122 09:53:22.292318 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:53:22 crc kubenswrapper[4743]: I1122 09:53:22.414355 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:53:22 crc kubenswrapper[4743]: W1122 09:53:22.429716 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0079ce2_db53_40a8_80db_e6e2ede1f711.slice/crio-4629adf52f632833e456aede234725fb1b240422a3dd4cfd59fa678d1eab335c WatchSource:0}: Error finding container 4629adf52f632833e456aede234725fb1b240422a3dd4cfd59fa678d1eab335c: Status 404 returned error can't find the container with id 4629adf52f632833e456aede234725fb1b240422a3dd4cfd59fa678d1eab335c Nov 22 09:53:22 crc kubenswrapper[4743]: I1122 09:53:22.948595 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16e8aeda-7e62-4495-87fb-366ed738b979","Type":"ContainerStarted","Data":"88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8"} Nov 22 09:53:22 crc kubenswrapper[4743]: I1122 09:53:22.948960 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16e8aeda-7e62-4495-87fb-366ed738b979","Type":"ContainerStarted","Data":"aab870d4d34143d2f7649689e23b89aad378e7bee96b88433a034091a7e17bab"} Nov 22 09:53:22 crc kubenswrapper[4743]: I1122 09:53:22.957345 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0079ce2-db53-40a8-80db-e6e2ede1f711","Type":"ContainerStarted","Data":"4629adf52f632833e456aede234725fb1b240422a3dd4cfd59fa678d1eab335c"} Nov 22 09:53:22 crc kubenswrapper[4743]: I1122 09:53:22.961305 4743 generic.go:334] "Generic (PLEG): container finished" podID="2e1bf181-2934-48be-b073-1d97e76aa814" containerID="cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af" exitCode=0 Nov 22 09:53:22 crc kubenswrapper[4743]: I1122 09:53:22.961370 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" event={"ID":"2e1bf181-2934-48be-b073-1d97e76aa814","Type":"ContainerDied","Data":"cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af"} Nov 22 09:53:22 crc kubenswrapper[4743]: I1122 09:53:22.961433 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" event={"ID":"2e1bf181-2934-48be-b073-1d97e76aa814","Type":"ContainerStarted","Data":"6b2c0c8cfa5c5f4af9283fb231f5f459d1b4277ec1ac784b6e9dd4a83224fa00"} Nov 22 09:53:23 crc kubenswrapper[4743]: I1122 09:53:23.956502 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:53:23 crc kubenswrapper[4743]: I1122 09:53:23.974421 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" 
event={"ID":"2e1bf181-2934-48be-b073-1d97e76aa814","Type":"ContainerStarted","Data":"789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9"} Nov 22 09:53:23 crc kubenswrapper[4743]: I1122 09:53:23.975543 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:53:23 crc kubenswrapper[4743]: I1122 09:53:23.976883 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16e8aeda-7e62-4495-87fb-366ed738b979","Type":"ContainerStarted","Data":"0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa"} Nov 22 09:53:23 crc kubenswrapper[4743]: I1122 09:53:23.976988 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="16e8aeda-7e62-4495-87fb-366ed738b979" containerName="glance-log" containerID="cri-o://88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8" gracePeriod=30 Nov 22 09:53:23 crc kubenswrapper[4743]: I1122 09:53:23.977209 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="16e8aeda-7e62-4495-87fb-366ed738b979" containerName="glance-httpd" containerID="cri-o://0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa" gracePeriod=30 Nov 22 09:53:23 crc kubenswrapper[4743]: I1122 09:53:23.978935 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0079ce2-db53-40a8-80db-e6e2ede1f711","Type":"ContainerStarted","Data":"7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e"} Nov 22 09:53:23 crc kubenswrapper[4743]: I1122 09:53:23.978960 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0079ce2-db53-40a8-80db-e6e2ede1f711","Type":"ContainerStarted","Data":"b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597"} Nov 22 09:53:23 crc kubenswrapper[4743]: I1122 09:53:23.997782 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" podStartSLOduration=2.997765341 podStartE2EDuration="2.997765341s" podCreationTimestamp="2025-11-22 09:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:53:23.992571572 +0000 UTC m=+5477.698932624" watchObservedRunningTime="2025-11-22 09:53:23.997765341 +0000 UTC m=+5477.704126393" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.028427 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.028411242 podStartE2EDuration="3.028411242s" podCreationTimestamp="2025-11-22 09:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:53:24.02244817 +0000 UTC m=+5477.728809222" watchObservedRunningTime="2025-11-22 09:53:24.028411242 +0000 UTC m=+5477.734772294" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.051364 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.051349781 podStartE2EDuration="3.051349781s" podCreationTimestamp="2025-11-22 09:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-11-22 09:53:24.047940263 +0000 UTC m=+5477.754301315" watchObservedRunningTime="2025-11-22 09:53:24.051349781 +0000 UTC m=+5477.757710823" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.549641 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.684788 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hs6x\" (UniqueName: \"kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-kube-api-access-7hs6x\") pod \"16e8aeda-7e62-4495-87fb-366ed738b979\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.684891 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-combined-ca-bundle\") pod \"16e8aeda-7e62-4495-87fb-366ed738b979\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.684985 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-config-data\") pod \"16e8aeda-7e62-4495-87fb-366ed738b979\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.685212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-httpd-run\") pod \"16e8aeda-7e62-4495-87fb-366ed738b979\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.685277 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-logs\") pod \"16e8aeda-7e62-4495-87fb-366ed738b979\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.685352 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-scripts\") pod \"16e8aeda-7e62-4495-87fb-366ed738b979\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.685398 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-ceph\") pod \"16e8aeda-7e62-4495-87fb-366ed738b979\" (UID: \"16e8aeda-7e62-4495-87fb-366ed738b979\") " Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.685670 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "16e8aeda-7e62-4495-87fb-366ed738b979" (UID: "16e8aeda-7e62-4495-87fb-366ed738b979"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.685765 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-logs" (OuterVolumeSpecName: "logs") pod "16e8aeda-7e62-4495-87fb-366ed738b979" (UID: "16e8aeda-7e62-4495-87fb-366ed738b979"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.686422 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.686455 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/16e8aeda-7e62-4495-87fb-366ed738b979-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.691170 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-scripts" (OuterVolumeSpecName: "scripts") pod "16e8aeda-7e62-4495-87fb-366ed738b979" (UID: "16e8aeda-7e62-4495-87fb-366ed738b979"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.693567 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-ceph" (OuterVolumeSpecName: "ceph") pod "16e8aeda-7e62-4495-87fb-366ed738b979" (UID: "16e8aeda-7e62-4495-87fb-366ed738b979"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.705172 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-kube-api-access-7hs6x" (OuterVolumeSpecName: "kube-api-access-7hs6x") pod "16e8aeda-7e62-4495-87fb-366ed738b979" (UID: "16e8aeda-7e62-4495-87fb-366ed738b979"). InnerVolumeSpecName "kube-api-access-7hs6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.714691 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16e8aeda-7e62-4495-87fb-366ed738b979" (UID: "16e8aeda-7e62-4495-87fb-366ed738b979"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.734085 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-config-data" (OuterVolumeSpecName: "config-data") pod "16e8aeda-7e62-4495-87fb-366ed738b979" (UID: "16e8aeda-7e62-4495-87fb-366ed738b979"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.788242 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.788296 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.788309 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.788322 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hs6x\" (UniqueName: \"kubernetes.io/projected/16e8aeda-7e62-4495-87fb-366ed738b979-kube-api-access-7hs6x\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.788340 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16e8aeda-7e62-4495-87fb-366ed738b979-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.987662 4743 generic.go:334] "Generic (PLEG): container finished" podID="16e8aeda-7e62-4495-87fb-366ed738b979" containerID="0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa" exitCode=0 Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.987699 4743 generic.go:334] "Generic (PLEG): container finished" podID="16e8aeda-7e62-4495-87fb-366ed738b979" containerID="88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8" exitCode=143 Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.988551 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.988721 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16e8aeda-7e62-4495-87fb-366ed738b979","Type":"ContainerDied","Data":"0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa"} Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.988763 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16e8aeda-7e62-4495-87fb-366ed738b979","Type":"ContainerDied","Data":"88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8"} Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.988775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"16e8aeda-7e62-4495-87fb-366ed738b979","Type":"ContainerDied","Data":"aab870d4d34143d2f7649689e23b89aad378e7bee96b88433a034091a7e17bab"} Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.988793 4743 scope.go:117] "RemoveContainer" containerID="0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa" Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.989005 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a0079ce2-db53-40a8-80db-e6e2ede1f711" containerName="glance-log" containerID="cri-o://b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597" gracePeriod=30 Nov 22 09:53:24 crc kubenswrapper[4743]: I1122 09:53:24.989353 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a0079ce2-db53-40a8-80db-e6e2ede1f711" containerName="glance-httpd" containerID="cri-o://7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e" gracePeriod=30 Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.027781 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.028367 4743 scope.go:117] "RemoveContainer" containerID="88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.037879 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.067751 4743 scope.go:117] "RemoveContainer" containerID="0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.068217 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:53:25 crc kubenswrapper[4743]: E1122 09:53:25.068765 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e8aeda-7e62-4495-87fb-366ed738b979" containerName="glance-httpd" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.068796 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e8aeda-7e62-4495-87fb-366ed738b979" containerName="glance-httpd" Nov 22 09:53:25 crc kubenswrapper[4743]: E1122 09:53:25.068817 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16e8aeda-7e62-4495-87fb-366ed738b979" containerName="glance-log" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.068828 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e8aeda-7e62-4495-87fb-366ed738b979" containerName="glance-log" Nov 22 09:53:25 
crc kubenswrapper[4743]: I1122 09:53:25.069163 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e8aeda-7e62-4495-87fb-366ed738b979" containerName="glance-httpd" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.069184 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="16e8aeda-7e62-4495-87fb-366ed738b979" containerName="glance-log" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.070726 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: E1122 09:53:25.070800 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa\": container with ID starting with 0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa not found: ID does not exist" containerID="0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.070855 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa"} err="failed to get container status \"0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa\": rpc error: code = NotFound desc = could not find container \"0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa\": container with ID starting with 0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa not found: ID does not exist" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.070891 4743 scope.go:117] "RemoveContainer" containerID="88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8" Nov 22 09:53:25 crc kubenswrapper[4743]: E1122 09:53:25.073752 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8\": container with ID starting with 88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8 not found: ID does not exist" containerID="88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.074049 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8"} err="failed to get container status \"88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8\": rpc error: code = NotFound desc = could not find container \"88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8\": container with ID starting with 88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8 not found: ID does not exist" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.074070 4743 scope.go:117] "RemoveContainer" containerID="0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.075309 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa"} err="failed to get container status \"0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa\": rpc error: code = NotFound desc = could not find container \"0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa\": container with ID starting with 
0b89cab055e138a9318181e9c3e824e7ecc774dde2c586a60e8806fb10446faa not found: ID does not exist" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.075365 4743 scope.go:117] "RemoveContainer" containerID="88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.075864 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8"} err="failed to get container status \"88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8\": rpc error: code = NotFound desc = could not find container \"88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8\": container with ID starting with 88c66c47dae4526958300402026993e60ac2d1c874bb3aa22295603c60b61bb8 not found: ID does not exist" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.076098 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.080182 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.165501 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16e8aeda-7e62-4495-87fb-366ed738b979" path="/var/lib/kubelet/pods/16e8aeda-7e62-4495-87fb-366ed738b979/volumes" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.194976 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp7l4\" (UniqueName: \"kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-kube-api-access-jp7l4\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.195128 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.195215 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-config-data\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.195234 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-ceph\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.195370 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-logs\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.195386 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.195505 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-scripts\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.298268 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-logs\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.298317 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.298474 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-scripts\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.298709 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp7l4\" (UniqueName: \"kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-kube-api-access-jp7l4\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.298815 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-logs\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.298849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.298909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-config-data\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.298926 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-ceph\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.299083 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.302375 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-ceph\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.302807 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-scripts\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.303147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-config-data\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.303271 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.315096 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp7l4\" (UniqueName: \"kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-kube-api-access-jp7l4\") pod \"glance-default-external-api-0\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.393557 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.500319 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.602669 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-logs\") pod \"a0079ce2-db53-40a8-80db-e6e2ede1f711\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.602762 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-scripts\") pod \"a0079ce2-db53-40a8-80db-e6e2ede1f711\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.602801 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jnsk\" (UniqueName: \"kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-kube-api-access-5jnsk\") pod \"a0079ce2-db53-40a8-80db-e6e2ede1f711\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.602850 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-config-data\") pod \"a0079ce2-db53-40a8-80db-e6e2ede1f711\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.602936 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-combined-ca-bundle\") pod \"a0079ce2-db53-40a8-80db-e6e2ede1f711\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.602962 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-ceph\") pod \"a0079ce2-db53-40a8-80db-e6e2ede1f711\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.602990 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-httpd-run\") pod \"a0079ce2-db53-40a8-80db-e6e2ede1f711\" (UID: \"a0079ce2-db53-40a8-80db-e6e2ede1f711\") " Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.603702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a0079ce2-db53-40a8-80db-e6e2ede1f711" (UID: "a0079ce2-db53-40a8-80db-e6e2ede1f711"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.604029 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-logs" (OuterVolumeSpecName: "logs") pod "a0079ce2-db53-40a8-80db-e6e2ede1f711" (UID: "a0079ce2-db53-40a8-80db-e6e2ede1f711"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.608645 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-scripts" (OuterVolumeSpecName: "scripts") pod "a0079ce2-db53-40a8-80db-e6e2ede1f711" (UID: "a0079ce2-db53-40a8-80db-e6e2ede1f711"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.608676 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-kube-api-access-5jnsk" (OuterVolumeSpecName: "kube-api-access-5jnsk") pod "a0079ce2-db53-40a8-80db-e6e2ede1f711" (UID: "a0079ce2-db53-40a8-80db-e6e2ede1f711"). InnerVolumeSpecName "kube-api-access-5jnsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.610035 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-ceph" (OuterVolumeSpecName: "ceph") pod "a0079ce2-db53-40a8-80db-e6e2ede1f711" (UID: "a0079ce2-db53-40a8-80db-e6e2ede1f711"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.637529 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0079ce2-db53-40a8-80db-e6e2ede1f711" (UID: "a0079ce2-db53-40a8-80db-e6e2ede1f711"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.653268 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-config-data" (OuterVolumeSpecName: "config-data") pod "a0079ce2-db53-40a8-80db-e6e2ede1f711" (UID: "a0079ce2-db53-40a8-80db-e6e2ede1f711"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.704472 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jnsk\" (UniqueName: \"kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-kube-api-access-5jnsk\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.704509 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.704521 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.704531 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a0079ce2-db53-40a8-80db-e6e2ede1f711-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.704540 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.704548 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0079ce2-db53-40a8-80db-e6e2ede1f711-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.704558 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0079ce2-db53-40a8-80db-e6e2ede1f711-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:53:25 crc kubenswrapper[4743]: I1122 09:53:25.895871 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.003701 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0079ce2-db53-40a8-80db-e6e2ede1f711" containerID="7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e" exitCode=0 Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.003733 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0079ce2-db53-40a8-80db-e6e2ede1f711" containerID="b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597" exitCode=143 Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.003788 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.003788 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0079ce2-db53-40a8-80db-e6e2ede1f711","Type":"ContainerDied","Data":"7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e"} Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.003927 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0079ce2-db53-40a8-80db-e6e2ede1f711","Type":"ContainerDied","Data":"b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597"} Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.003953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0079ce2-db53-40a8-80db-e6e2ede1f711","Type":"ContainerDied","Data":"4629adf52f632833e456aede234725fb1b240422a3dd4cfd59fa678d1eab335c"} Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.003973 4743 scope.go:117] "RemoveContainer" containerID="7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.006745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b9de402d-3bf9-4e99-bac2-6f241134b16a","Type":"ContainerStarted","Data":"572ed50d92610d1a1422c0acf6dbaabc56378b2049bc6017fa0f41cbb9800c54"} Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.038257 4743 scope.go:117] "RemoveContainer" containerID="b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.057966 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.073870 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.088120 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:53:26 crc kubenswrapper[4743]: E1122 09:53:26.088606 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0079ce2-db53-40a8-80db-e6e2ede1f711" containerName="glance-httpd" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.088628 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0079ce2-db53-40a8-80db-e6e2ede1f711" containerName="glance-httpd" Nov 22 09:53:26 crc kubenswrapper[4743]: E1122 09:53:26.088656 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0079ce2-db53-40a8-80db-e6e2ede1f711" containerName="glance-log" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.088664 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0079ce2-db53-40a8-80db-e6e2ede1f711" containerName="glance-log" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.088866 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0079ce2-db53-40a8-80db-e6e2ede1f711" containerName="glance-httpd" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.088899 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0079ce2-db53-40a8-80db-e6e2ede1f711" containerName="glance-log" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.090953 4743 scope.go:117] "RemoveContainer" containerID="7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e" Nov 22 09:53:26 crc 
kubenswrapper[4743]: I1122 09:53:26.091132 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: E1122 09:53:26.091656 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e\": container with ID starting with 7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e not found: ID does not exist" containerID="7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.091698 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e"} err="failed to get container status \"7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e\": rpc error: code = NotFound desc = could not find container \"7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e\": container with ID starting with 7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e not found: ID does not exist" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.091722 4743 scope.go:117] "RemoveContainer" containerID="b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597" Nov 22 09:53:26 crc kubenswrapper[4743]: E1122 09:53:26.092519 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597\": container with ID starting with b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597 not found: ID does not exist" containerID="b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.098186 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.098908 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.102742 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597"} err="failed to get container status \"b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597\": rpc error: code = NotFound desc = could not find container \"b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597\": container with ID starting with b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597 not found: ID does not exist" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.102817 4743 scope.go:117] "RemoveContainer" containerID="7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.103313 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e"} err="failed to get container status \"7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e\": rpc error: code = NotFound desc = could not find container \"7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e\": container with ID starting with 7e9a12864c1c0f45973bfe91c21d438e34338ed5ac01815ee832b77737b73f3e not 
found: ID does not exist" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.103362 4743 scope.go:117] "RemoveContainer" containerID="b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.103755 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597"} err="failed to get container status \"b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597\": rpc error: code = NotFound desc = could not find container \"b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597\": container with ID starting with b907437a5119940244e498a5caebe3e42189367100127ca0651e6a15a0018597 not found: ID does not exist" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.215045 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.215215 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-logs\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.215278 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n44dx\" (UniqueName: \"kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-kube-api-access-n44dx\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.215360 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-ceph\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.215409 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-scripts\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.215463 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.215530 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.317785 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.319607 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.319778 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-logs\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.319903 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n44dx\" (UniqueName: \"kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-kube-api-access-n44dx\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.319958 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-ceph\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.319991 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-scripts\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.320029 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.321778 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-logs\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.322319 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-config-data\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 
09:53:26.326678 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-ceph\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.327221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-config-data\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.327499 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.328839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-scripts\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.338981 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n44dx\" (UniqueName: \"kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-kube-api-access-n44dx\") pod \"glance-default-internal-api-0\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.424271 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 09:53:26 crc kubenswrapper[4743]: W1122 09:53:26.926132 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31ccda11_3d5b_4530_9cbe_e3a994610b08.slice/crio-f9cadb364ad3b22c1b5d69f557279b6a5fbe3913b087285993a29b946de41aac WatchSource:0}: Error finding container f9cadb364ad3b22c1b5d69f557279b6a5fbe3913b087285993a29b946de41aac: Status 404 returned error can't find the container with id f9cadb364ad3b22c1b5d69f557279b6a5fbe3913b087285993a29b946de41aac Nov 22 09:53:26 crc kubenswrapper[4743]: I1122 09:53:26.932011 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 09:53:27 crc kubenswrapper[4743]: I1122 09:53:27.022192 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b9de402d-3bf9-4e99-bac2-6f241134b16a","Type":"ContainerStarted","Data":"88a9c587c0e34f007e0973a7375f9960c50ccffc10eaad7452e7a87c2ff01fc4"} Nov 22 09:53:27 crc kubenswrapper[4743]: I1122 09:53:27.022237 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b9de402d-3bf9-4e99-bac2-6f241134b16a","Type":"ContainerStarted","Data":"d3f44e6f73d7c2e9f49e5a06efeed4d63eb89e965c90dd505edb518de23da647"} Nov 22 09:53:27 crc kubenswrapper[4743]: I1122 09:53:27.024747 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31ccda11-3d5b-4530-9cbe-e3a994610b08","Type":"ContainerStarted","Data":"f9cadb364ad3b22c1b5d69f557279b6a5fbe3913b087285993a29b946de41aac"} Nov 22 09:53:27 crc kubenswrapper[4743]: I1122 09:53:27.051340 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=2.051323155 podStartE2EDuration="2.051323155s" podCreationTimestamp="2025-11-22 09:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:53:27.038413924 +0000 UTC m=+5480.744774976" watchObservedRunningTime="2025-11-22 09:53:27.051323155 +0000 UTC m=+5480.757684207" Nov 22 09:53:27 crc kubenswrapper[4743]: I1122 09:53:27.165438 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0079ce2-db53-40a8-80db-e6e2ede1f711" path="/var/lib/kubelet/pods/a0079ce2-db53-40a8-80db-e6e2ede1f711/volumes" Nov 22 09:53:28 crc kubenswrapper[4743]: I1122 09:53:28.040913 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31ccda11-3d5b-4530-9cbe-e3a994610b08","Type":"ContainerStarted","Data":"b61db55f10eacbf9d0ad87b0c1c93c80f3ed1bb015e8b4a8cae8f464dfe032dc"} Nov 22 09:53:28 crc kubenswrapper[4743]: I1122 09:53:28.041484 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31ccda11-3d5b-4530-9cbe-e3a994610b08","Type":"ContainerStarted","Data":"4be7b35b414a045785d62915dd2fd1f1cf18324df0d99b8222170e6f5ba65c09"} Nov 22 09:53:28 crc kubenswrapper[4743]: I1122 09:53:28.064106 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.064084002 podStartE2EDuration="2.064084002s" podCreationTimestamp="2025-11-22 09:53:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:53:28.059196011 +0000 UTC m=+5481.765557053" watchObservedRunningTime="2025-11-22 09:53:28.064084002 +0000 UTC m=+5481.770445054"
Nov 22 09:53:31 crc kubenswrapper[4743]: I1122 09:53:31.671841 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4"
Nov 22 09:53:31 crc kubenswrapper[4743]: I1122 09:53:31.739750 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-mk4s6"]
Nov 22 09:53:31 crc kubenswrapper[4743]: I1122 09:53:31.740011 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" podUID="c7bbf544-d418-4152-9f0f-fc8bf87be889" containerName="dnsmasq-dns" containerID="cri-o://cd59b3919b6d1cb11ac0a360e03e5852ae743a0563a0d47e5cd47d7f08a80ba5" gracePeriod=10
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.034385 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" podUID="c7bbf544-d418-4152-9f0f-fc8bf87be889" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.34:5353: connect: connection refused"
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.093395 4743 generic.go:334] "Generic (PLEG): container finished" podID="c7bbf544-d418-4152-9f0f-fc8bf87be889" containerID="cd59b3919b6d1cb11ac0a360e03e5852ae743a0563a0d47e5cd47d7f08a80ba5" exitCode=0
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.093470 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" event={"ID":"c7bbf544-d418-4152-9f0f-fc8bf87be889","Type":"ContainerDied","Data":"cd59b3919b6d1cb11ac0a360e03e5852ae743a0563a0d47e5cd47d7f08a80ba5"}
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.322048 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6"
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.356105 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-dns-svc\") pod \"c7bbf544-d418-4152-9f0f-fc8bf87be889\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") "
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.356207 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-nb\") pod \"c7bbf544-d418-4152-9f0f-fc8bf87be889\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") "
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.356507 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7d6r\" (UniqueName: \"kubernetes.io/projected/c7bbf544-d418-4152-9f0f-fc8bf87be889-kube-api-access-f7d6r\") pod \"c7bbf544-d418-4152-9f0f-fc8bf87be889\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") "
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.356569 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-config\") pod \"c7bbf544-d418-4152-9f0f-fc8bf87be889\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") "
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.356797 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-sb\") pod \"c7bbf544-d418-4152-9f0f-fc8bf87be889\" (UID: \"c7bbf544-d418-4152-9f0f-fc8bf87be889\") "
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.364059 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7bbf544-d418-4152-9f0f-fc8bf87be889-kube-api-access-f7d6r" (OuterVolumeSpecName: "kube-api-access-f7d6r") pod "c7bbf544-d418-4152-9f0f-fc8bf87be889" (UID: "c7bbf544-d418-4152-9f0f-fc8bf87be889"). InnerVolumeSpecName "kube-api-access-f7d6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.399792 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c7bbf544-d418-4152-9f0f-fc8bf87be889" (UID: "c7bbf544-d418-4152-9f0f-fc8bf87be889"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.400537 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c7bbf544-d418-4152-9f0f-fc8bf87be889" (UID: "c7bbf544-d418-4152-9f0f-fc8bf87be889"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.409740 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7bbf544-d418-4152-9f0f-fc8bf87be889" (UID: "c7bbf544-d418-4152-9f0f-fc8bf87be889"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.410475 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-config" (OuterVolumeSpecName: "config") pod "c7bbf544-d418-4152-9f0f-fc8bf87be889" (UID: "c7bbf544-d418-4152-9f0f-fc8bf87be889"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.460212 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7d6r\" (UniqueName: \"kubernetes.io/projected/c7bbf544-d418-4152-9f0f-fc8bf87be889-kube-api-access-f7d6r\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.460279 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-config\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.460300 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.460327 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:32 crc kubenswrapper[4743]: I1122 09:53:32.460350 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c7bbf544-d418-4152-9f0f-fc8bf87be889-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:33 crc kubenswrapper[4743]: I1122 09:53:33.124014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6" event={"ID":"c7bbf544-d418-4152-9f0f-fc8bf87be889","Type":"ContainerDied","Data":"1af85587c225b90c9d81e63270c7a8b04b357310cd940cdada83516bb464b223"}
Nov 22 09:53:33 crc kubenswrapper[4743]: I1122 09:53:33.124136 4743 scope.go:117] "RemoveContainer" containerID="cd59b3919b6d1cb11ac0a360e03e5852ae743a0563a0d47e5cd47d7f08a80ba5"
Nov 22 09:53:33 crc kubenswrapper[4743]: I1122 09:53:33.124395 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59d59797-mk4s6"
Nov 22 09:53:33 crc kubenswrapper[4743]: I1122 09:53:33.175525 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-mk4s6"]
Nov 22 09:53:33 crc kubenswrapper[4743]: I1122 09:53:33.178364 4743 scope.go:117] "RemoveContainer" containerID="b53735807008b3a6112124998265dc25935ba1c2fb538e87a81a8d62dc86e61c"
Nov 22 09:53:33 crc kubenswrapper[4743]: I1122 09:53:33.184908 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59d59797-mk4s6"]
Nov 22 09:53:35 crc kubenswrapper[4743]: I1122 09:53:35.168554 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7bbf544-d418-4152-9f0f-fc8bf87be889" path="/var/lib/kubelet/pods/c7bbf544-d418-4152-9f0f-fc8bf87be889/volumes"
Nov 22 09:53:35 crc kubenswrapper[4743]: I1122 09:53:35.395190 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Nov 22 09:53:35 crc kubenswrapper[4743]: I1122 09:53:35.395273 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Nov 22 09:53:35 crc kubenswrapper[4743]: I1122 09:53:35.449904 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Nov 22 09:53:35 crc kubenswrapper[4743]: I1122 09:53:35.512508 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Nov 22 09:53:36 crc kubenswrapper[4743]: I1122 09:53:36.162492 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Nov 22 09:53:36 crc kubenswrapper[4743]: I1122 09:53:36.162562 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Nov 22 09:53:36 crc kubenswrapper[4743]: I1122 09:53:36.425283 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Nov 22 09:53:36 crc kubenswrapper[4743]: I1122 09:53:36.425346 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Nov 22 09:53:36 crc kubenswrapper[4743]: I1122 09:53:36.477912 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Nov 22 09:53:36 crc kubenswrapper[4743]: I1122 09:53:36.488799 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Nov 22 09:53:37 crc kubenswrapper[4743]: I1122 09:53:37.175061 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Nov 22 09:53:37 crc kubenswrapper[4743]: I1122 09:53:37.175088 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Nov 22 09:53:38 crc kubenswrapper[4743]: I1122 09:53:38.288437 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Nov 22 09:53:38 crc kubenswrapper[4743]: I1122 09:53:38.289453 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 22 09:53:38 crc kubenswrapper[4743]: I1122 09:53:38.291366 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Nov 22 09:53:39 crc kubenswrapper[4743]: I1122 09:53:39.183544 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Nov 22 09:53:39 crc kubenswrapper[4743]: I1122 09:53:39.188961 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Nov 22 09:53:39 crc kubenswrapper[4743]: I1122 09:53:39.204865 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Nov 22 09:53:48 crc kubenswrapper[4743]: I1122 09:53:48.931296 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m8fmm"]
Nov 22 09:53:48 crc kubenswrapper[4743]: E1122 09:53:48.932562 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7bbf544-d418-4152-9f0f-fc8bf87be889" containerName="dnsmasq-dns"
Nov 22 09:53:48 crc kubenswrapper[4743]: I1122 09:53:48.932714 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7bbf544-d418-4152-9f0f-fc8bf87be889" containerName="dnsmasq-dns"
Nov 22 09:53:48 crc kubenswrapper[4743]: E1122 09:53:48.932757 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7bbf544-d418-4152-9f0f-fc8bf87be889" containerName="init"
Nov 22 09:53:48 crc kubenswrapper[4743]: I1122 09:53:48.932764 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7bbf544-d418-4152-9f0f-fc8bf87be889" containerName="init"
Nov 22 09:53:48 crc kubenswrapper[4743]: I1122 09:53:48.932990 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7bbf544-d418-4152-9f0f-fc8bf87be889" containerName="dnsmasq-dns"
Nov 22 09:53:48 crc kubenswrapper[4743]: I1122 09:53:48.933872 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m8fmm"
Nov 22 09:53:48 crc kubenswrapper[4743]: I1122 09:53:48.943820 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m8fmm"]
Nov 22 09:53:48 crc kubenswrapper[4743]: I1122 09:53:48.966724 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgl8b\" (UniqueName: \"kubernetes.io/projected/39ac93d2-dedd-4109-a0ba-928660962d81-kube-api-access-rgl8b\") pod \"placement-db-create-m8fmm\" (UID: \"39ac93d2-dedd-4109-a0ba-928660962d81\") " pod="openstack/placement-db-create-m8fmm"
Nov 22 09:53:48 crc kubenswrapper[4743]: I1122 09:53:48.966786 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39ac93d2-dedd-4109-a0ba-928660962d81-operator-scripts\") pod \"placement-db-create-m8fmm\" (UID: \"39ac93d2-dedd-4109-a0ba-928660962d81\") " pod="openstack/placement-db-create-m8fmm"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.034067 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-42bb-account-create-xrbt7"]
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.035353 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-42bb-account-create-xrbt7"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.039721 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.046697 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-42bb-account-create-xrbt7"]
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.069112 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgl8b\" (UniqueName: \"kubernetes.io/projected/39ac93d2-dedd-4109-a0ba-928660962d81-kube-api-access-rgl8b\") pod \"placement-db-create-m8fmm\" (UID: \"39ac93d2-dedd-4109-a0ba-928660962d81\") " pod="openstack/placement-db-create-m8fmm"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.069394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39ac93d2-dedd-4109-a0ba-928660962d81-operator-scripts\") pod \"placement-db-create-m8fmm\" (UID: \"39ac93d2-dedd-4109-a0ba-928660962d81\") " pod="openstack/placement-db-create-m8fmm"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.069536 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31f7b903-226c-402f-884f-4bf2ae3b7f74-operator-scripts\") pod \"placement-42bb-account-create-xrbt7\" (UID: \"31f7b903-226c-402f-884f-4bf2ae3b7f74\") " pod="openstack/placement-42bb-account-create-xrbt7"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.069713 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpv7t\" (UniqueName: \"kubernetes.io/projected/31f7b903-226c-402f-884f-4bf2ae3b7f74-kube-api-access-mpv7t\") pod \"placement-42bb-account-create-xrbt7\" (UID: \"31f7b903-226c-402f-884f-4bf2ae3b7f74\") " pod="openstack/placement-42bb-account-create-xrbt7"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.070865 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39ac93d2-dedd-4109-a0ba-928660962d81-operator-scripts\") pod \"placement-db-create-m8fmm\" (UID: \"39ac93d2-dedd-4109-a0ba-928660962d81\") " pod="openstack/placement-db-create-m8fmm"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.093694 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgl8b\" (UniqueName: \"kubernetes.io/projected/39ac93d2-dedd-4109-a0ba-928660962d81-kube-api-access-rgl8b\") pod \"placement-db-create-m8fmm\" (UID: \"39ac93d2-dedd-4109-a0ba-928660962d81\") " pod="openstack/placement-db-create-m8fmm"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.171065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpv7t\" (UniqueName: \"kubernetes.io/projected/31f7b903-226c-402f-884f-4bf2ae3b7f74-kube-api-access-mpv7t\") pod \"placement-42bb-account-create-xrbt7\" (UID: \"31f7b903-226c-402f-884f-4bf2ae3b7f74\") " pod="openstack/placement-42bb-account-create-xrbt7"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.171355 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31f7b903-226c-402f-884f-4bf2ae3b7f74-operator-scripts\") pod \"placement-42bb-account-create-xrbt7\" (UID: \"31f7b903-226c-402f-884f-4bf2ae3b7f74\") " pod="openstack/placement-42bb-account-create-xrbt7"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.172464 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31f7b903-226c-402f-884f-4bf2ae3b7f74-operator-scripts\") pod \"placement-42bb-account-create-xrbt7\" (UID: \"31f7b903-226c-402f-884f-4bf2ae3b7f74\") " pod="openstack/placement-42bb-account-create-xrbt7"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.191353 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpv7t\" (UniqueName: \"kubernetes.io/projected/31f7b903-226c-402f-884f-4bf2ae3b7f74-kube-api-access-mpv7t\") pod \"placement-42bb-account-create-xrbt7\" (UID: \"31f7b903-226c-402f-884f-4bf2ae3b7f74\") " pod="openstack/placement-42bb-account-create-xrbt7"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.265740 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m8fmm"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.350880 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-42bb-account-create-xrbt7"
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.709662 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m8fmm"]
Nov 22 09:53:49 crc kubenswrapper[4743]: W1122 09:53:49.714335 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39ac93d2_dedd_4109_a0ba_928660962d81.slice/crio-8a33504f60cf3e6ffd88a86dfea8805e781e158232c89017028c365507d0dbf5 WatchSource:0}: Error finding container 8a33504f60cf3e6ffd88a86dfea8805e781e158232c89017028c365507d0dbf5: Status 404 returned error can't find the container with id 8a33504f60cf3e6ffd88a86dfea8805e781e158232c89017028c365507d0dbf5
Nov 22 09:53:49 crc kubenswrapper[4743]: I1122 09:53:49.791907 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-42bb-account-create-xrbt7"]
Nov 22 09:53:49 crc kubenswrapper[4743]: W1122 09:53:49.817770 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f7b903_226c_402f_884f_4bf2ae3b7f74.slice/crio-64e36447188d4d161c385c7487996998826b8b8442b4e92b7876bcddfa39579a WatchSource:0}: Error finding container 64e36447188d4d161c385c7487996998826b8b8442b4e92b7876bcddfa39579a: Status 404 returned error can't find the container with id 64e36447188d4d161c385c7487996998826b8b8442b4e92b7876bcddfa39579a
Nov 22 09:53:50 crc kubenswrapper[4743]: I1122 09:53:50.278145 4743 generic.go:334] "Generic (PLEG): container finished" podID="39ac93d2-dedd-4109-a0ba-928660962d81" containerID="9c80c72a0cf5d2809f1b57d692a40d1025cebd7e615dbbdaaa50c9df90f881cf" exitCode=0
Nov 22 09:53:50 crc kubenswrapper[4743]: I1122 09:53:50.278232 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m8fmm" event={"ID":"39ac93d2-dedd-4109-a0ba-928660962d81","Type":"ContainerDied","Data":"9c80c72a0cf5d2809f1b57d692a40d1025cebd7e615dbbdaaa50c9df90f881cf"}
Nov 22 09:53:50 crc kubenswrapper[4743]: I1122 09:53:50.278466 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m8fmm" event={"ID":"39ac93d2-dedd-4109-a0ba-928660962d81","Type":"ContainerStarted","Data":"8a33504f60cf3e6ffd88a86dfea8805e781e158232c89017028c365507d0dbf5"}
Nov 22 09:53:50 crc kubenswrapper[4743]: I1122 09:53:50.279887 4743 generic.go:334] "Generic (PLEG): container finished" podID="31f7b903-226c-402f-884f-4bf2ae3b7f74" containerID="f96543829f351b01dcd4c35027a21e98cf0094d6537039b3477ae3be99769d47" exitCode=0
Nov 22 09:53:50 crc kubenswrapper[4743]: I1122 09:53:50.279923 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-42bb-account-create-xrbt7" event={"ID":"31f7b903-226c-402f-884f-4bf2ae3b7f74","Type":"ContainerDied","Data":"f96543829f351b01dcd4c35027a21e98cf0094d6537039b3477ae3be99769d47"}
Nov 22 09:53:50 crc kubenswrapper[4743]: I1122 09:53:50.279954 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-42bb-account-create-xrbt7" event={"ID":"31f7b903-226c-402f-884f-4bf2ae3b7f74","Type":"ContainerStarted","Data":"64e36447188d4d161c385c7487996998826b8b8442b4e92b7876bcddfa39579a"}
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.699437 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m8fmm"
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.706236 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-42bb-account-create-xrbt7"
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.818338 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31f7b903-226c-402f-884f-4bf2ae3b7f74-operator-scripts\") pod \"31f7b903-226c-402f-884f-4bf2ae3b7f74\" (UID: \"31f7b903-226c-402f-884f-4bf2ae3b7f74\") "
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.818414 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpv7t\" (UniqueName: \"kubernetes.io/projected/31f7b903-226c-402f-884f-4bf2ae3b7f74-kube-api-access-mpv7t\") pod \"31f7b903-226c-402f-884f-4bf2ae3b7f74\" (UID: \"31f7b903-226c-402f-884f-4bf2ae3b7f74\") "
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.818523 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39ac93d2-dedd-4109-a0ba-928660962d81-operator-scripts\") pod \"39ac93d2-dedd-4109-a0ba-928660962d81\" (UID: \"39ac93d2-dedd-4109-a0ba-928660962d81\") "
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.818739 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgl8b\" (UniqueName: \"kubernetes.io/projected/39ac93d2-dedd-4109-a0ba-928660962d81-kube-api-access-rgl8b\") pod \"39ac93d2-dedd-4109-a0ba-928660962d81\" (UID: \"39ac93d2-dedd-4109-a0ba-928660962d81\") "
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.818986 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31f7b903-226c-402f-884f-4bf2ae3b7f74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31f7b903-226c-402f-884f-4bf2ae3b7f74" (UID: "31f7b903-226c-402f-884f-4bf2ae3b7f74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.819097 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39ac93d2-dedd-4109-a0ba-928660962d81-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39ac93d2-dedd-4109-a0ba-928660962d81" (UID: "39ac93d2-dedd-4109-a0ba-928660962d81"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.819284 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39ac93d2-dedd-4109-a0ba-928660962d81-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.819301 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31f7b903-226c-402f-884f-4bf2ae3b7f74-operator-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.823904 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31f7b903-226c-402f-884f-4bf2ae3b7f74-kube-api-access-mpv7t" (OuterVolumeSpecName: "kube-api-access-mpv7t") pod "31f7b903-226c-402f-884f-4bf2ae3b7f74" (UID: "31f7b903-226c-402f-884f-4bf2ae3b7f74"). InnerVolumeSpecName "kube-api-access-mpv7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.825784 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39ac93d2-dedd-4109-a0ba-928660962d81-kube-api-access-rgl8b" (OuterVolumeSpecName: "kube-api-access-rgl8b") pod "39ac93d2-dedd-4109-a0ba-928660962d81" (UID: "39ac93d2-dedd-4109-a0ba-928660962d81"). InnerVolumeSpecName "kube-api-access-rgl8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.921507 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgl8b\" (UniqueName: \"kubernetes.io/projected/39ac93d2-dedd-4109-a0ba-928660962d81-kube-api-access-rgl8b\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:51 crc kubenswrapper[4743]: I1122 09:53:51.921548 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpv7t\" (UniqueName: \"kubernetes.io/projected/31f7b903-226c-402f-884f-4bf2ae3b7f74-kube-api-access-mpv7t\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:52 crc kubenswrapper[4743]: I1122 09:53:52.300137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m8fmm" event={"ID":"39ac93d2-dedd-4109-a0ba-928660962d81","Type":"ContainerDied","Data":"8a33504f60cf3e6ffd88a86dfea8805e781e158232c89017028c365507d0dbf5"}
Nov 22 09:53:52 crc kubenswrapper[4743]: I1122 09:53:52.300425 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a33504f60cf3e6ffd88a86dfea8805e781e158232c89017028c365507d0dbf5"
Nov 22 09:53:52 crc kubenswrapper[4743]: I1122 09:53:52.300154 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m8fmm"
Nov 22 09:53:52 crc kubenswrapper[4743]: I1122 09:53:52.301881 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-42bb-account-create-xrbt7" event={"ID":"31f7b903-226c-402f-884f-4bf2ae3b7f74","Type":"ContainerDied","Data":"64e36447188d4d161c385c7487996998826b8b8442b4e92b7876bcddfa39579a"}
Nov 22 09:53:52 crc kubenswrapper[4743]: I1122 09:53:52.301908 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64e36447188d4d161c385c7487996998826b8b8442b4e92b7876bcddfa39579a"
Nov 22 09:53:52 crc kubenswrapper[4743]: I1122 09:53:52.301917 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-42bb-account-create-xrbt7"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.326265 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-trz9s"]
Nov 22 09:53:54 crc kubenswrapper[4743]: E1122 09:53:54.326871 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f7b903-226c-402f-884f-4bf2ae3b7f74" containerName="mariadb-account-create"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.326883 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f7b903-226c-402f-884f-4bf2ae3b7f74" containerName="mariadb-account-create"
Nov 22 09:53:54 crc kubenswrapper[4743]: E1122 09:53:54.326907 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39ac93d2-dedd-4109-a0ba-928660962d81" containerName="mariadb-database-create"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.326913 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="39ac93d2-dedd-4109-a0ba-928660962d81" containerName="mariadb-database-create"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.327063 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="39ac93d2-dedd-4109-a0ba-928660962d81" containerName="mariadb-database-create"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.327074 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f7b903-226c-402f-884f-4bf2ae3b7f74" containerName="mariadb-account-create"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.328121 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.342946 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-trz9s"]
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.387463 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-sb\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.387569 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-config\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.387661 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw5zf\" (UniqueName: \"kubernetes.io/projected/a35d8e35-5277-4a64-a3e9-0c3d7382671c-kube-api-access-bw5zf\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.387918 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-dns-svc\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.388100 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-nb\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.388512 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-mf77p"]
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.393069 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.395078 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9r7xs"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.395212 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.395334 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.438287 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mf77p"]
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.489665 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-sb\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.489723 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-config-data\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.489783 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-scripts\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.489806 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6nkj\" (UniqueName: \"kubernetes.io/projected/825a445b-19a5-433a-b0b5-c87cb078d274-kube-api-access-z6nkj\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.489843 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-config\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.489869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-combined-ca-bundle\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.489922 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw5zf\" (UniqueName: \"kubernetes.io/projected/a35d8e35-5277-4a64-a3e9-0c3d7382671c-kube-api-access-bw5zf\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.489963 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-dns-svc\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.489987 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-nb\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.490026 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825a445b-19a5-433a-b0b5-c87cb078d274-logs\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.490473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-sb\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.490731 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-config\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.490999 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-dns-svc\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.491345 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-nb\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.515426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw5zf\" (UniqueName: \"kubernetes.io/projected/a35d8e35-5277-4a64-a3e9-0c3d7382671c-kube-api-access-bw5zf\") pod \"dnsmasq-dns-74df65d56c-trz9s\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") " pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.591271 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-combined-ca-bundle\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.591739 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825a445b-19a5-433a-b0b5-c87cb078d274-logs\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.591889 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-config-data\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.592019 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-scripts\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.592141 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6nkj\" (UniqueName: \"kubernetes.io/projected/825a445b-19a5-433a-b0b5-c87cb078d274-kube-api-access-z6nkj\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.592216 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825a445b-19a5-433a-b0b5-c87cb078d274-logs\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.597038 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-combined-ca-bundle\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.597546 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-config-data\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.602106 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-scripts\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.610861 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6nkj\" (UniqueName: \"kubernetes.io/projected/825a445b-19a5-433a-b0b5-c87cb078d274-kube-api-access-z6nkj\") pod \"placement-db-sync-mf77p\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") " pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.656735 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:54 crc kubenswrapper[4743]: I1122 09:53:54.719761 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:55 crc kubenswrapper[4743]: I1122 09:53:55.116601 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-trz9s"]
Nov 22 09:53:55 crc kubenswrapper[4743]: I1122 09:53:55.199004 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-mf77p"]
Nov 22 09:53:55 crc kubenswrapper[4743]: W1122 09:53:55.211911 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod825a445b_19a5_433a_b0b5_c87cb078d274.slice/crio-7d582b5cb80ec6475655a5d1a75aa0c9cd54ba8a93dd44533514321fc513d05c WatchSource:0}: Error finding container 7d582b5cb80ec6475655a5d1a75aa0c9cd54ba8a93dd44533514321fc513d05c: Status 404 returned error can't find the container with id 7d582b5cb80ec6475655a5d1a75aa0c9cd54ba8a93dd44533514321fc513d05c
Nov 22 09:53:55 crc kubenswrapper[4743]: I1122 09:53:55.326050 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mf77p" event={"ID":"825a445b-19a5-433a-b0b5-c87cb078d274","Type":"ContainerStarted","Data":"7d582b5cb80ec6475655a5d1a75aa0c9cd54ba8a93dd44533514321fc513d05c"}
Nov 22 09:53:55 crc kubenswrapper[4743]: I1122 09:53:55.327490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-trz9s" event={"ID":"a35d8e35-5277-4a64-a3e9-0c3d7382671c","Type":"ContainerStarted","Data":"9d5bfd13bb18dc50f43a94b646988b63e2cf5182332ec57215e66e7a12d52014"}
Nov 22 09:53:55 crc kubenswrapper[4743]: I1122 09:53:55.327513 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-trz9s" event={"ID":"a35d8e35-5277-4a64-a3e9-0c3d7382671c","Type":"ContainerStarted","Data":"1201de1a6390af2a0790fd54c229f1efe73458c7cb7b8ab0c5e7644639e8ad66"}
Nov 22 09:53:56 crc kubenswrapper[4743]: I1122 09:53:56.338121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mf77p" event={"ID":"825a445b-19a5-433a-b0b5-c87cb078d274","Type":"ContainerStarted","Data":"16c17b1cc2fe7a0d8ae6d9df0aad11bfc587664e8ee04a95e2b8d9a6b30816d4"}
Nov 22 09:53:56 crc kubenswrapper[4743]: I1122 09:53:56.342315 4743 generic.go:334] "Generic (PLEG): container finished" podID="a35d8e35-5277-4a64-a3e9-0c3d7382671c" containerID="9d5bfd13bb18dc50f43a94b646988b63e2cf5182332ec57215e66e7a12d52014" exitCode=0
Nov 22 09:53:56 crc kubenswrapper[4743]: I1122 09:53:56.342366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-trz9s" event={"ID":"a35d8e35-5277-4a64-a3e9-0c3d7382671c","Type":"ContainerDied","Data":"9d5bfd13bb18dc50f43a94b646988b63e2cf5182332ec57215e66e7a12d52014"}
Nov 22 09:53:56 crc kubenswrapper[4743]: I1122 09:53:56.342424 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-trz9s" event={"ID":"a35d8e35-5277-4a64-a3e9-0c3d7382671c","Type":"ContainerStarted","Data":"25cad52d4532c852d7a71b4ec33b03722848080d2c216b861fcb2ae05182affe"}
Nov 22 09:53:56 crc kubenswrapper[4743]: I1122 09:53:56.342590 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:53:56 crc kubenswrapper[4743]: I1122 09:53:56.365086 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-mf77p" podStartSLOduration=2.365067764 podStartE2EDuration="2.365067764s" podCreationTimestamp="2025-11-22 09:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:53:56.354274434 +0000 UTC m=+5510.060635486" watchObservedRunningTime="2025-11-22 09:53:56.365067764 +0000 UTC m=+5510.071428816"
Nov 22 09:53:57 crc kubenswrapper[4743]: I1122 09:53:57.353178 4743 generic.go:334] "Generic (PLEG): container finished" podID="825a445b-19a5-433a-b0b5-c87cb078d274" containerID="16c17b1cc2fe7a0d8ae6d9df0aad11bfc587664e8ee04a95e2b8d9a6b30816d4" exitCode=0
Nov 22 09:53:57 crc kubenswrapper[4743]: I1122 09:53:57.353403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mf77p" event={"ID":"825a445b-19a5-433a-b0b5-c87cb078d274","Type":"ContainerDied","Data":"16c17b1cc2fe7a0d8ae6d9df0aad11bfc587664e8ee04a95e2b8d9a6b30816d4"}
Nov 22 09:53:57 crc kubenswrapper[4743]: I1122 09:53:57.383696 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74df65d56c-trz9s" podStartSLOduration=3.38367872 podStartE2EDuration="3.38367872s" podCreationTimestamp="2025-11-22 09:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:53:56.383922516 +0000 UTC m=+5510.090283568" watchObservedRunningTime="2025-11-22 09:53:57.38367872 +0000 UTC m=+5511.090039772"
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.684177 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.779307 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6nkj\" (UniqueName: \"kubernetes.io/projected/825a445b-19a5-433a-b0b5-c87cb078d274-kube-api-access-z6nkj\") pod \"825a445b-19a5-433a-b0b5-c87cb078d274\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") "
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.779368 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825a445b-19a5-433a-b0b5-c87cb078d274-logs\") pod \"825a445b-19a5-433a-b0b5-c87cb078d274\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") "
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.779392 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-combined-ca-bundle\") pod \"825a445b-19a5-433a-b0b5-c87cb078d274\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") "
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.779433 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-config-data\") pod \"825a445b-19a5-433a-b0b5-c87cb078d274\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") "
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.779486 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-scripts\") pod \"825a445b-19a5-433a-b0b5-c87cb078d274\" (UID: \"825a445b-19a5-433a-b0b5-c87cb078d274\") "
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.779986 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/825a445b-19a5-433a-b0b5-c87cb078d274-logs" (OuterVolumeSpecName: "logs") pod "825a445b-19a5-433a-b0b5-c87cb078d274" (UID: "825a445b-19a5-433a-b0b5-c87cb078d274"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.784769 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-scripts" (OuterVolumeSpecName: "scripts") pod "825a445b-19a5-433a-b0b5-c87cb078d274" (UID: "825a445b-19a5-433a-b0b5-c87cb078d274"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.785177 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/825a445b-19a5-433a-b0b5-c87cb078d274-kube-api-access-z6nkj" (OuterVolumeSpecName: "kube-api-access-z6nkj") pod "825a445b-19a5-433a-b0b5-c87cb078d274" (UID: "825a445b-19a5-433a-b0b5-c87cb078d274"). InnerVolumeSpecName "kube-api-access-z6nkj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.806687 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "825a445b-19a5-433a-b0b5-c87cb078d274" (UID: "825a445b-19a5-433a-b0b5-c87cb078d274"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.809086 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-config-data" (OuterVolumeSpecName: "config-data") pod "825a445b-19a5-433a-b0b5-c87cb078d274" (UID: "825a445b-19a5-433a-b0b5-c87cb078d274"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.881109 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.881147 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6nkj\" (UniqueName: \"kubernetes.io/projected/825a445b-19a5-433a-b0b5-c87cb078d274-kube-api-access-z6nkj\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.881161 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/825a445b-19a5-433a-b0b5-c87cb078d274-logs\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.881172 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:58 crc kubenswrapper[4743]: I1122 09:53:58.881186 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/825a445b-19a5-433a-b0b5-c87cb078d274-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.397550 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-mf77p" event={"ID":"825a445b-19a5-433a-b0b5-c87cb078d274","Type":"ContainerDied","Data":"7d582b5cb80ec6475655a5d1a75aa0c9cd54ba8a93dd44533514321fc513d05c"}
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.397915 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d582b5cb80ec6475655a5d1a75aa0c9cd54ba8a93dd44533514321fc513d05c"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.397656 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-mf77p"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.507617 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5f9b766bfb-hp7ll"]
Nov 22 09:53:59 crc kubenswrapper[4743]: E1122 09:53:59.508048 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="825a445b-19a5-433a-b0b5-c87cb078d274" containerName="placement-db-sync"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.508070 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="825a445b-19a5-433a-b0b5-c87cb078d274" containerName="placement-db-sync"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.508294 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="825a445b-19a5-433a-b0b5-c87cb078d274" containerName="placement-db-sync"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.509291 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.514769 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.514965 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.516660 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9r7xs"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.520982 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f9b766bfb-hp7ll"]
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.698475 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lbq2\" (UniqueName: \"kubernetes.io/projected/c4984786-114b-47c3-9dac-ed7029d060d5-kube-api-access-8lbq2\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.698585 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4984786-114b-47c3-9dac-ed7029d060d5-config-data\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.698638 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4984786-114b-47c3-9dac-ed7029d060d5-scripts\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.698699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4984786-114b-47c3-9dac-ed7029d060d5-combined-ca-bundle\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.698748 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4984786-114b-47c3-9dac-ed7029d060d5-logs\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.799851 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4984786-114b-47c3-9dac-ed7029d060d5-scripts\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.799924 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4984786-114b-47c3-9dac-ed7029d060d5-combined-ca-bundle\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.799959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4984786-114b-47c3-9dac-ed7029d060d5-logs\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.800013 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lbq2\" (UniqueName: \"kubernetes.io/projected/c4984786-114b-47c3-9dac-ed7029d060d5-kube-api-access-8lbq2\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.800086 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4984786-114b-47c3-9dac-ed7029d060d5-config-data\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.801489 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c4984786-114b-47c3-9dac-ed7029d060d5-logs\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.804672 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c4984786-114b-47c3-9dac-ed7029d060d5-scripts\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.812426 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4984786-114b-47c3-9dac-ed7029d060d5-combined-ca-bundle\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.812550 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c4984786-114b-47c3-9dac-ed7029d060d5-config-data\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.820163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lbq2\" (UniqueName: \"kubernetes.io/projected/c4984786-114b-47c3-9dac-ed7029d060d5-kube-api-access-8lbq2\") pod \"placement-5f9b766bfb-hp7ll\" (UID: \"c4984786-114b-47c3-9dac-ed7029d060d5\") " pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:53:59 crc kubenswrapper[4743]: I1122 09:53:59.832442 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:54:00 crc kubenswrapper[4743]: I1122 09:54:00.267498 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5f9b766bfb-hp7ll"]
Nov 22 09:54:00 crc kubenswrapper[4743]: W1122 09:54:00.269707 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4984786_114b_47c3_9dac_ed7029d060d5.slice/crio-a606c68a0dbc8b5a1a53c6e11f76b6d8d49ea6b9cf0e4fc586d88ac5f08c7296 WatchSource:0}: Error finding container a606c68a0dbc8b5a1a53c6e11f76b6d8d49ea6b9cf0e4fc586d88ac5f08c7296: Status 404 returned error can't find the container with id a606c68a0dbc8b5a1a53c6e11f76b6d8d49ea6b9cf0e4fc586d88ac5f08c7296
Nov 22 09:54:00 crc kubenswrapper[4743]: I1122 09:54:00.408578 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f9b766bfb-hp7ll" event={"ID":"c4984786-114b-47c3-9dac-ed7029d060d5","Type":"ContainerStarted","Data":"a606c68a0dbc8b5a1a53c6e11f76b6d8d49ea6b9cf0e4fc586d88ac5f08c7296"}
Nov 22 09:54:01 crc kubenswrapper[4743]: I1122 09:54:01.418217 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f9b766bfb-hp7ll" event={"ID":"c4984786-114b-47c3-9dac-ed7029d060d5","Type":"ContainerStarted","Data":"50a61adb75d7d15d47e59e4daeb1c542dd55e1917c153278a1c46512b0fe642d"}
Nov 22 09:54:01 crc kubenswrapper[4743]: I1122 09:54:01.418358 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5f9b766bfb-hp7ll" event={"ID":"c4984786-114b-47c3-9dac-ed7029d060d5","Type":"ContainerStarted","Data":"714b760702c4173e6ecde506d48f8db8e1764f42de56129ce88bb969f69c2cb8"}
Nov 22 09:54:01 crc kubenswrapper[4743]: I1122 09:54:01.418835 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:54:01 crc kubenswrapper[4743]: I1122 09:54:01.418869 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5f9b766bfb-hp7ll"
Nov 22 09:54:01 crc kubenswrapper[4743]: I1122 09:54:01.438552 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5f9b766bfb-hp7ll" podStartSLOduration=2.438535141 podStartE2EDuration="2.438535141s" podCreationTimestamp="2025-11-22 09:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:54:01.43465729 +0000 UTC m=+5515.141018352" watchObservedRunningTime="2025-11-22 09:54:01.438535141 +0000 UTC m=+5515.144896203"
Nov 22 09:54:04 crc kubenswrapper[4743]: I1122 09:54:04.658985 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:54:04 crc kubenswrapper[4743]: I1122 09:54:04.731894 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-45jh4"]
Nov 22 09:54:04 crc kubenswrapper[4743]: I1122 09:54:04.732310 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" podUID="2e1bf181-2934-48be-b073-1d97e76aa814" containerName="dnsmasq-dns" containerID="cri-o://789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9" gracePeriod=10
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.198342 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4"
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.200006 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-sb\") pod \"2e1bf181-2934-48be-b073-1d97e76aa814\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") "
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.200161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlqp2\" (UniqueName: \"kubernetes.io/projected/2e1bf181-2934-48be-b073-1d97e76aa814-kube-api-access-nlqp2\") pod \"2e1bf181-2934-48be-b073-1d97e76aa814\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") "
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.200297 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-nb\") pod \"2e1bf181-2934-48be-b073-1d97e76aa814\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") "
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.200359 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-dns-svc\") pod \"2e1bf181-2934-48be-b073-1d97e76aa814\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") "
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.212446 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e1bf181-2934-48be-b073-1d97e76aa814-kube-api-access-nlqp2" (OuterVolumeSpecName: "kube-api-access-nlqp2") pod "2e1bf181-2934-48be-b073-1d97e76aa814" (UID: "2e1bf181-2934-48be-b073-1d97e76aa814"). InnerVolumeSpecName "kube-api-access-nlqp2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.255481 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2e1bf181-2934-48be-b073-1d97e76aa814" (UID: "2e1bf181-2934-48be-b073-1d97e76aa814"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.256303 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2e1bf181-2934-48be-b073-1d97e76aa814" (UID: "2e1bf181-2934-48be-b073-1d97e76aa814"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.260710 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2e1bf181-2934-48be-b073-1d97e76aa814" (UID: "2e1bf181-2934-48be-b073-1d97e76aa814"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.305001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-config\") pod \"2e1bf181-2934-48be-b073-1d97e76aa814\" (UID: \"2e1bf181-2934-48be-b073-1d97e76aa814\") "
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.305760 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.305784 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-dns-svc\") on node \"crc\" DevicePath \"\""
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.305797 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.305810 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlqp2\" (UniqueName: \"kubernetes.io/projected/2e1bf181-2934-48be-b073-1d97e76aa814-kube-api-access-nlqp2\") on node \"crc\" DevicePath \"\""
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.346689 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-config" (OuterVolumeSpecName: "config") pod "2e1bf181-2934-48be-b073-1d97e76aa814" (UID: "2e1bf181-2934-48be-b073-1d97e76aa814"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.408021 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e1bf181-2934-48be-b073-1d97e76aa814-config\") on node \"crc\" DevicePath \"\""
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.459886 4743 generic.go:334] "Generic (PLEG): container finished" podID="2e1bf181-2934-48be-b073-1d97e76aa814" containerID="789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9" exitCode=0
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.459936 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" event={"ID":"2e1bf181-2934-48be-b073-1d97e76aa814","Type":"ContainerDied","Data":"789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9"}
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.459967 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" event={"ID":"2e1bf181-2934-48be-b073-1d97e76aa814","Type":"ContainerDied","Data":"6b2c0c8cfa5c5f4af9283fb231f5f459d1b4277ec1ac784b6e9dd4a83224fa00"}
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.459988 4743 scope.go:117] "RemoveContainer" containerID="789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9"
Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.460210 4743 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-6b9b57f477-45jh4" Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.487347 4743 scope.go:117] "RemoveContainer" containerID="cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af" Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.499222 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-45jh4"] Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.507435 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9b57f477-45jh4"] Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.515388 4743 scope.go:117] "RemoveContainer" containerID="789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9" Nov 22 09:54:05 crc kubenswrapper[4743]: E1122 09:54:05.515962 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9\": container with ID starting with 789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9 not found: ID does not exist" containerID="789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9" Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.516000 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9"} err="failed to get container status \"789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9\": rpc error: code = NotFound desc = could not find container \"789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9\": container with ID starting with 789ac4ce4f5c0c175f9bf0666a47ef2a8539509dbe8cb01118cc3da873811ae9 not found: ID does not exist" Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.516021 4743 scope.go:117] "RemoveContainer" containerID="cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af" Nov 22 09:54:05 crc kubenswrapper[4743]: E1122 09:54:05.516391 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af\": container with ID starting with cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af not found: ID does not exist" containerID="cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af" Nov 22 09:54:05 crc kubenswrapper[4743]: I1122 09:54:05.516444 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af"} err="failed to get container status \"cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af\": rpc error: code = NotFound desc = could not find container \"cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af\": container with ID starting with cbde77af26228f285303ffd72894e3090d02fde213a313bbfdb623ed9bade8af not found: ID does not exist" Nov 22 09:54:07 crc kubenswrapper[4743]: I1122 09:54:07.163865 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e1bf181-2934-48be-b073-1d97e76aa814" path="/var/lib/kubelet/pods/2e1bf181-2934-48be-b073-1d97e76aa814/volumes" Nov 22 09:54:31 crc kubenswrapper[4743]: I1122 09:54:31.240923 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:54:31 crc kubenswrapper[4743]: I1122 09:54:31.242389 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:54:31 crc kubenswrapper[4743]: I1122 09:54:31.376046 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f9b766bfb-hp7ll" Nov 22 09:54:31 crc kubenswrapper[4743]: I1122 09:54:31.376342 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5f9b766bfb-hp7ll" Nov 22 09:54:32 crc kubenswrapper[4743]: I1122 09:54:32.728853 4743 scope.go:117] "RemoveContainer" containerID="bfd35d76be549bdae885bf19af23e2b8e1ae9823e7f17c923a7be247870e26f7" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.553126 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-m99g9"] Nov 22 09:54:51 crc kubenswrapper[4743]: E1122 09:54:51.553902 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e1bf181-2934-48be-b073-1d97e76aa814" containerName="init" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.553914 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e1bf181-2934-48be-b073-1d97e76aa814" containerName="init" Nov 22 09:54:51 crc kubenswrapper[4743]: E1122 09:54:51.553952 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e1bf181-2934-48be-b073-1d97e76aa814" containerName="dnsmasq-dns" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.553958 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e1bf181-2934-48be-b073-1d97e76aa814" containerName="dnsmasq-dns" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.554122 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e1bf181-2934-48be-b073-1d97e76aa814" containerName="dnsmasq-dns" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.554700 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-m99g9" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.573484 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-m99g9"] Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.646599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/073cacc3-d575-4696-a875-9181e9d250d9-operator-scripts\") pod \"nova-api-db-create-m99g9\" (UID: \"073cacc3-d575-4696-a875-9181e9d250d9\") " pod="openstack/nova-api-db-create-m99g9" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.646713 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jfrj\" (UniqueName: \"kubernetes.io/projected/073cacc3-d575-4696-a875-9181e9d250d9-kube-api-access-9jfrj\") pod \"nova-api-db-create-m99g9\" (UID: \"073cacc3-d575-4696-a875-9181e9d250d9\") " pod="openstack/nova-api-db-create-m99g9" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.748459 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/073cacc3-d575-4696-a875-9181e9d250d9-operator-scripts\") pod \"nova-api-db-create-m99g9\" (UID: \"073cacc3-d575-4696-a875-9181e9d250d9\") " pod="openstack/nova-api-db-create-m99g9" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.748841 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jfrj\" (UniqueName: \"kubernetes.io/projected/073cacc3-d575-4696-a875-9181e9d250d9-kube-api-access-9jfrj\") pod \"nova-api-db-create-m99g9\" (UID: \"073cacc3-d575-4696-a875-9181e9d250d9\") " pod="openstack/nova-api-db-create-m99g9" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.749254 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/073cacc3-d575-4696-a875-9181e9d250d9-operator-scripts\") pod \"nova-api-db-create-m99g9\" (UID: \"073cacc3-d575-4696-a875-9181e9d250d9\") " pod="openstack/nova-api-db-create-m99g9" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.754250 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-vcdq2"] Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.755619 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vcdq2" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.771358 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jfrj\" (UniqueName: \"kubernetes.io/projected/073cacc3-d575-4696-a875-9181e9d250d9-kube-api-access-9jfrj\") pod \"nova-api-db-create-m99g9\" (UID: \"073cacc3-d575-4696-a875-9181e9d250d9\") " pod="openstack/nova-api-db-create-m99g9" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.779827 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vcdq2"] Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.800143 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8c6d-account-create-585ck"] Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.801324 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8c6d-account-create-585ck" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.804045 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.814194 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8c6d-account-create-585ck"] Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.853471 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d16badcc-aa28-40f5-ac9e-71a23ee6e209-operator-scripts\") pod \"nova-api-8c6d-account-create-585ck\" (UID: \"d16badcc-aa28-40f5-ac9e-71a23ee6e209\") " pod="openstack/nova-api-8c6d-account-create-585ck" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.853563 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1325d49a-c56f-4183-a0aa-6f558767ccaa-operator-scripts\") pod \"nova-cell0-db-create-vcdq2\" (UID: \"1325d49a-c56f-4183-a0aa-6f558767ccaa\") " pod="openstack/nova-cell0-db-create-vcdq2" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.853616 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vm95\" (UniqueName: \"kubernetes.io/projected/1325d49a-c56f-4183-a0aa-6f558767ccaa-kube-api-access-9vm95\") pod \"nova-cell0-db-create-vcdq2\" (UID: \"1325d49a-c56f-4183-a0aa-6f558767ccaa\") " pod="openstack/nova-cell0-db-create-vcdq2" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.853729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfbkf\" (UniqueName: \"kubernetes.io/projected/d16badcc-aa28-40f5-ac9e-71a23ee6e209-kube-api-access-wfbkf\") pod \"nova-api-8c6d-account-create-585ck\" (UID: \"d16badcc-aa28-40f5-ac9e-71a23ee6e209\") " pod="openstack/nova-api-8c6d-account-create-585ck" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.858721 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wbsdw"] Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.866466 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wbsdw" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.875060 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-m99g9" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.877488 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wbsdw"] Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.956252 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-c0ec-account-create-hfj6d"] Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.961701 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfbkf\" (UniqueName: \"kubernetes.io/projected/d16badcc-aa28-40f5-ac9e-71a23ee6e209-kube-api-access-wfbkf\") pod \"nova-api-8c6d-account-create-585ck\" (UID: \"d16badcc-aa28-40f5-ac9e-71a23ee6e209\") " pod="openstack/nova-api-8c6d-account-create-585ck" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.961756 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brw58\" (UniqueName: \"kubernetes.io/projected/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-kube-api-access-brw58\") pod \"nova-cell1-db-create-wbsdw\" (UID: \"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b\") " pod="openstack/nova-cell1-db-create-wbsdw" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.961791 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-operator-scripts\") pod \"nova-cell1-db-create-wbsdw\" (UID: \"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b\") " pod="openstack/nova-cell1-db-create-wbsdw" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.961827 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d16badcc-aa28-40f5-ac9e-71a23ee6e209-operator-scripts\") pod \"nova-api-8c6d-account-create-585ck\" (UID: \"d16badcc-aa28-40f5-ac9e-71a23ee6e209\") " pod="openstack/nova-api-8c6d-account-create-585ck" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.961890 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1325d49a-c56f-4183-a0aa-6f558767ccaa-operator-scripts\") pod \"nova-cell0-db-create-vcdq2\" (UID: \"1325d49a-c56f-4183-a0aa-6f558767ccaa\") " pod="openstack/nova-cell0-db-create-vcdq2" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.961922 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vm95\" (UniqueName: \"kubernetes.io/projected/1325d49a-c56f-4183-a0aa-6f558767ccaa-kube-api-access-9vm95\") pod \"nova-cell0-db-create-vcdq2\" (UID: \"1325d49a-c56f-4183-a0aa-6f558767ccaa\") " pod="openstack/nova-cell0-db-create-vcdq2" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.963493 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1325d49a-c56f-4183-a0aa-6f558767ccaa-operator-scripts\") pod \"nova-cell0-db-create-vcdq2\" (UID: \"1325d49a-c56f-4183-a0aa-6f558767ccaa\") " pod="openstack/nova-cell0-db-create-vcdq2" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.963603 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d16badcc-aa28-40f5-ac9e-71a23ee6e209-operator-scripts\") pod \"nova-api-8c6d-account-create-585ck\" (UID: 
\"d16badcc-aa28-40f5-ac9e-71a23ee6e209\") " pod="openstack/nova-api-8c6d-account-create-585ck" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.975401 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c0ec-account-create-hfj6d"] Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.975500 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c0ec-account-create-hfj6d" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.977514 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.979257 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vm95\" (UniqueName: \"kubernetes.io/projected/1325d49a-c56f-4183-a0aa-6f558767ccaa-kube-api-access-9vm95\") pod \"nova-cell0-db-create-vcdq2\" (UID: \"1325d49a-c56f-4183-a0aa-6f558767ccaa\") " pod="openstack/nova-cell0-db-create-vcdq2" Nov 22 09:54:51 crc kubenswrapper[4743]: I1122 09:54:51.983023 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfbkf\" (UniqueName: \"kubernetes.io/projected/d16badcc-aa28-40f5-ac9e-71a23ee6e209-kube-api-access-wfbkf\") pod \"nova-api-8c6d-account-create-585ck\" (UID: \"d16badcc-aa28-40f5-ac9e-71a23ee6e209\") " pod="openstack/nova-api-8c6d-account-create-585ck" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.063681 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brw58\" (UniqueName: \"kubernetes.io/projected/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-kube-api-access-brw58\") pod \"nova-cell1-db-create-wbsdw\" (UID: \"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b\") " pod="openstack/nova-cell1-db-create-wbsdw" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.064086 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-operator-scripts\") pod \"nova-cell1-db-create-wbsdw\" (UID: \"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b\") " pod="openstack/nova-cell1-db-create-wbsdw" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.064202 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03224792-7994-47a0-bd4a-68c2e394b3c1-operator-scripts\") pod \"nova-cell0-c0ec-account-create-hfj6d\" (UID: \"03224792-7994-47a0-bd4a-68c2e394b3c1\") " pod="openstack/nova-cell0-c0ec-account-create-hfj6d" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.064241 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z86s7\" (UniqueName: \"kubernetes.io/projected/03224792-7994-47a0-bd4a-68c2e394b3c1-kube-api-access-z86s7\") pod \"nova-cell0-c0ec-account-create-hfj6d\" (UID: \"03224792-7994-47a0-bd4a-68c2e394b3c1\") " pod="openstack/nova-cell0-c0ec-account-create-hfj6d" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.064944 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-operator-scripts\") pod \"nova-cell1-db-create-wbsdw\" (UID: \"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b\") " pod="openstack/nova-cell1-db-create-wbsdw" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.075821 4743 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vcdq2" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.086291 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brw58\" (UniqueName: \"kubernetes.io/projected/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-kube-api-access-brw58\") pod \"nova-cell1-db-create-wbsdw\" (UID: \"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b\") " pod="openstack/nova-cell1-db-create-wbsdw" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.124014 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8c6d-account-create-585ck" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.163243 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-405d-account-create-842jc"] Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.164531 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-405d-account-create-842jc" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.165569 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03224792-7994-47a0-bd4a-68c2e394b3c1-operator-scripts\") pod \"nova-cell0-c0ec-account-create-hfj6d\" (UID: \"03224792-7994-47a0-bd4a-68c2e394b3c1\") " pod="openstack/nova-cell0-c0ec-account-create-hfj6d" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.165652 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z86s7\" (UniqueName: \"kubernetes.io/projected/03224792-7994-47a0-bd4a-68c2e394b3c1-kube-api-access-z86s7\") pod \"nova-cell0-c0ec-account-create-hfj6d\" (UID: \"03224792-7994-47a0-bd4a-68c2e394b3c1\") " pod="openstack/nova-cell0-c0ec-account-create-hfj6d" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.166541 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03224792-7994-47a0-bd4a-68c2e394b3c1-operator-scripts\") pod \"nova-cell0-c0ec-account-create-hfj6d\" (UID: \"03224792-7994-47a0-bd4a-68c2e394b3c1\") " pod="openstack/nova-cell0-c0ec-account-create-hfj6d" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.166999 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.175712 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-405d-account-create-842jc"] Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.187547 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wbsdw" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.195205 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z86s7\" (UniqueName: \"kubernetes.io/projected/03224792-7994-47a0-bd4a-68c2e394b3c1-kube-api-access-z86s7\") pod \"nova-cell0-c0ec-account-create-hfj6d\" (UID: \"03224792-7994-47a0-bd4a-68c2e394b3c1\") " pod="openstack/nova-cell0-c0ec-account-create-hfj6d" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.267142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b713ccde-5f99-47ae-8c30-78677acde194-operator-scripts\") pod \"nova-cell1-405d-account-create-842jc\" (UID: \"b713ccde-5f99-47ae-8c30-78677acde194\") " pod="openstack/nova-cell1-405d-account-create-842jc" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.267200 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cljbl\" (UniqueName: \"kubernetes.io/projected/b713ccde-5f99-47ae-8c30-78677acde194-kube-api-access-cljbl\") pod \"nova-cell1-405d-account-create-842jc\" (UID: \"b713ccde-5f99-47ae-8c30-78677acde194\") " pod="openstack/nova-cell1-405d-account-create-842jc" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.364075 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c0ec-account-create-hfj6d" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.368560 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b713ccde-5f99-47ae-8c30-78677acde194-operator-scripts\") pod \"nova-cell1-405d-account-create-842jc\" (UID: \"b713ccde-5f99-47ae-8c30-78677acde194\") " pod="openstack/nova-cell1-405d-account-create-842jc" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.368612 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cljbl\" (UniqueName: \"kubernetes.io/projected/b713ccde-5f99-47ae-8c30-78677acde194-kube-api-access-cljbl\") pod \"nova-cell1-405d-account-create-842jc\" (UID: \"b713ccde-5f99-47ae-8c30-78677acde194\") " pod="openstack/nova-cell1-405d-account-create-842jc" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.369225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b713ccde-5f99-47ae-8c30-78677acde194-operator-scripts\") pod \"nova-cell1-405d-account-create-842jc\" (UID: \"b713ccde-5f99-47ae-8c30-78677acde194\") " pod="openstack/nova-cell1-405d-account-create-842jc" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.385277 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cljbl\" (UniqueName: \"kubernetes.io/projected/b713ccde-5f99-47ae-8c30-78677acde194-kube-api-access-cljbl\") pod \"nova-cell1-405d-account-create-842jc\" (UID: \"b713ccde-5f99-47ae-8c30-78677acde194\") " pod="openstack/nova-cell1-405d-account-create-842jc" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.416010 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-m99g9"] Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.486822 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-405d-account-create-842jc" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.597987 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-vcdq2"] Nov 22 09:54:52 crc kubenswrapper[4743]: W1122 09:54:52.603827 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1325d49a_c56f_4183_a0aa_6f558767ccaa.slice/crio-ad9d66e6a85dcc7d1a746d914f817f19d5cd01a2c9377463d4e95fa1bf93cb9d WatchSource:0}: Error finding container ad9d66e6a85dcc7d1a746d914f817f19d5cd01a2c9377463d4e95fa1bf93cb9d: Status 404 returned error can't find the container with id ad9d66e6a85dcc7d1a746d914f817f19d5cd01a2c9377463d4e95fa1bf93cb9d Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.694424 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8c6d-account-create-585ck"] Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.778087 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wbsdw"] Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.872895 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-c0ec-account-create-hfj6d"] Nov 22 09:54:52 crc kubenswrapper[4743]: W1122 09:54:52.876764 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03224792_7994_47a0_bd4a_68c2e394b3c1.slice/crio-25950cc5e2e9339a37a564d1d4fcd9867b7909656a9c2dc53660293e4f6fea62 WatchSource:0}: Error finding container 25950cc5e2e9339a37a564d1d4fcd9867b7909656a9c2dc53660293e4f6fea62: Status 404 returned error can't find the container with id 25950cc5e2e9339a37a564d1d4fcd9867b7909656a9c2dc53660293e4f6fea62 Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.952162 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vcdq2" event={"ID":"1325d49a-c56f-4183-a0aa-6f558767ccaa","Type":"ContainerStarted","Data":"78c54715e656a40bd0ec04c4c311853dd98402a2f860139b315dc3f0fce6f086"} Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.952235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vcdq2" event={"ID":"1325d49a-c56f-4183-a0aa-6f558767ccaa","Type":"ContainerStarted","Data":"ad9d66e6a85dcc7d1a746d914f817f19d5cd01a2c9377463d4e95fa1bf93cb9d"} Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.954363 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m99g9" event={"ID":"073cacc3-d575-4696-a875-9181e9d250d9","Type":"ContainerStarted","Data":"5a2ee31798cb93bde59c4247b40c33bb401fc8c2048d9730c047ee8256c1df01"} Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.954409 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m99g9" event={"ID":"073cacc3-d575-4696-a875-9181e9d250d9","Type":"ContainerStarted","Data":"c4dd7390fa54991d24900922b9ce342c88d997bc37ecef2afa893b1c61f3991c"} Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.956099 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c0ec-account-create-hfj6d" event={"ID":"03224792-7994-47a0-bd4a-68c2e394b3c1","Type":"ContainerStarted","Data":"25950cc5e2e9339a37a564d1d4fcd9867b7909656a9c2dc53660293e4f6fea62"} Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.957020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-8c6d-account-create-585ck" event={"ID":"d16badcc-aa28-40f5-ac9e-71a23ee6e209","Type":"ContainerStarted","Data":"de0b3b92f7ca01e9dc6e78415f0c0493ec9e7f472d3d2aaf7d5d05c9f8ac4344"} Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.958014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wbsdw" event={"ID":"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b","Type":"ContainerStarted","Data":"f79a6f77b041dc26f128867ae4f59bddad1189f89865b0b10d73d1eae32c6f8d"} Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.973402 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-vcdq2" podStartSLOduration=1.973372172 podStartE2EDuration="1.973372172s" podCreationTimestamp="2025-11-22 09:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:54:52.964295871 +0000 UTC m=+5566.670656923" watchObservedRunningTime="2025-11-22 09:54:52.973372172 +0000 UTC m=+5566.679733224" Nov 22 09:54:52 crc kubenswrapper[4743]: I1122 09:54:52.986030 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-m99g9" podStartSLOduration=1.986011185 podStartE2EDuration="1.986011185s" podCreationTimestamp="2025-11-22 09:54:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:54:52.982081242 +0000 UTC m=+5566.688442294" watchObservedRunningTime="2025-11-22 09:54:52.986011185 +0000 UTC m=+5566.692372237" Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.021453 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-405d-account-create-842jc"] Nov 22 09:54:53 crc kubenswrapper[4743]: W1122 09:54:53.025468 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb713ccde_5f99_47ae_8c30_78677acde194.slice/crio-56ce74e8ab5c67811c643d25e5950d6d914152bef14ebc11474e305b94adbda7 WatchSource:0}: Error finding container 56ce74e8ab5c67811c643d25e5950d6d914152bef14ebc11474e305b94adbda7: Status 404 returned error can't find the container with id 56ce74e8ab5c67811c643d25e5950d6d914152bef14ebc11474e305b94adbda7 Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.974213 4743 generic.go:334] "Generic (PLEG): container finished" podID="dbb6248e-e10b-43ee-aa8c-d9e1bba1219b" containerID="f3ca7a23f6d5f7ad78bca3a8b38f4b6ef1385443b64340f3c33783956342f196" exitCode=0 Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.974276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wbsdw" event={"ID":"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b","Type":"ContainerDied","Data":"f3ca7a23f6d5f7ad78bca3a8b38f4b6ef1385443b64340f3c33783956342f196"} Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.978051 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-405d-account-create-842jc" event={"ID":"b713ccde-5f99-47ae-8c30-78677acde194","Type":"ContainerStarted","Data":"c0f8d107b3c1b021be89ff7af4d768f3502eb72eecc8a9d1f23650309fc5de0e"} Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.978079 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-405d-account-create-842jc" 
event={"ID":"b713ccde-5f99-47ae-8c30-78677acde194","Type":"ContainerStarted","Data":"56ce74e8ab5c67811c643d25e5950d6d914152bef14ebc11474e305b94adbda7"} Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.979613 4743 generic.go:334] "Generic (PLEG): container finished" podID="1325d49a-c56f-4183-a0aa-6f558767ccaa" containerID="78c54715e656a40bd0ec04c4c311853dd98402a2f860139b315dc3f0fce6f086" exitCode=0 Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.979643 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vcdq2" event={"ID":"1325d49a-c56f-4183-a0aa-6f558767ccaa","Type":"ContainerDied","Data":"78c54715e656a40bd0ec04c4c311853dd98402a2f860139b315dc3f0fce6f086"} Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.981388 4743 generic.go:334] "Generic (PLEG): container finished" podID="073cacc3-d575-4696-a875-9181e9d250d9" containerID="5a2ee31798cb93bde59c4247b40c33bb401fc8c2048d9730c047ee8256c1df01" exitCode=0 Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.981491 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m99g9" event={"ID":"073cacc3-d575-4696-a875-9181e9d250d9","Type":"ContainerDied","Data":"5a2ee31798cb93bde59c4247b40c33bb401fc8c2048d9730c047ee8256c1df01"} Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.989176 4743 generic.go:334] "Generic (PLEG): container finished" podID="03224792-7994-47a0-bd4a-68c2e394b3c1" containerID="24dc3fe94b7d6a358c33cf32e7da0a9a6c3ae5d2e3a6f83875c0e7cf16598a6b" exitCode=0 Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.989276 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c0ec-account-create-hfj6d" event={"ID":"03224792-7994-47a0-bd4a-68c2e394b3c1","Type":"ContainerDied","Data":"24dc3fe94b7d6a358c33cf32e7da0a9a6c3ae5d2e3a6f83875c0e7cf16598a6b"} Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.992948 4743 generic.go:334] "Generic (PLEG): container finished" podID="d16badcc-aa28-40f5-ac9e-71a23ee6e209" containerID="c232c18607f5202278acb57d1d051f46226a915c40cee6e8b85c62e948110e06" exitCode=0 Nov 22 09:54:53 crc kubenswrapper[4743]: I1122 09:54:53.992992 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8c6d-account-create-585ck" event={"ID":"d16badcc-aa28-40f5-ac9e-71a23ee6e209","Type":"ContainerDied","Data":"c232c18607f5202278acb57d1d051f46226a915c40cee6e8b85c62e948110e06"} Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.003058 4743 generic.go:334] "Generic (PLEG): container finished" podID="b713ccde-5f99-47ae-8c30-78677acde194" containerID="c0f8d107b3c1b021be89ff7af4d768f3502eb72eecc8a9d1f23650309fc5de0e" exitCode=0 Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.003108 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-405d-account-create-842jc" event={"ID":"b713ccde-5f99-47ae-8c30-78677acde194","Type":"ContainerDied","Data":"c0f8d107b3c1b021be89ff7af4d768f3502eb72eecc8a9d1f23650309fc5de0e"} Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.405798 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-vcdq2" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.430347 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vm95\" (UniqueName: \"kubernetes.io/projected/1325d49a-c56f-4183-a0aa-6f558767ccaa-kube-api-access-9vm95\") pod \"1325d49a-c56f-4183-a0aa-6f558767ccaa\" (UID: \"1325d49a-c56f-4183-a0aa-6f558767ccaa\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.430597 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1325d49a-c56f-4183-a0aa-6f558767ccaa-operator-scripts\") pod \"1325d49a-c56f-4183-a0aa-6f558767ccaa\" (UID: \"1325d49a-c56f-4183-a0aa-6f558767ccaa\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.431535 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1325d49a-c56f-4183-a0aa-6f558767ccaa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1325d49a-c56f-4183-a0aa-6f558767ccaa" (UID: "1325d49a-c56f-4183-a0aa-6f558767ccaa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.437483 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1325d49a-c56f-4183-a0aa-6f558767ccaa-kube-api-access-9vm95" (OuterVolumeSpecName: "kube-api-access-9vm95") pod "1325d49a-c56f-4183-a0aa-6f558767ccaa" (UID: "1325d49a-c56f-4183-a0aa-6f558767ccaa"). InnerVolumeSpecName "kube-api-access-9vm95". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.533632 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1325d49a-c56f-4183-a0aa-6f558767ccaa-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.533666 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vm95\" (UniqueName: \"kubernetes.io/projected/1325d49a-c56f-4183-a0aa-6f558767ccaa-kube-api-access-9vm95\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.621672 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m99g9" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.637345 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8c6d-account-create-585ck" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.653821 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-c0ec-account-create-hfj6d" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.671610 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wbsdw" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.673926 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-405d-account-create-842jc" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.736015 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jfrj\" (UniqueName: \"kubernetes.io/projected/073cacc3-d575-4696-a875-9181e9d250d9-kube-api-access-9jfrj\") pod \"073cacc3-d575-4696-a875-9181e9d250d9\" (UID: \"073cacc3-d575-4696-a875-9181e9d250d9\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.736161 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/073cacc3-d575-4696-a875-9181e9d250d9-operator-scripts\") pod \"073cacc3-d575-4696-a875-9181e9d250d9\" (UID: \"073cacc3-d575-4696-a875-9181e9d250d9\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.736709 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/073cacc3-d575-4696-a875-9181e9d250d9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "073cacc3-d575-4696-a875-9181e9d250d9" (UID: "073cacc3-d575-4696-a875-9181e9d250d9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.740144 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/073cacc3-d575-4696-a875-9181e9d250d9-kube-api-access-9jfrj" (OuterVolumeSpecName: "kube-api-access-9jfrj") pod "073cacc3-d575-4696-a875-9181e9d250d9" (UID: "073cacc3-d575-4696-a875-9181e9d250d9"). InnerVolumeSpecName "kube-api-access-9jfrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.838263 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfbkf\" (UniqueName: \"kubernetes.io/projected/d16badcc-aa28-40f5-ac9e-71a23ee6e209-kube-api-access-wfbkf\") pod \"d16badcc-aa28-40f5-ac9e-71a23ee6e209\" (UID: \"d16badcc-aa28-40f5-ac9e-71a23ee6e209\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.838332 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brw58\" (UniqueName: \"kubernetes.io/projected/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-kube-api-access-brw58\") pod \"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b\" (UID: \"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.838355 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d16badcc-aa28-40f5-ac9e-71a23ee6e209-operator-scripts\") pod \"d16badcc-aa28-40f5-ac9e-71a23ee6e209\" (UID: \"d16badcc-aa28-40f5-ac9e-71a23ee6e209\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.838376 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z86s7\" (UniqueName: \"kubernetes.io/projected/03224792-7994-47a0-bd4a-68c2e394b3c1-kube-api-access-z86s7\") pod \"03224792-7994-47a0-bd4a-68c2e394b3c1\" (UID: \"03224792-7994-47a0-bd4a-68c2e394b3c1\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.838393 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03224792-7994-47a0-bd4a-68c2e394b3c1-operator-scripts\") pod \"03224792-7994-47a0-bd4a-68c2e394b3c1\" (UID: \"03224792-7994-47a0-bd4a-68c2e394b3c1\") " Nov 22 09:54:55 crc 
kubenswrapper[4743]: I1122 09:54:55.838461 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cljbl\" (UniqueName: \"kubernetes.io/projected/b713ccde-5f99-47ae-8c30-78677acde194-kube-api-access-cljbl\") pod \"b713ccde-5f99-47ae-8c30-78677acde194\" (UID: \"b713ccde-5f99-47ae-8c30-78677acde194\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.838893 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d16badcc-aa28-40f5-ac9e-71a23ee6e209-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d16badcc-aa28-40f5-ac9e-71a23ee6e209" (UID: "d16badcc-aa28-40f5-ac9e-71a23ee6e209"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.838996 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03224792-7994-47a0-bd4a-68c2e394b3c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03224792-7994-47a0-bd4a-68c2e394b3c1" (UID: "03224792-7994-47a0-bd4a-68c2e394b3c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.838568 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-operator-scripts\") pod \"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b\" (UID: \"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.839398 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbb6248e-e10b-43ee-aa8c-d9e1bba1219b" (UID: "dbb6248e-e10b-43ee-aa8c-d9e1bba1219b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.839454 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b713ccde-5f99-47ae-8c30-78677acde194-operator-scripts\") pod \"b713ccde-5f99-47ae-8c30-78677acde194\" (UID: \"b713ccde-5f99-47ae-8c30-78677acde194\") " Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.839803 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b713ccde-5f99-47ae-8c30-78677acde194-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b713ccde-5f99-47ae-8c30-78677acde194" (UID: "b713ccde-5f99-47ae-8c30-78677acde194"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.840113 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.840134 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b713ccde-5f99-47ae-8c30-78677acde194-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.840146 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jfrj\" (UniqueName: \"kubernetes.io/projected/073cacc3-d575-4696-a875-9181e9d250d9-kube-api-access-9jfrj\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.840159 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/073cacc3-d575-4696-a875-9181e9d250d9-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.840170 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d16badcc-aa28-40f5-ac9e-71a23ee6e209-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.840180 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03224792-7994-47a0-bd4a-68c2e394b3c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.841741 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b713ccde-5f99-47ae-8c30-78677acde194-kube-api-access-cljbl" (OuterVolumeSpecName: "kube-api-access-cljbl") pod "b713ccde-5f99-47ae-8c30-78677acde194" (UID: "b713ccde-5f99-47ae-8c30-78677acde194"). InnerVolumeSpecName "kube-api-access-cljbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.842018 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16badcc-aa28-40f5-ac9e-71a23ee6e209-kube-api-access-wfbkf" (OuterVolumeSpecName: "kube-api-access-wfbkf") pod "d16badcc-aa28-40f5-ac9e-71a23ee6e209" (UID: "d16badcc-aa28-40f5-ac9e-71a23ee6e209"). InnerVolumeSpecName "kube-api-access-wfbkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.842138 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03224792-7994-47a0-bd4a-68c2e394b3c1-kube-api-access-z86s7" (OuterVolumeSpecName: "kube-api-access-z86s7") pod "03224792-7994-47a0-bd4a-68c2e394b3c1" (UID: "03224792-7994-47a0-bd4a-68c2e394b3c1"). InnerVolumeSpecName "kube-api-access-z86s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.842856 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-kube-api-access-brw58" (OuterVolumeSpecName: "kube-api-access-brw58") pod "dbb6248e-e10b-43ee-aa8c-d9e1bba1219b" (UID: "dbb6248e-e10b-43ee-aa8c-d9e1bba1219b"). InnerVolumeSpecName "kube-api-access-brw58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.940854 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfbkf\" (UniqueName: \"kubernetes.io/projected/d16badcc-aa28-40f5-ac9e-71a23ee6e209-kube-api-access-wfbkf\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.940887 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brw58\" (UniqueName: \"kubernetes.io/projected/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b-kube-api-access-brw58\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.940898 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z86s7\" (UniqueName: \"kubernetes.io/projected/03224792-7994-47a0-bd4a-68c2e394b3c1-kube-api-access-z86s7\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:55 crc kubenswrapper[4743]: I1122 09:54:55.940908 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cljbl\" (UniqueName: \"kubernetes.io/projected/b713ccde-5f99-47ae-8c30-78677acde194-kube-api-access-cljbl\") on node \"crc\" DevicePath \"\"" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.030175 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-405d-account-create-842jc" event={"ID":"b713ccde-5f99-47ae-8c30-78677acde194","Type":"ContainerDied","Data":"56ce74e8ab5c67811c643d25e5950d6d914152bef14ebc11474e305b94adbda7"} Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.030223 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56ce74e8ab5c67811c643d25e5950d6d914152bef14ebc11474e305b94adbda7" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.030227 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-405d-account-create-842jc" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.033393 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-vcdq2" event={"ID":"1325d49a-c56f-4183-a0aa-6f558767ccaa","Type":"ContainerDied","Data":"ad9d66e6a85dcc7d1a746d914f817f19d5cd01a2c9377463d4e95fa1bf93cb9d"} Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.033418 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9d66e6a85dcc7d1a746d914f817f19d5cd01a2c9377463d4e95fa1bf93cb9d" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.033421 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-vcdq2" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.035839 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-m99g9" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.035832 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-m99g9" event={"ID":"073cacc3-d575-4696-a875-9181e9d250d9","Type":"ContainerDied","Data":"c4dd7390fa54991d24900922b9ce342c88d997bc37ecef2afa893b1c61f3991c"} Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.035962 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4dd7390fa54991d24900922b9ce342c88d997bc37ecef2afa893b1c61f3991c" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.037620 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-c0ec-account-create-hfj6d" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.037667 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-c0ec-account-create-hfj6d" event={"ID":"03224792-7994-47a0-bd4a-68c2e394b3c1","Type":"ContainerDied","Data":"25950cc5e2e9339a37a564d1d4fcd9867b7909656a9c2dc53660293e4f6fea62"} Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.037691 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25950cc5e2e9339a37a564d1d4fcd9867b7909656a9c2dc53660293e4f6fea62" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.039293 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8c6d-account-create-585ck" event={"ID":"d16badcc-aa28-40f5-ac9e-71a23ee6e209","Type":"ContainerDied","Data":"de0b3b92f7ca01e9dc6e78415f0c0493ec9e7f472d3d2aaf7d5d05c9f8ac4344"} Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.039325 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de0b3b92f7ca01e9dc6e78415f0c0493ec9e7f472d3d2aaf7d5d05c9f8ac4344" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.039367 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8c6d-account-create-585ck" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.041346 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wbsdw" event={"ID":"dbb6248e-e10b-43ee-aa8c-d9e1bba1219b","Type":"ContainerDied","Data":"f79a6f77b041dc26f128867ae4f59bddad1189f89865b0b10d73d1eae32c6f8d"} Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.041387 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f79a6f77b041dc26f128867ae4f59bddad1189f89865b0b10d73d1eae32c6f8d" Nov 22 09:54:56 crc kubenswrapper[4743]: I1122 09:54:56.041449 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wbsdw" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.163333 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbrlx"] Nov 22 09:54:57 crc kubenswrapper[4743]: E1122 09:54:57.164003 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16badcc-aa28-40f5-ac9e-71a23ee6e209" containerName="mariadb-account-create" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164021 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16badcc-aa28-40f5-ac9e-71a23ee6e209" containerName="mariadb-account-create" Nov 22 09:54:57 crc kubenswrapper[4743]: E1122 09:54:57.164049 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbb6248e-e10b-43ee-aa8c-d9e1bba1219b" containerName="mariadb-database-create" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164057 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbb6248e-e10b-43ee-aa8c-d9e1bba1219b" containerName="mariadb-database-create" Nov 22 09:54:57 crc kubenswrapper[4743]: E1122 09:54:57.164075 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03224792-7994-47a0-bd4a-68c2e394b3c1" containerName="mariadb-account-create" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164083 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="03224792-7994-47a0-bd4a-68c2e394b3c1" containerName="mariadb-account-create" Nov 22 09:54:57 crc kubenswrapper[4743]: E1122 09:54:57.164097 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1325d49a-c56f-4183-a0aa-6f558767ccaa" containerName="mariadb-database-create" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164105 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1325d49a-c56f-4183-a0aa-6f558767ccaa" containerName="mariadb-database-create" Nov 22 09:54:57 crc kubenswrapper[4743]: E1122 09:54:57.164116 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b713ccde-5f99-47ae-8c30-78677acde194" containerName="mariadb-account-create" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164123 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b713ccde-5f99-47ae-8c30-78677acde194" containerName="mariadb-account-create" Nov 22 09:54:57 crc kubenswrapper[4743]: E1122 09:54:57.164148 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="073cacc3-d575-4696-a875-9181e9d250d9" containerName="mariadb-database-create" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164156 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="073cacc3-d575-4696-a875-9181e9d250d9" containerName="mariadb-database-create" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164361 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="03224792-7994-47a0-bd4a-68c2e394b3c1" containerName="mariadb-account-create" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164385 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b713ccde-5f99-47ae-8c30-78677acde194" containerName="mariadb-account-create" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164398 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16badcc-aa28-40f5-ac9e-71a23ee6e209" containerName="mariadb-account-create" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164411 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1325d49a-c56f-4183-a0aa-6f558767ccaa" containerName="mariadb-database-create" Nov 22 
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164427 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="073cacc3-d575-4696-a875-9181e9d250d9" containerName="mariadb-database-create"
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.164446 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbb6248e-e10b-43ee-aa8c-d9e1bba1219b" containerName="mariadb-database-create"
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.165215 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbrlx"
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.169537 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbrlx"]
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.177087 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.177307 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-khxpt"
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.177452 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.364253 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx"
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.364319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-scripts\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx"
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.364350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wht9w\" (UniqueName: \"kubernetes.io/projected/5c9cec77-56f5-4914-9309-19696765d0db-kube-api-access-wht9w\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx"
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.364382 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-config-data\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx"
Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.466568 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx"
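Each UniqueName above carries the volume plugin as a prefix (kubernetes.io/secret, kubernetes.io/projected, and so on). A sketch that derives the same classification from a pod spec follows, assuming in-cluster client-go access; the namespace and pod name are taken from the entries above.

```go
// Sketch: reproduce the plugin classification visible in the UniqueName
// prefixes above (kubernetes.io/secret, .../projected, ...) from a pod's
// volume sources. Client setup is assumed to be in-cluster.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)
	pod, err := client.CoreV1().Pods("openstack").Get(
		context.TODO(), "nova-cell0-conductor-db-sync-wbrlx", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, v := range pod.Spec.Volumes {
		switch {
		case v.Secret != nil:
			fmt.Println(v.Name, "-> kubernetes.io/secret")
		case v.Projected != nil:
			fmt.Println(v.Name, "-> kubernetes.io/projected")
		case v.ConfigMap != nil:
			fmt.Println(v.Name, "-> kubernetes.io/configmap")
		case v.EmptyDir != nil:
			fmt.Println(v.Name, "-> kubernetes.io/empty-dir")
		default:
			fmt.Println(v.Name, "-> other plugin")
		}
	}
}
```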
\"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-scripts\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.466681 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wht9w\" (UniqueName: \"kubernetes.io/projected/5c9cec77-56f5-4914-9309-19696765d0db-kube-api-access-wht9w\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.466749 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-config-data\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.470774 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-config-data\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.470881 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-scripts\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.475163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.497755 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wht9w\" (UniqueName: \"kubernetes.io/projected/5c9cec77-56f5-4914-9309-19696765d0db-kube-api-access-wht9w\") pod \"nova-cell0-conductor-db-sync-wbrlx\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " pod="openstack/nova-cell0-conductor-db-sync-wbrlx" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.502875 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbrlx" Nov 22 09:54:57 crc kubenswrapper[4743]: I1122 09:54:57.822864 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbrlx"] Nov 22 09:54:58 crc kubenswrapper[4743]: I1122 09:54:58.062526 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbrlx" event={"ID":"5c9cec77-56f5-4914-9309-19696765d0db","Type":"ContainerStarted","Data":"ddeb9b9c8454225ffb28236314a67faa3d8a1cf5d1639436f6cc528e201a4f8e"} Nov 22 09:54:58 crc kubenswrapper[4743]: I1122 09:54:58.062592 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbrlx" event={"ID":"5c9cec77-56f5-4914-9309-19696765d0db","Type":"ContainerStarted","Data":"e193edffed63c8ff99d855845d7466dac932ef56d5462c0fe9f940815b228538"} Nov 22 09:54:58 crc kubenswrapper[4743]: I1122 09:54:58.082115 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wbrlx" podStartSLOduration=1.082094101 podStartE2EDuration="1.082094101s" podCreationTimestamp="2025-11-22 09:54:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:54:58.077560961 +0000 UTC m=+5571.783922013" watchObservedRunningTime="2025-11-22 09:54:58.082094101 +0000 UTC m=+5571.788455143" Nov 22 09:55:01 crc kubenswrapper[4743]: I1122 09:55:01.241128 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:55:01 crc kubenswrapper[4743]: I1122 09:55:01.241816 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:55:04 crc kubenswrapper[4743]: I1122 09:55:04.112507 4743 generic.go:334] "Generic (PLEG): container finished" podID="5c9cec77-56f5-4914-9309-19696765d0db" containerID="ddeb9b9c8454225ffb28236314a67faa3d8a1cf5d1639436f6cc528e201a4f8e" exitCode=0 Nov 22 09:55:04 crc kubenswrapper[4743]: I1122 09:55:04.112601 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbrlx" event={"ID":"5c9cec77-56f5-4914-9309-19696765d0db","Type":"ContainerDied","Data":"ddeb9b9c8454225ffb28236314a67faa3d8a1cf5d1639436f6cc528e201a4f8e"} Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.448069 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbrlx" Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.607717 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-combined-ca-bundle\") pod \"5c9cec77-56f5-4914-9309-19696765d0db\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.607811 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-config-data\") pod \"5c9cec77-56f5-4914-9309-19696765d0db\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.607866 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-scripts\") pod \"5c9cec77-56f5-4914-9309-19696765d0db\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.608075 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wht9w\" (UniqueName: \"kubernetes.io/projected/5c9cec77-56f5-4914-9309-19696765d0db-kube-api-access-wht9w\") pod \"5c9cec77-56f5-4914-9309-19696765d0db\" (UID: \"5c9cec77-56f5-4914-9309-19696765d0db\") " Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.614638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9cec77-56f5-4914-9309-19696765d0db-kube-api-access-wht9w" (OuterVolumeSpecName: "kube-api-access-wht9w") pod "5c9cec77-56f5-4914-9309-19696765d0db" (UID: "5c9cec77-56f5-4914-9309-19696765d0db"). InnerVolumeSpecName "kube-api-access-wht9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.615897 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-scripts" (OuterVolumeSpecName: "scripts") pod "5c9cec77-56f5-4914-9309-19696765d0db" (UID: "5c9cec77-56f5-4914-9309-19696765d0db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.633940 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c9cec77-56f5-4914-9309-19696765d0db" (UID: "5c9cec77-56f5-4914-9309-19696765d0db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.634046 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-config-data" (OuterVolumeSpecName: "config-data") pod "5c9cec77-56f5-4914-9309-19696765d0db" (UID: "5c9cec77-56f5-4914-9309-19696765d0db"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.710248 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wht9w\" (UniqueName: \"kubernetes.io/projected/5c9cec77-56f5-4914-9309-19696765d0db-kube-api-access-wht9w\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.710296 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.710308 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:05 crc kubenswrapper[4743]: I1122 09:55:05.710318 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c9cec77-56f5-4914-9309-19696765d0db-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.143374 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wbrlx" event={"ID":"5c9cec77-56f5-4914-9309-19696765d0db","Type":"ContainerDied","Data":"e193edffed63c8ff99d855845d7466dac932ef56d5462c0fe9f940815b228538"} Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.143412 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e193edffed63c8ff99d855845d7466dac932ef56d5462c0fe9f940815b228538" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.143452 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wbrlx" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.209161 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:55:06 crc kubenswrapper[4743]: E1122 09:55:06.209913 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9cec77-56f5-4914-9309-19696765d0db" containerName="nova-cell0-conductor-db-sync" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.209936 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9cec77-56f5-4914-9309-19696765d0db" containerName="nova-cell0-conductor-db-sync" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.210090 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9cec77-56f5-4914-9309-19696765d0db" containerName="nova-cell0-conductor-db-sync" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.210754 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.212650 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-khxpt" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.215541 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.223973 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.325764 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw8dz\" (UniqueName: \"kubernetes.io/projected/f6baf6fa-4d48-46c6-92db-3512a541e3b4-kube-api-access-lw8dz\") pod \"nova-cell0-conductor-0\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.325894 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.325940 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.426988 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.427054 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.427157 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw8dz\" (UniqueName: \"kubernetes.io/projected/f6baf6fa-4d48-46c6-92db-3512a541e3b4-kube-api-access-lw8dz\") pod \"nova-cell0-conductor-0\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.431801 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.432354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.445045 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw8dz\" (UniqueName: \"kubernetes.io/projected/f6baf6fa-4d48-46c6-92db-3512a541e3b4-kube-api-access-lw8dz\") pod \"nova-cell0-conductor-0\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.537052 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:06 crc kubenswrapper[4743]: I1122 09:55:06.950409 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:55:07 crc kubenswrapper[4743]: I1122 09:55:07.174200 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f6baf6fa-4d48-46c6-92db-3512a541e3b4","Type":"ContainerStarted","Data":"8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589"} Nov 22 09:55:07 crc kubenswrapper[4743]: I1122 09:55:07.174285 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f6baf6fa-4d48-46c6-92db-3512a541e3b4","Type":"ContainerStarted","Data":"03b329669a9f7bf0ed08a869af519b986d09053ec92dc747e0a3b76735540216"} Nov 22 09:55:08 crc kubenswrapper[4743]: I1122 09:55:08.186423 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:08 crc kubenswrapper[4743]: I1122 09:55:08.235808 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.235769029 podStartE2EDuration="2.235769029s" podCreationTimestamp="2025-11-22 09:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:08.2065904 +0000 UTC m=+5581.912951462" watchObservedRunningTime="2025-11-22 09:55:08.235769029 +0000 UTC m=+5581.942130161" Nov 22 09:55:16 crc kubenswrapper[4743]: I1122 09:55:16.575072 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 22 09:55:16 crc kubenswrapper[4743]: I1122 09:55:16.981938 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-7wnxw"] Nov 22 09:55:16 crc kubenswrapper[4743]: I1122 09:55:16.983263 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:16 crc kubenswrapper[4743]: I1122 09:55:16.984795 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Nov 22 09:55:16 crc kubenswrapper[4743]: I1122 09:55:16.993808 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Nov 22 09:55:16 crc kubenswrapper[4743]: I1122 09:55:16.994332 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7wnxw"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.096827 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.100468 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.113600 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.121335 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.126226 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-config-data\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.126276 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.126311 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-scripts\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.126401 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmc96\" (UniqueName: \"kubernetes.io/projected/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-kube-api-access-wmc96\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.181449 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.182823 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.185907 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.190464 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.228145 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmc96\" (UniqueName: \"kubernetes.io/projected/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-kube-api-access-wmc96\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.228221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f745269d-eaa6-422e-ab96-5047a30f401e-logs\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.228289 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-config-data\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.228306 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.228335 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-scripts\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.228364 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.228390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-config-data\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.228434 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgqhv\" (UniqueName: \"kubernetes.io/projected/f745269d-eaa6-422e-ab96-5047a30f401e-kube-api-access-sgqhv\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.251103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-scripts\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.251820 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-config-data\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.253006 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.269103 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmc96\" (UniqueName: \"kubernetes.io/projected/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-kube-api-access-wmc96\") pod \"nova-cell0-cell-mapping-7wnxw\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") " pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.303371 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.303405 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7wnxw" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.304449 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.307069 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.316611 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.341126 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.341173 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-config-data\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.341239 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgqhv\" (UniqueName: \"kubernetes.io/projected/f745269d-eaa6-422e-ab96-5047a30f401e-kube-api-access-sgqhv\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.341268 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.341289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxsrr\" (UniqueName: \"kubernetes.io/projected/c0264257-a8c1-4140-9544-868ef00724f9-kube-api-access-xxsrr\") pod \"nova-scheduler-0\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.341334 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f745269d-eaa6-422e-ab96-5047a30f401e-logs\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.341507 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-config-data\") pod \"nova-scheduler-0\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.349301 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f745269d-eaa6-422e-ab96-5047a30f401e-logs\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.398892 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgqhv\" (UniqueName: \"kubernetes.io/projected/f745269d-eaa6-422e-ab96-5047a30f401e-kube-api-access-sgqhv\") 
pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.400161 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.402722 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-config-data\") pod \"nova-api-0\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") " pod="openstack/nova-api-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.418608 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.433733 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.435727 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.451142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl92q\" (UniqueName: \"kubernetes.io/projected/a12092a8-67c2-479d-81de-b879afb81749-kube-api-access-bl92q\") pod \"nova-cell1-novncproxy-0\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.451232 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.451251 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxsrr\" (UniqueName: \"kubernetes.io/projected/c0264257-a8c1-4140-9544-868ef00724f9-kube-api-access-xxsrr\") pod \"nova-scheduler-0\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.451306 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.451340 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-config-data\") pod \"nova-scheduler-0\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.451375 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " 
pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.454663 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.466470 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.476129 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-config-data\") pod \"nova-scheduler-0\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.493054 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-r9kkr"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.494587 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.502640 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxsrr\" (UniqueName: \"kubernetes.io/projected/c0264257-a8c1-4140-9544-868ef00724f9-kube-api-access-xxsrr\") pod \"nova-scheduler-0\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.510248 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.530281 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-r9kkr"] Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.557190 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.557320 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6nbq\" (UniqueName: \"kubernetes.io/projected/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-kube-api-access-n6nbq\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.557359 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-config-data\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.557385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.557469 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bl92q\" (UniqueName: \"kubernetes.io/projected/a12092a8-67c2-479d-81de-b879afb81749-kube-api-access-bl92q\") pod \"nova-cell1-novncproxy-0\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.557533 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.557590 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-logs\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.566392 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.567927 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.577225 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl92q\" (UniqueName: \"kubernetes.io/projected/a12092a8-67c2-479d-81de-b879afb81749-kube-api-access-bl92q\") pod \"nova-cell1-novncproxy-0\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.661948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.661992 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-logs\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.662074 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88st7\" (UniqueName: \"kubernetes.io/projected/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-kube-api-access-88st7\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.662101 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-dns-svc\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.662129 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-nb\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.662167 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-sb\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.662194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6nbq\" (UniqueName: \"kubernetes.io/projected/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-kube-api-access-n6nbq\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.662221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-config\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.662248 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-config-data\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.662657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-logs\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.666970 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-config-data\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.678657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.683918 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6nbq\" (UniqueName: \"kubernetes.io/projected/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-kube-api-access-n6nbq\") pod \"nova-metadata-0\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") " pod="openstack/nova-metadata-0" 
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.717900 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.764348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88st7\" (UniqueName: \"kubernetes.io/projected/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-kube-api-access-88st7\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.764400 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-dns-svc\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.764429 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-nb\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.764502 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-sb\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.764540 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-config\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.765430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-dns-svc\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.765736 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-sb\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.765920 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-config\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.767427 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-nb\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.780533 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88st7\" (UniqueName: \"kubernetes.io/projected/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-kube-api-access-88st7\") pod \"dnsmasq-dns-696f9966c7-r9kkr\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.792452 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.822923 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.844310 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:17 crc kubenswrapper[4743]: I1122 09:55:17.879477 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7wnxw"]
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.031369 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s7x4g"]
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.032925 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.035797 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.036073 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.040337 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s7x4g"]
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.049799 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.170318 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-scripts\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.170630 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.170680 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj7d6\" (UniqueName: \"kubernetes.io/projected/19ccec00-e345-40b9-a606-aa72c0d64b8b-kube-api-access-sj7d6\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.170767 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-config-data\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.233250 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.272055 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-scripts\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.273177 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.274356 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj7d6\" (UniqueName: \"kubernetes.io/projected/19ccec00-e345-40b9-a606-aa72c0d64b8b-kube-api-access-sj7d6\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.279652 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-scripts\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.284138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.284365 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-config-data\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.289044 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-config-data\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.304015 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj7d6\" (UniqueName: \"kubernetes.io/projected/19ccec00-e345-40b9-a606-aa72c0d64b8b-kube-api-access-sj7d6\") pod \"nova-cell1-conductor-db-sync-s7x4g\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") " pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.305036 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f745269d-eaa6-422e-ab96-5047a30f401e","Type":"ContainerStarted","Data":"60bd519572e2efe01bcecd648797ad5e3aee78c9d03bdf97be0d501ba3fdf708"}
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.310835 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0264257-a8c1-4140-9544-868ef00724f9","Type":"ContainerStarted","Data":"7f981aa087ea9ab5c1d6c03cd9adf6b8cf474a3204e459bff9850698240e8e9c"}
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.313612 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7wnxw" event={"ID":"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3","Type":"ContainerStarted","Data":"72c0a61b882e4f36382e61128f4e9a686449c486cd9a4f07bcb9c3a44a1f62ab"}
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.314024 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7wnxw" event={"ID":"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3","Type":"ContainerStarted","Data":"2fd102805b8e97920a5c902f618fc9d87787a781ff4adeed61271bef5051eb31"}
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.336245 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-7wnxw" podStartSLOduration=2.336220797 podStartE2EDuration="2.336220797s" podCreationTimestamp="2025-11-22 09:55:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:18.328448164 +0000 UTC m=+5592.034809216" watchObservedRunningTime="2025-11-22 09:55:18.336220797 +0000 UTC m=+5592.042581869"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.381149 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Nov 22 09:55:18 crc kubenswrapper[4743]: W1122 09:55:18.381841 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda12092a8_67c2_479d_81de_b879afb81749.slice/crio-b27d9f8ff12be71d9ebe4bd6a0ec0fa0395c97fcad92b11ff6018e6da9cf826f WatchSource:0}: Error finding container b27d9f8ff12be71d9ebe4bd6a0ec0fa0395c97fcad92b11ff6018e6da9cf826f: Status 404 returned error can't find the container with id b27d9f8ff12be71d9ebe4bd6a0ec0fa0395c97fcad92b11ff6018e6da9cf826f
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.384277 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.396054 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:18 crc kubenswrapper[4743]: W1122 09:55:18.417730 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6877e3d7_ebb6_46c4_baef_728f89ee3b3d.slice/crio-f22e4ddbfa84c75d84ba4910283662709f7f09de3d3c068ec5873aa41a605741 WatchSource:0}: Error finding container f22e4ddbfa84c75d84ba4910283662709f7f09de3d3c068ec5873aa41a605741: Status 404 returned error can't find the container with id f22e4ddbfa84c75d84ba4910283662709f7f09de3d3c068ec5873aa41a605741
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.546202 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-r9kkr"]
Nov 22 09:55:18 crc kubenswrapper[4743]: W1122 09:55:18.924563 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19ccec00_e345_40b9_a606_aa72c0d64b8b.slice/crio-ac628beb49ec64106cbdebd68fde0f6c35613ccb7321eb117230ee72b05fe704 WatchSource:0}: Error finding container ac628beb49ec64106cbdebd68fde0f6c35613ccb7321eb117230ee72b05fe704: Status 404 returned error can't find the container with id ac628beb49ec64106cbdebd68fde0f6c35613ccb7321eb117230ee72b05fe704
Nov 22 09:55:18 crc kubenswrapper[4743]: I1122 09:55:18.927363 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s7x4g"]
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.326790 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a12092a8-67c2-479d-81de-b879afb81749","Type":"ContainerStarted","Data":"2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.327119 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a12092a8-67c2-479d-81de-b879afb81749","Type":"ContainerStarted","Data":"b27d9f8ff12be71d9ebe4bd6a0ec0fa0395c97fcad92b11ff6018e6da9cf826f"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.330485 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6877e3d7-ebb6-46c4-baef-728f89ee3b3d","Type":"ContainerStarted","Data":"7af9edccf472cbb338e604d4b0c4f092b97a38cd421ca2124c653fce1f9ca5be"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.330511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6877e3d7-ebb6-46c4-baef-728f89ee3b3d","Type":"ContainerStarted","Data":"f22e4ddbfa84c75d84ba4910283662709f7f09de3d3c068ec5873aa41a605741"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.336073 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" containerID="48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4" exitCode=0
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.336120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" event={"ID":"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8","Type":"ContainerDied","Data":"48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.336140 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" event={"ID":"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8","Type":"ContainerStarted","Data":"d119379142b5b7413d5f17a98ff245e6c9435fd1bc5b917337d957897f318119"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.340270 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s7x4g" event={"ID":"19ccec00-e345-40b9-a606-aa72c0d64b8b","Type":"ContainerStarted","Data":"8be01ab90b71902b48d522f7659269920a4bf9b3e4878b40405b422ea3d7929b"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.340298 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s7x4g" event={"ID":"19ccec00-e345-40b9-a606-aa72c0d64b8b","Type":"ContainerStarted","Data":"ac628beb49ec64106cbdebd68fde0f6c35613ccb7321eb117230ee72b05fe704"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.345018 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f745269d-eaa6-422e-ab96-5047a30f401e","Type":"ContainerStarted","Data":"54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.345047 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f745269d-eaa6-422e-ab96-5047a30f401e","Type":"ContainerStarted","Data":"55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.350297 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0264257-a8c1-4140-9544-868ef00724f9","Type":"ContainerStarted","Data":"fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4"}
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.368400 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.368380463 podStartE2EDuration="2.368380463s" podCreationTimestamp="2025-11-22 09:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:19.359785646 +0000 UTC m=+5593.066146708" watchObservedRunningTime="2025-11-22 09:55:19.368380463 +0000 UTC m=+5593.074741525"
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.386395 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.38637355 podStartE2EDuration="2.38637355s" podCreationTimestamp="2025-11-22 09:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:19.383546519 +0000 UTC m=+5593.089907571" watchObservedRunningTime="2025-11-22 09:55:19.38637355 +0000 UTC m=+5593.092734602"
Nov 22 09:55:19 crc kubenswrapper[4743]: I1122 09:55:19.453267 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-s7x4g" podStartSLOduration=1.453230691 podStartE2EDuration="1.453230691s" podCreationTimestamp="2025-11-22 09:55:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:19.405810318 +0000 UTC m=+5593.112171370" watchObservedRunningTime="2025-11-22 09:55:19.453230691 +0000 UTC m=+5593.159591743"
Nov 22 09:55:20 crc kubenswrapper[4743]: I1122 09:55:20.378646 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6877e3d7-ebb6-46c4-baef-728f89ee3b3d","Type":"ContainerStarted","Data":"f340843ec3b251c51cc91ec838a04378a83ff513fc42c70d97e0819002c986f9"}
Nov 22 09:55:20 crc kubenswrapper[4743]: I1122 09:55:20.382122 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" event={"ID":"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8","Type":"ContainerStarted","Data":"c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777"}
Nov 22 09:55:20 crc kubenswrapper[4743]: I1122 09:55:20.408695 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.408675602 podStartE2EDuration="3.408675602s" podCreationTimestamp="2025-11-22 09:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:20.399822398 +0000 UTC m=+5594.106183460" watchObservedRunningTime="2025-11-22 09:55:20.408675602 +0000 UTC m=+5594.115036654"
Nov 22 09:55:20 crc kubenswrapper[4743]: I1122 09:55:20.420199 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.420179123 podStartE2EDuration="3.420179123s" podCreationTimestamp="2025-11-22 09:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:20.419142303 +0000 UTC m=+5594.125503355" watchObservedRunningTime="2025-11-22 09:55:20.420179123 +0000 UTC m=+5594.126540175"
Nov 22 09:55:20 crc kubenswrapper[4743]: I1122 09:55:20.442069 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" podStartSLOduration=3.442050261 podStartE2EDuration="3.442050261s" podCreationTimestamp="2025-11-22 09:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:20.435430291 +0000 UTC m=+5594.141791343" watchObservedRunningTime="2025-11-22 09:55:20.442050261 +0000 UTC m=+5594.148411313"
Nov 22 09:55:21 crc kubenswrapper[4743]: I1122 09:55:21.396369 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:22 crc kubenswrapper[4743]: I1122 09:55:22.511106 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Nov 22 09:55:22 crc kubenswrapper[4743]: I1122 09:55:22.793041 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:55:22 crc kubenswrapper[4743]: I1122 09:55:22.824411 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 22 09:55:22 crc kubenswrapper[4743]: I1122 09:55:22.824732 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Nov 22 09:55:23 crc kubenswrapper[4743]: I1122 09:55:23.415124 4743 generic.go:334] "Generic (PLEG): container finished" podID="9fd1bcc9-26a2-493f-be9e-30dfb052cbc3" containerID="72c0a61b882e4f36382e61128f4e9a686449c486cd9a4f07bcb9c3a44a1f62ab" exitCode=0
Nov 22 09:55:23 crc kubenswrapper[4743]: I1122 09:55:23.415207 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7wnxw" event={"ID":"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3","Type":"ContainerDied","Data":"72c0a61b882e4f36382e61128f4e9a686449c486cd9a4f07bcb9c3a44a1f62ab"}
Nov 22 09:55:24 crc kubenswrapper[4743]: I1122 09:55:24.752339 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7wnxw"
Nov 22 09:55:24 crc kubenswrapper[4743]: I1122 09:55:24.916205 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-combined-ca-bundle\") pod \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") "
Nov 22 09:55:24 crc kubenswrapper[4743]: I1122 09:55:24.916363 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-config-data\") pod \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") "
Nov 22 09:55:24 crc kubenswrapper[4743]: I1122 09:55:24.916419 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmc96\" (UniqueName: \"kubernetes.io/projected/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-kube-api-access-wmc96\") pod \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") "
Nov 22 09:55:24 crc kubenswrapper[4743]: I1122 09:55:24.916489 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-scripts\") pod \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\" (UID: \"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3\") "
Nov 22 09:55:24 crc kubenswrapper[4743]: I1122 09:55:24.922032 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-scripts" (OuterVolumeSpecName: "scripts") pod "9fd1bcc9-26a2-493f-be9e-30dfb052cbc3" (UID: "9fd1bcc9-26a2-493f-be9e-30dfb052cbc3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:24 crc kubenswrapper[4743]: I1122 09:55:24.922813 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-kube-api-access-wmc96" (OuterVolumeSpecName: "kube-api-access-wmc96") pod "9fd1bcc9-26a2-493f-be9e-30dfb052cbc3" (UID: "9fd1bcc9-26a2-493f-be9e-30dfb052cbc3"). InnerVolumeSpecName "kube-api-access-wmc96". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:55:24 crc kubenswrapper[4743]: I1122 09:55:24.945170 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fd1bcc9-26a2-493f-be9e-30dfb052cbc3" (UID: "9fd1bcc9-26a2-493f-be9e-30dfb052cbc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:24 crc kubenswrapper[4743]: I1122 09:55:24.948647 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-config-data" (OuterVolumeSpecName: "config-data") pod "9fd1bcc9-26a2-493f-be9e-30dfb052cbc3" (UID: "9fd1bcc9-26a2-493f-be9e-30dfb052cbc3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.018331 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.018363 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.018373 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmc96\" (UniqueName: \"kubernetes.io/projected/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-kube-api-access-wmc96\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.018383 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.436519 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7wnxw" event={"ID":"9fd1bcc9-26a2-493f-be9e-30dfb052cbc3","Type":"ContainerDied","Data":"2fd102805b8e97920a5c902f618fc9d87787a781ff4adeed61271bef5051eb31"}
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.436563 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fd102805b8e97920a5c902f618fc9d87787a781ff4adeed61271bef5051eb31"
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.436636 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7wnxw"
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.615965 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.616201 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f745269d-eaa6-422e-ab96-5047a30f401e" containerName="nova-api-log" containerID="cri-o://55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3" gracePeriod=30
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.616278 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f745269d-eaa6-422e-ab96-5047a30f401e" containerName="nova-api-api" containerID="cri-o://54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e" gracePeriod=30
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.625471 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.625907 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c0264257-a8c1-4140-9544-868ef00724f9" containerName="nova-scheduler-scheduler" containerID="cri-o://fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4" gracePeriod=30
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.640536 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.640789 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" containerName="nova-metadata-log" containerID="cri-o://7af9edccf472cbb338e604d4b0c4f092b97a38cd421ca2124c653fce1f9ca5be" gracePeriod=30
Nov 22 09:55:25 crc kubenswrapper[4743]: I1122 09:55:25.641281 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" containerName="nova-metadata-metadata" containerID="cri-o://f340843ec3b251c51cc91ec838a04378a83ff513fc42c70d97e0819002c986f9" gracePeriod=30
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.355871 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.448460 4743 generic.go:334] "Generic (PLEG): container finished" podID="f745269d-eaa6-422e-ab96-5047a30f401e" containerID="54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e" exitCode=0
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.448493 4743 generic.go:334] "Generic (PLEG): container finished" podID="f745269d-eaa6-422e-ab96-5047a30f401e" containerID="55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3" exitCode=143
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.448520 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.448616 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f745269d-eaa6-422e-ab96-5047a30f401e","Type":"ContainerDied","Data":"54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e"}
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.448650 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f745269d-eaa6-422e-ab96-5047a30f401e","Type":"ContainerDied","Data":"55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3"}
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.448664 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f745269d-eaa6-422e-ab96-5047a30f401e","Type":"ContainerDied","Data":"60bd519572e2efe01bcecd648797ad5e3aee78c9d03bdf97be0d501ba3fdf708"}
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.448683 4743 scope.go:117] "RemoveContainer" containerID="54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.452949 4743 generic.go:334] "Generic (PLEG): container finished" podID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" containerID="f340843ec3b251c51cc91ec838a04378a83ff513fc42c70d97e0819002c986f9" exitCode=0
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.452965 4743 generic.go:334] "Generic (PLEG): container finished" podID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" containerID="7af9edccf472cbb338e604d4b0c4f092b97a38cd421ca2124c653fce1f9ca5be" exitCode=143
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.452981 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6877e3d7-ebb6-46c4-baef-728f89ee3b3d","Type":"ContainerDied","Data":"f340843ec3b251c51cc91ec838a04378a83ff513fc42c70d97e0819002c986f9"}
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.453001 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6877e3d7-ebb6-46c4-baef-728f89ee3b3d","Type":"ContainerDied","Data":"7af9edccf472cbb338e604d4b0c4f092b97a38cd421ca2124c653fce1f9ca5be"}
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.474916 4743 scope.go:117] "RemoveContainer" containerID="55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.493270 4743 scope.go:117] "RemoveContainer" containerID="54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e"
Nov 22 09:55:26 crc kubenswrapper[4743]: E1122 09:55:26.493988 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e\": container with ID starting with 54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e not found: ID does not exist" containerID="54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.494036 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e"} err="failed to get container status \"54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e\": rpc error: code = NotFound desc = could not find container \"54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e\": container with ID starting with 54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e not found: ID does not exist"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.494065 4743 scope.go:117] "RemoveContainer" containerID="55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3"
Nov 22 09:55:26 crc kubenswrapper[4743]: E1122 09:55:26.494472 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3\": container with ID starting with 55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3 not found: ID does not exist" containerID="55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.494493 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3"} err="failed to get container status \"55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3\": rpc error: code = NotFound desc = could not find container \"55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3\": container with ID starting with 55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3 not found: ID does not exist"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.494508 4743 scope.go:117] "RemoveContainer" containerID="54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.494860 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e"} err="failed to get container status \"54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e\": rpc error: code = NotFound desc = could not find container \"54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e\": container with ID starting with 54f588522f5bd00703d9f7c4cef8cf89bb87915cab0e6f4850602421c657205e not found: ID does not exist"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.494905 4743 scope.go:117] "RemoveContainer" containerID="55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.495241 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3"} err="failed to get container status \"55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3\": rpc error: code = NotFound desc = could not find container \"55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3\": container with ID starting with 55935cdf3d3dca007b8b27b004f4e56bcfda0ae2d374188cf26972a5af865cf3 not found: ID does not exist"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.546959 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-config-data\") pod \"f745269d-eaa6-422e-ab96-5047a30f401e\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") "
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.547005 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgqhv\" (UniqueName: \"kubernetes.io/projected/f745269d-eaa6-422e-ab96-5047a30f401e-kube-api-access-sgqhv\") pod \"f745269d-eaa6-422e-ab96-5047a30f401e\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") "
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.547123 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-combined-ca-bundle\") pod \"f745269d-eaa6-422e-ab96-5047a30f401e\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") "
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.547160 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f745269d-eaa6-422e-ab96-5047a30f401e-logs\") pod \"f745269d-eaa6-422e-ab96-5047a30f401e\" (UID: \"f745269d-eaa6-422e-ab96-5047a30f401e\") "
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.547840 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f745269d-eaa6-422e-ab96-5047a30f401e-logs" (OuterVolumeSpecName: "logs") pod "f745269d-eaa6-422e-ab96-5047a30f401e" (UID: "f745269d-eaa6-422e-ab96-5047a30f401e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.552390 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f745269d-eaa6-422e-ab96-5047a30f401e-kube-api-access-sgqhv" (OuterVolumeSpecName: "kube-api-access-sgqhv") pod "f745269d-eaa6-422e-ab96-5047a30f401e" (UID: "f745269d-eaa6-422e-ab96-5047a30f401e"). InnerVolumeSpecName "kube-api-access-sgqhv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.572674 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-config-data" (OuterVolumeSpecName: "config-data") pod "f745269d-eaa6-422e-ab96-5047a30f401e" (UID: "f745269d-eaa6-422e-ab96-5047a30f401e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.579958 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f745269d-eaa6-422e-ab96-5047a30f401e" (UID: "f745269d-eaa6-422e-ab96-5047a30f401e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.633549 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.648757 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.648795 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgqhv\" (UniqueName: \"kubernetes.io/projected/f745269d-eaa6-422e-ab96-5047a30f401e-kube-api-access-sgqhv\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.648805 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f745269d-eaa6-422e-ab96-5047a30f401e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.648815 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f745269d-eaa6-422e-ab96-5047a30f401e-logs\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.749815 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-combined-ca-bundle\") pod \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") "
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.749867 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-config-data\") pod \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") "
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.749955 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-logs\") pod \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") "
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.749982 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6nbq\" (UniqueName: \"kubernetes.io/projected/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-kube-api-access-n6nbq\") pod \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") "
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.750844 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-logs" (OuterVolumeSpecName: "logs") pod "6877e3d7-ebb6-46c4-baef-728f89ee3b3d" (UID: "6877e3d7-ebb6-46c4-baef-728f89ee3b3d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.753337 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-kube-api-access-n6nbq" (OuterVolumeSpecName: "kube-api-access-n6nbq") pod "6877e3d7-ebb6-46c4-baef-728f89ee3b3d" (UID: "6877e3d7-ebb6-46c4-baef-728f89ee3b3d"). InnerVolumeSpecName "kube-api-access-n6nbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:55:26 crc kubenswrapper[4743]: E1122 09:55:26.769714 4743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-config-data podName:6877e3d7-ebb6-46c4-baef-728f89ee3b3d nodeName:}" failed. No retries permitted until 2025-11-22 09:55:27.269482566 +0000 UTC m=+5600.975843618 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-config-data") pod "6877e3d7-ebb6-46c4-baef-728f89ee3b3d" (UID: "6877e3d7-ebb6-46c4-baef-728f89ee3b3d") : error deleting /var/lib/kubelet/pods/6877e3d7-ebb6-46c4-baef-728f89ee3b3d/volume-subpaths: remove /var/lib/kubelet/pods/6877e3d7-ebb6-46c4-baef-728f89ee3b3d/volume-subpaths: no such file or directory
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.771850 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6877e3d7-ebb6-46c4-baef-728f89ee3b3d" (UID: "6877e3d7-ebb6-46c4-baef-728f89ee3b3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.851799 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.852420 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-logs\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.852445 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6nbq\" (UniqueName: \"kubernetes.io/projected/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-kube-api-access-n6nbq\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.891846 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.899972 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.911821 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:26 crc kubenswrapper[4743]: E1122 09:55:26.912343 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f745269d-eaa6-422e-ab96-5047a30f401e" containerName="nova-api-log"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.912361 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f745269d-eaa6-422e-ab96-5047a30f401e" containerName="nova-api-log"
Nov 22 09:55:26 crc kubenswrapper[4743]: E1122 09:55:26.912377 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f745269d-eaa6-422e-ab96-5047a30f401e" containerName="nova-api-api"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.912383 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f745269d-eaa6-422e-ab96-5047a30f401e" containerName="nova-api-api"
Nov 22 09:55:26 crc kubenswrapper[4743]: E1122 09:55:26.912407 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd1bcc9-26a2-493f-be9e-30dfb052cbc3" containerName="nova-manage"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.912413 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd1bcc9-26a2-493f-be9e-30dfb052cbc3" containerName="nova-manage"
Nov 22 09:55:26 crc kubenswrapper[4743]: E1122 09:55:26.912428 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" containerName="nova-metadata-metadata"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.912435 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" containerName="nova-metadata-metadata"
Nov 22 09:55:26 crc kubenswrapper[4743]: E1122 09:55:26.912446 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" containerName="nova-metadata-log"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.912452 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" containerName="nova-metadata-log"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.912616 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" containerName="nova-metadata-log"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.912630 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" containerName="nova-metadata-metadata"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.912645 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd1bcc9-26a2-493f-be9e-30dfb052cbc3" containerName="nova-manage"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.912664 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f745269d-eaa6-422e-ab96-5047a30f401e" containerName="nova-api-api"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.912677 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f745269d-eaa6-422e-ab96-5047a30f401e" containerName="nova-api-log"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.914074 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.920218 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 22 09:55:26 crc kubenswrapper[4743]: I1122 09:55:26.927486 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.055441 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-config-data\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.062619 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.062727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7czl\" (UniqueName: \"kubernetes.io/projected/e02b392d-4d2e-44a8-a746-c108ddd5c289-kube-api-access-n7czl\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.062914 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e02b392d-4d2e-44a8-a746-c108ddd5c289-logs\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.162726 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f745269d-eaa6-422e-ab96-5047a30f401e" path="/var/lib/kubelet/pods/f745269d-eaa6-422e-ab96-5047a30f401e/volumes"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.164461 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-config-data\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.164558 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.164645 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7czl\" (UniqueName: \"kubernetes.io/projected/e02b392d-4d2e-44a8-a746-c108ddd5c289-kube-api-access-n7czl\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.164737 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e02b392d-4d2e-44a8-a746-c108ddd5c289-logs\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.165076 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e02b392d-4d2e-44a8-a746-c108ddd5c289-logs\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.168398 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.178459 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-config-data\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.183473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7czl\" (UniqueName: \"kubernetes.io/projected/e02b392d-4d2e-44a8-a746-c108ddd5c289-kube-api-access-n7czl\") pod \"nova-api-0\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") " pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.246171 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.367225 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-config-data\") pod \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\" (UID: \"6877e3d7-ebb6-46c4-baef-728f89ee3b3d\") "
Nov 22 09:55:27 crc kubenswrapper[4743]: I1122 09:55:27.371971 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-config-data" (OuterVolumeSpecName: "config-data") pod "6877e3d7-ebb6-46c4-baef-728f89ee3b3d" (UID: "6877e3d7-ebb6-46c4-baef-728f89ee3b3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.463169 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6877e3d7-ebb6-46c4-baef-728f89ee3b3d","Type":"ContainerDied","Data":"f22e4ddbfa84c75d84ba4910283662709f7f09de3d3c068ec5873aa41a605741"}
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.463220 4743 scope.go:117] "RemoveContainer" containerID="f340843ec3b251c51cc91ec838a04378a83ff513fc42c70d97e0819002c986f9"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.463246 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.465379 4743 generic.go:334] "Generic (PLEG): container finished" podID="19ccec00-e345-40b9-a606-aa72c0d64b8b" containerID="8be01ab90b71902b48d522f7659269920a4bf9b3e4878b40405b422ea3d7929b" exitCode=0
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.465486 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s7x4g" event={"ID":"19ccec00-e345-40b9-a606-aa72c0d64b8b","Type":"ContainerDied","Data":"8be01ab90b71902b48d522f7659269920a4bf9b3e4878b40405b422ea3d7929b"}
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.469814 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6877e3d7-ebb6-46c4-baef-728f89ee3b3d-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.487446 4743 scope.go:117] "RemoveContainer" containerID="7af9edccf472cbb338e604d4b0c4f092b97a38cd421ca2124c653fce1f9ca5be"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.513341 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.531029 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.537452 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.539435 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.546154 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.547008 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.570960 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c056ae8-a828-4698-b855-3cb3b7c34936-logs\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.571127 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bs47\" (UniqueName: \"kubernetes.io/projected/6c056ae8-a828-4698-b855-3cb3b7c34936-kube-api-access-9bs47\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.571183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.571240 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-config-data\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.651418 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.672352 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c056ae8-a828-4698-b855-3cb3b7c34936-logs\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.672397 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bs47\" (UniqueName: \"kubernetes.io/projected/6c056ae8-a828-4698-b855-3cb3b7c34936-kube-api-access-9bs47\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.672440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.672496 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-config-data\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.672832 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c056ae8-a828-4698-b855-3cb3b7c34936-logs\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.677999 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-config-data\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.678336 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.687124 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bs47\" (UniqueName: \"kubernetes.io/projected/6c056ae8-a828-4698-b855-3cb3b7c34936-kube-api-access-9bs47\") pod \"nova-metadata-0\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.793712 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.804538 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.845800 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.861862 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.904441 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-trz9s"]
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:27.904726 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74df65d56c-trz9s" podUID="a35d8e35-5277-4a64-a3e9-0c3d7382671c" containerName="dnsmasq-dns" containerID="cri-o://25cad52d4532c852d7a71b4ec33b03722848080d2c216b861fcb2ae05182affe" gracePeriod=10
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:28.481329 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e02b392d-4d2e-44a8-a746-c108ddd5c289","Type":"ContainerStarted","Data":"114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6"}
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:28.481885 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e02b392d-4d2e-44a8-a746-c108ddd5c289","Type":"ContainerStarted","Data":"909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8"}
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:28.481907 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e02b392d-4d2e-44a8-a746-c108ddd5c289","Type":"ContainerStarted","Data":"9614281566e3e24402eeb2243f33c97f39c9de2f8da349ac59ac0680d778422a"}
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:28.484020 4743 generic.go:334] "Generic (PLEG): container finished" podID="a35d8e35-5277-4a64-a3e9-0c3d7382671c" containerID="25cad52d4532c852d7a71b4ec33b03722848080d2c216b861fcb2ae05182affe" exitCode=0
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:28.486107 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-trz9s" event={"ID":"a35d8e35-5277-4a64-a3e9-0c3d7382671c","Type":"ContainerDied","Data":"25cad52d4532c852d7a71b4ec33b03722848080d2c216b861fcb2ae05182affe"}
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:28.497757 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:55:28 crc kubenswrapper[4743]: I1122 09:55:28.507369 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5073507680000002 podStartE2EDuration="2.507350768s" podCreationTimestamp="2025-11-22 09:55:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:28.503076215 +0000 UTC m=+5602.209437257" watchObservedRunningTime="2025-11-22 09:55:28.507350768 +0000 UTC m=+5602.213711810"
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.128513 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74df65d56c-trz9s"
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.135494 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s7x4g"
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.173392 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6877e3d7-ebb6-46c4-baef-728f89ee3b3d" path="/var/lib/kubelet/pods/6877e3d7-ebb6-46c4-baef-728f89ee3b3d/volumes"
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.311233 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-config\") pod \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") "
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.311277 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-nb\") pod \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") "
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.311335 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-scripts\") pod \"19ccec00-e345-40b9-a606-aa72c0d64b8b\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") "
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.311394 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj7d6\" (UniqueName: \"kubernetes.io/projected/19ccec00-e345-40b9-a606-aa72c0d64b8b-kube-api-access-sj7d6\") pod \"19ccec00-e345-40b9-a606-aa72c0d64b8b\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") "
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.311475 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-dns-svc\") pod \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") "
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.311495 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw5zf\" (UniqueName: \"kubernetes.io/projected/a35d8e35-5277-4a64-a3e9-0c3d7382671c-kube-api-access-bw5zf\") pod \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") "
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.311516 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-combined-ca-bundle\") pod \"19ccec00-e345-40b9-a606-aa72c0d64b8b\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") "
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.311537 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-sb\") pod \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\" (UID: \"a35d8e35-5277-4a64-a3e9-0c3d7382671c\") "
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.311560 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-config-data\") pod \"19ccec00-e345-40b9-a606-aa72c0d64b8b\" (UID: \"19ccec00-e345-40b9-a606-aa72c0d64b8b\") "
Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.335000 
4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-scripts" (OuterVolumeSpecName: "scripts") pod "19ccec00-e345-40b9-a606-aa72c0d64b8b" (UID: "19ccec00-e345-40b9-a606-aa72c0d64b8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.335692 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a35d8e35-5277-4a64-a3e9-0c3d7382671c-kube-api-access-bw5zf" (OuterVolumeSpecName: "kube-api-access-bw5zf") pod "a35d8e35-5277-4a64-a3e9-0c3d7382671c" (UID: "a35d8e35-5277-4a64-a3e9-0c3d7382671c"). InnerVolumeSpecName "kube-api-access-bw5zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.336539 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ccec00-e345-40b9-a606-aa72c0d64b8b-kube-api-access-sj7d6" (OuterVolumeSpecName: "kube-api-access-sj7d6") pod "19ccec00-e345-40b9-a606-aa72c0d64b8b" (UID: "19ccec00-e345-40b9-a606-aa72c0d64b8b"). InnerVolumeSpecName "kube-api-access-sj7d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.340394 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-config-data" (OuterVolumeSpecName: "config-data") pod "19ccec00-e345-40b9-a606-aa72c0d64b8b" (UID: "19ccec00-e345-40b9-a606-aa72c0d64b8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.345960 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19ccec00-e345-40b9-a606-aa72c0d64b8b" (UID: "19ccec00-e345-40b9-a606-aa72c0d64b8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.363911 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a35d8e35-5277-4a64-a3e9-0c3d7382671c" (UID: "a35d8e35-5277-4a64-a3e9-0c3d7382671c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.373556 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-config" (OuterVolumeSpecName: "config") pod "a35d8e35-5277-4a64-a3e9-0c3d7382671c" (UID: "a35d8e35-5277-4a64-a3e9-0c3d7382671c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.378424 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.397734 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a35d8e35-5277-4a64-a3e9-0c3d7382671c" (UID: "a35d8e35-5277-4a64-a3e9-0c3d7382671c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.406424 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a35d8e35-5277-4a64-a3e9-0c3d7382671c" (UID: "a35d8e35-5277-4a64-a3e9-0c3d7382671c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.414159 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj7d6\" (UniqueName: \"kubernetes.io/projected/19ccec00-e345-40b9-a606-aa72c0d64b8b-kube-api-access-sj7d6\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.414214 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.414226 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw5zf\" (UniqueName: \"kubernetes.io/projected/a35d8e35-5277-4a64-a3e9-0c3d7382671c-kube-api-access-bw5zf\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.414236 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.414245 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.414254 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.414262 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.414287 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a35d8e35-5277-4a64-a3e9-0c3d7382671c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.414296 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19ccec00-e345-40b9-a606-aa72c0d64b8b-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.496446 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-s7x4g" event={"ID":"19ccec00-e345-40b9-a606-aa72c0d64b8b","Type":"ContainerDied","Data":"ac628beb49ec64106cbdebd68fde0f6c35613ccb7321eb117230ee72b05fe704"} Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.496705 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac628beb49ec64106cbdebd68fde0f6c35613ccb7321eb117230ee72b05fe704" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.496815 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-s7x4g" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.511302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c056ae8-a828-4698-b855-3cb3b7c34936","Type":"ContainerStarted","Data":"390ff8b981a24f64284f874bb50ec1d1f1b8735dfdc72d6a321024e550128341"} Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.516020 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74df65d56c-trz9s" event={"ID":"a35d8e35-5277-4a64-a3e9-0c3d7382671c","Type":"ContainerDied","Data":"1201de1a6390af2a0790fd54c229f1efe73458c7cb7b8ab0c5e7644639e8ad66"} Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.516096 4743 scope.go:117] "RemoveContainer" containerID="25cad52d4532c852d7a71b4ec33b03722848080d2c216b861fcb2ae05182affe" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.516038 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74df65d56c-trz9s" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.546150 4743 scope.go:117] "RemoveContainer" containerID="9d5bfd13bb18dc50f43a94b646988b63e2cf5182332ec57215e66e7a12d52014" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.586227 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:55:29 crc kubenswrapper[4743]: E1122 09:55:29.587316 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a35d8e35-5277-4a64-a3e9-0c3d7382671c" containerName="dnsmasq-dns" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.587341 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35d8e35-5277-4a64-a3e9-0c3d7382671c" containerName="dnsmasq-dns" Nov 22 09:55:29 crc kubenswrapper[4743]: E1122 09:55:29.587354 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19ccec00-e345-40b9-a606-aa72c0d64b8b" containerName="nova-cell1-conductor-db-sync" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.587361 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="19ccec00-e345-40b9-a606-aa72c0d64b8b" containerName="nova-cell1-conductor-db-sync" Nov 22 09:55:29 crc kubenswrapper[4743]: E1122 09:55:29.587387 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a35d8e35-5277-4a64-a3e9-0c3d7382671c" containerName="init" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.587393 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35d8e35-5277-4a64-a3e9-0c3d7382671c" containerName="init" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.587824 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a35d8e35-5277-4a64-a3e9-0c3d7382671c" containerName="dnsmasq-dns" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.587843 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="19ccec00-e345-40b9-a606-aa72c0d64b8b" containerName="nova-cell1-conductor-db-sync" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.588726 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.591793 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.612934 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.628234 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-trz9s"] Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.644738 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74df65d56c-trz9s"] Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.718668 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.718897 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.719193 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl2sk\" (UniqueName: \"kubernetes.io/projected/f7a220c5-b663-44ac-82f5-46769c94f7a3-kube-api-access-kl2sk\") pod \"nova-cell1-conductor-0\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.821292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl2sk\" (UniqueName: \"kubernetes.io/projected/f7a220c5-b663-44ac-82f5-46769c94f7a3-kube-api-access-kl2sk\") pod \"nova-cell1-conductor-0\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.821358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.821409 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.828147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.828662 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.841263 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl2sk\" (UniqueName: \"kubernetes.io/projected/f7a220c5-b663-44ac-82f5-46769c94f7a3-kube-api-access-kl2sk\") pod \"nova-cell1-conductor-0\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:29 crc kubenswrapper[4743]: I1122 09:55:29.922867 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.149143 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.328330 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-config-data\") pod \"c0264257-a8c1-4140-9544-868ef00724f9\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.328536 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-combined-ca-bundle\") pod \"c0264257-a8c1-4140-9544-868ef00724f9\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.328602 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxsrr\" (UniqueName: \"kubernetes.io/projected/c0264257-a8c1-4140-9544-868ef00724f9-kube-api-access-xxsrr\") pod \"c0264257-a8c1-4140-9544-868ef00724f9\" (UID: \"c0264257-a8c1-4140-9544-868ef00724f9\") " Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.333261 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0264257-a8c1-4140-9544-868ef00724f9-kube-api-access-xxsrr" (OuterVolumeSpecName: "kube-api-access-xxsrr") pod "c0264257-a8c1-4140-9544-868ef00724f9" (UID: "c0264257-a8c1-4140-9544-868ef00724f9"). InnerVolumeSpecName "kube-api-access-xxsrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.353986 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-config-data" (OuterVolumeSpecName: "config-data") pod "c0264257-a8c1-4140-9544-868ef00724f9" (UID: "c0264257-a8c1-4140-9544-868ef00724f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.362398 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0264257-a8c1-4140-9544-868ef00724f9" (UID: "c0264257-a8c1-4140-9544-868ef00724f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.419818 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:55:30 crc kubenswrapper[4743]: W1122 09:55:30.425312 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7a220c5_b663_44ac_82f5_46769c94f7a3.slice/crio-97b10309ce826eb01b89192fd2622c303b199fda0bff4c30c03631ae5dff9b4c WatchSource:0}: Error finding container 97b10309ce826eb01b89192fd2622c303b199fda0bff4c30c03631ae5dff9b4c: Status 404 returned error can't find the container with id 97b10309ce826eb01b89192fd2622c303b199fda0bff4c30c03631ae5dff9b4c Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.430777 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.430802 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxsrr\" (UniqueName: \"kubernetes.io/projected/c0264257-a8c1-4140-9544-868ef00724f9-kube-api-access-xxsrr\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.430812 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0264257-a8c1-4140-9544-868ef00724f9-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.528786 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c056ae8-a828-4698-b855-3cb3b7c34936","Type":"ContainerStarted","Data":"e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049"} Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.528852 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c056ae8-a828-4698-b855-3cb3b7c34936","Type":"ContainerStarted","Data":"e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075"} Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.533084 4743 generic.go:334] "Generic (PLEG): container finished" podID="c0264257-a8c1-4140-9544-868ef00724f9" containerID="fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4" exitCode=0 Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.533254 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.533814 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0264257-a8c1-4140-9544-868ef00724f9","Type":"ContainerDied","Data":"fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4"} Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.533858 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c0264257-a8c1-4140-9544-868ef00724f9","Type":"ContainerDied","Data":"7f981aa087ea9ab5c1d6c03cd9adf6b8cf474a3204e459bff9850698240e8e9c"} Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.533874 4743 scope.go:117] "RemoveContainer" containerID="fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.537255 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7a220c5-b663-44ac-82f5-46769c94f7a3","Type":"ContainerStarted","Data":"97b10309ce826eb01b89192fd2622c303b199fda0bff4c30c03631ae5dff9b4c"} Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.550625 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.550601292 podStartE2EDuration="3.550601292s" podCreationTimestamp="2025-11-22 09:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:30.548471061 +0000 UTC m=+5604.254832113" watchObservedRunningTime="2025-11-22 09:55:30.550601292 +0000 UTC m=+5604.256962344" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.585567 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.588590 4743 scope.go:117] "RemoveContainer" containerID="fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4" Nov 22 09:55:30 crc kubenswrapper[4743]: E1122 09:55:30.589179 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4\": container with ID starting with fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4 not found: ID does not exist" containerID="fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.589235 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4"} err="failed to get container status \"fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4\": rpc error: code = NotFound desc = could not find container \"fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4\": container with ID starting with fdb9392888035018a523866032340de8b46c79e3a1f767bebf9004c3a77705b4 not found: ID does not exist" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.596499 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.612648 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:55:30 crc kubenswrapper[4743]: E1122 09:55:30.613074 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c0264257-a8c1-4140-9544-868ef00724f9" containerName="nova-scheduler-scheduler" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.613092 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0264257-a8c1-4140-9544-868ef00724f9" containerName="nova-scheduler-scheduler" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.613293 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0264257-a8c1-4140-9544-868ef00724f9" containerName="nova-scheduler-scheduler" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.613972 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.617189 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.629586 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.736408 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jhdw\" (UniqueName: \"kubernetes.io/projected/2b908124-d25c-461a-994b-04fd8742c0f7-kube-api-access-9jhdw\") pod \"nova-scheduler-0\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.736627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.736687 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-config-data\") pod \"nova-scheduler-0\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.838834 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.838908 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-config-data\") pod \"nova-scheduler-0\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.838959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jhdw\" (UniqueName: \"kubernetes.io/projected/2b908124-d25c-461a-994b-04fd8742c0f7-kube-api-access-9jhdw\") pod \"nova-scheduler-0\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.843187 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-config-data\") pod \"nova-scheduler-0\" (UID: 
\"2b908124-d25c-461a-994b-04fd8742c0f7\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.843359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.854525 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jhdw\" (UniqueName: \"kubernetes.io/projected/2b908124-d25c-461a-994b-04fd8742c0f7-kube-api-access-9jhdw\") pod \"nova-scheduler-0\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") " pod="openstack/nova-scheduler-0" Nov 22 09:55:30 crc kubenswrapper[4743]: I1122 09:55:30.931811 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.164740 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a35d8e35-5277-4a64-a3e9-0c3d7382671c" path="/var/lib/kubelet/pods/a35d8e35-5277-4a64-a3e9-0c3d7382671c/volumes" Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.165824 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0264257-a8c1-4140-9544-868ef00724f9" path="/var/lib/kubelet/pods/c0264257-a8c1-4140-9544-868ef00724f9/volumes" Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.241425 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.241501 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.241561 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.242535 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3eb3f31edfdd1f055cdfe5d03bc7d720df5251d68525b808fed42d018024ae04"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.242615 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://3eb3f31edfdd1f055cdfe5d03bc7d720df5251d68525b808fed42d018024ae04" gracePeriod=600 Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.348884 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:55:31 crc kubenswrapper[4743]: W1122 09:55:31.359421 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b908124_d25c_461a_994b_04fd8742c0f7.slice/crio-c7edc9919be7a730c2404bb768823914935b2c18a8732b6bad1f130801b2cad5 WatchSource:0}: Error finding container c7edc9919be7a730c2404bb768823914935b2c18a8732b6bad1f130801b2cad5: Status 404 returned error can't find the container with id c7edc9919be7a730c2404bb768823914935b2c18a8732b6bad1f130801b2cad5 Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.549610 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7a220c5-b663-44ac-82f5-46769c94f7a3","Type":"ContainerStarted","Data":"f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c"} Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.550649 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.552616 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="3eb3f31edfdd1f055cdfe5d03bc7d720df5251d68525b808fed42d018024ae04" exitCode=0 Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.552676 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"3eb3f31edfdd1f055cdfe5d03bc7d720df5251d68525b808fed42d018024ae04"} Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.552706 4743 scope.go:117] "RemoveContainer" containerID="c83b7228db434708f1f26210f7780d19fe5a6b7b63e5662df7ba7d315896ef56" Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.554012 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b908124-d25c-461a-994b-04fd8742c0f7","Type":"ContainerStarted","Data":"c7edc9919be7a730c2404bb768823914935b2c18a8732b6bad1f130801b2cad5"} Nov 22 09:55:31 crc kubenswrapper[4743]: I1122 09:55:31.571511 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.571417462 podStartE2EDuration="2.571417462s" podCreationTimestamp="2025-11-22 09:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:31.564881954 +0000 UTC m=+5605.271243026" watchObservedRunningTime="2025-11-22 09:55:31.571417462 +0000 UTC m=+5605.277778514" Nov 22 09:55:32 crc kubenswrapper[4743]: I1122 09:55:32.570781 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68"} Nov 22 09:55:32 crc kubenswrapper[4743]: I1122 09:55:32.572387 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b908124-d25c-461a-994b-04fd8742c0f7","Type":"ContainerStarted","Data":"483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac"} Nov 22 09:55:32 crc kubenswrapper[4743]: I1122 09:55:32.603352 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6033332590000002 podStartE2EDuration="2.603333259s" podCreationTimestamp="2025-11-22 09:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:32.599569311 +0000 UTC m=+5606.305930363" watchObservedRunningTime="2025-11-22 09:55:32.603333259 +0000 UTC m=+5606.309694311" Nov 22 09:55:32 crc kubenswrapper[4743]: I1122 09:55:32.821052 4743 scope.go:117] "RemoveContainer" containerID="33e393f310301fb9ee065eff959f6b18430a6ef659cf7dae231730b614e33f68" Nov 22 09:55:32 crc kubenswrapper[4743]: I1122 09:55:32.862824 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:55:32 crc kubenswrapper[4743]: I1122 09:55:32.862964 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:55:35 crc kubenswrapper[4743]: I1122 09:55:35.931984 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 09:55:37 crc kubenswrapper[4743]: I1122 09:55:37.246967 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:55:37 crc kubenswrapper[4743]: I1122 09:55:37.247337 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:55:37 crc kubenswrapper[4743]: I1122 09:55:37.863032 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 09:55:37 crc kubenswrapper[4743]: I1122 09:55:37.863088 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 09:55:38 crc kubenswrapper[4743]: I1122 09:55:38.349923 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.65:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:55:38 crc kubenswrapper[4743]: I1122 09:55:38.350143 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.65:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:55:38 crc kubenswrapper[4743]: I1122 09:55:38.945784 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:55:38 crc kubenswrapper[4743]: I1122 09:55:38.946111 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:55:39 crc kubenswrapper[4743]: I1122 09:55:39.954354 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.384432 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-6ps4b"] Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.386171 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.396115 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6ps4b"] Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.429442 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.429990 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.431245 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-scripts\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.431329 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-config-data\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.431380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.431491 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w8bl\" (UniqueName: \"kubernetes.io/projected/ba2b2d2d-594a-4b08-baf1-d021c012f86a-kube-api-access-6w8bl\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.533634 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.533727 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w8bl\" (UniqueName: \"kubernetes.io/projected/ba2b2d2d-594a-4b08-baf1-d021c012f86a-kube-api-access-6w8bl\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.533830 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-scripts\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.533887 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-config-data\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.540687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-scripts\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.540779 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-config-data\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.551748 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.559683 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w8bl\" (UniqueName: \"kubernetes.io/projected/ba2b2d2d-594a-4b08-baf1-d021c012f86a-kube-api-access-6w8bl\") pod \"nova-cell1-cell-mapping-6ps4b\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") " pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.756879 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6ps4b" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.932730 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 09:55:40 crc kubenswrapper[4743]: I1122 09:55:40.972134 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 09:55:41 crc kubenswrapper[4743]: I1122 09:55:41.209914 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-6ps4b"] Nov 22 09:55:41 crc kubenswrapper[4743]: W1122 09:55:41.218853 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba2b2d2d_594a_4b08_baf1_d021c012f86a.slice/crio-f5c8acb1392a855c2edbd62f36a8832c82e03f33357d288c0f376dee2255a1cd WatchSource:0}: Error finding container f5c8acb1392a855c2edbd62f36a8832c82e03f33357d288c0f376dee2255a1cd: Status 404 returned error can't find the container with id f5c8acb1392a855c2edbd62f36a8832c82e03f33357d288c0f376dee2255a1cd Nov 22 09:55:41 crc kubenswrapper[4743]: I1122 09:55:41.661027 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6ps4b" event={"ID":"ba2b2d2d-594a-4b08-baf1-d021c012f86a","Type":"ContainerStarted","Data":"9dc1ca414f493e4915e2648211e37abf27205e7a9436de039ad5286bfb54d5a4"} Nov 22 09:55:41 crc kubenswrapper[4743]: I1122 09:55:41.661563 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6ps4b" event={"ID":"ba2b2d2d-594a-4b08-baf1-d021c012f86a","Type":"ContainerStarted","Data":"f5c8acb1392a855c2edbd62f36a8832c82e03f33357d288c0f376dee2255a1cd"} Nov 22 09:55:41 crc kubenswrapper[4743]: I1122 09:55:41.685001 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-6ps4b" podStartSLOduration=1.684979607 podStartE2EDuration="1.684979607s" podCreationTimestamp="2025-11-22 09:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:41.677063669 +0000 UTC m=+5615.383424721" watchObservedRunningTime="2025-11-22 09:55:41.684979607 +0000 UTC m=+5615.391340659" Nov 22 09:55:41 crc kubenswrapper[4743]: I1122 09:55:41.693343 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 09:55:46 crc kubenswrapper[4743]: I1122 09:55:46.727673 4743 generic.go:334] "Generic (PLEG): container finished" podID="ba2b2d2d-594a-4b08-baf1-d021c012f86a" containerID="9dc1ca414f493e4915e2648211e37abf27205e7a9436de039ad5286bfb54d5a4" exitCode=0 Nov 22 09:55:46 crc kubenswrapper[4743]: I1122 09:55:46.727756 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6ps4b" event={"ID":"ba2b2d2d-594a-4b08-baf1-d021c012f86a","Type":"ContainerDied","Data":"9dc1ca414f493e4915e2648211e37abf27205e7a9436de039ad5286bfb54d5a4"} Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.249658 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.250663 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.250807 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 
09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.253020 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.736873 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.742385 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.868160 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.874357 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.874709 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.910707 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qh62j"]
Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.914323 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:47 crc kubenswrapper[4743]: I1122 09:55:47.934838 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qh62j"]
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.094927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-sb\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.094975 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-nb\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.095038 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkh2v\" (UniqueName: \"kubernetes.io/projected/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-kube-api-access-tkh2v\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.095076 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-dns-svc\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.095095 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-config\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.125467 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6ps4b"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.196448 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-sb\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.196494 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-nb\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.196569 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkh2v\" (UniqueName: \"kubernetes.io/projected/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-kube-api-access-tkh2v\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.196633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-dns-svc\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.196655 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-config\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.197626 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-sb\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.197631 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-config\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.199019 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-nb\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.199800 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-dns-svc\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.235863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkh2v\" (UniqueName: \"kubernetes.io/projected/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-kube-api-access-tkh2v\") pod \"dnsmasq-dns-64b8d7d4fc-qh62j\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.243617 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.297936 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-scripts\") pod \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") "
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.298254 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-config-data\") pod \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") "
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.298289 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-combined-ca-bundle\") pod \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") "
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.298324 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w8bl\" (UniqueName: \"kubernetes.io/projected/ba2b2d2d-594a-4b08-baf1-d021c012f86a-kube-api-access-6w8bl\") pod \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\" (UID: \"ba2b2d2d-594a-4b08-baf1-d021c012f86a\") "
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.303553 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2b2d2d-594a-4b08-baf1-d021c012f86a-kube-api-access-6w8bl" (OuterVolumeSpecName: "kube-api-access-6w8bl") pod "ba2b2d2d-594a-4b08-baf1-d021c012f86a" (UID: "ba2b2d2d-594a-4b08-baf1-d021c012f86a"). InnerVolumeSpecName "kube-api-access-6w8bl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.303940 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-scripts" (OuterVolumeSpecName: "scripts") pod "ba2b2d2d-594a-4b08-baf1-d021c012f86a" (UID: "ba2b2d2d-594a-4b08-baf1-d021c012f86a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.329646 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-config-data" (OuterVolumeSpecName: "config-data") pod "ba2b2d2d-594a-4b08-baf1-d021c012f86a" (UID: "ba2b2d2d-594a-4b08-baf1-d021c012f86a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.335070 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba2b2d2d-594a-4b08-baf1-d021c012f86a" (UID: "ba2b2d2d-594a-4b08-baf1-d021c012f86a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.401729 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-scripts\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.402402 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.402422 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2b2d2d-594a-4b08-baf1-d021c012f86a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.402437 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w8bl\" (UniqueName: \"kubernetes.io/projected/ba2b2d2d-594a-4b08-baf1-d021c012f86a-kube-api-access-6w8bl\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.756797 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-6ps4b" event={"ID":"ba2b2d2d-594a-4b08-baf1-d021c012f86a","Type":"ContainerDied","Data":"f5c8acb1392a855c2edbd62f36a8832c82e03f33357d288c0f376dee2255a1cd"}
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.757256 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5c8acb1392a855c2edbd62f36a8832c82e03f33357d288c0f376dee2255a1cd"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.757502 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-6ps4b"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.759462 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.788968 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qh62j"]
Nov 22 09:55:48 crc kubenswrapper[4743]: W1122 09:55:48.806224 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56209ff1_f81f_4fa2_be9e_d3387f03a7a7.slice/crio-41b6331a27bdffea327e4ae19c1f43f29d05737e091b0bb0b5571cf898b7adba WatchSource:0}: Error finding container 41b6331a27bdffea327e4ae19c1f43f29d05737e091b0bb0b5571cf898b7adba: Status 404 returned error can't find the container with id 41b6331a27bdffea327e4ae19c1f43f29d05737e091b0bb0b5571cf898b7adba
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.958062 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.975194 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.975384 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2b908124-d25c-461a-994b-04fd8742c0f7" containerName="nova-scheduler-scheduler" containerID="cri-o://483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac" gracePeriod=30
Nov 22 09:55:48 crc kubenswrapper[4743]: I1122 09:55:48.983025 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:49 crc kubenswrapper[4743]: I1122 09:55:49.767251 4743 generic.go:334] "Generic (PLEG): container finished" podID="56209ff1-f81f-4fa2-be9e-d3387f03a7a7" containerID="09e314644d1ac19ffd87beba67e8a4ec3b28113ddf94afd68e1f5069d3fd7a85" exitCode=0
Nov 22 09:55:49 crc kubenswrapper[4743]: I1122 09:55:49.767355 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j" event={"ID":"56209ff1-f81f-4fa2-be9e-d3387f03a7a7","Type":"ContainerDied","Data":"09e314644d1ac19ffd87beba67e8a4ec3b28113ddf94afd68e1f5069d3fd7a85"}
Nov 22 09:55:49 crc kubenswrapper[4743]: I1122 09:55:49.767857 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j" event={"ID":"56209ff1-f81f-4fa2-be9e-d3387f03a7a7","Type":"ContainerStarted","Data":"41b6331a27bdffea327e4ae19c1f43f29d05737e091b0bb0b5571cf898b7adba"}
Nov 22 09:55:50 crc kubenswrapper[4743]: I1122 09:55:50.777476 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-log" containerID="cri-o://e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075" gracePeriod=30
Nov 22 09:55:50 crc kubenswrapper[4743]: I1122 09:55:50.778771 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j" event={"ID":"56209ff1-f81f-4fa2-be9e-d3387f03a7a7","Type":"ContainerStarted","Data":"87a30abfcd59ad2cb87cb358f309afae08a4d6060569e74d50f3b883bb3f113c"}
Nov 22 09:55:50 crc kubenswrapper[4743]: I1122 09:55:50.778806 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:50 crc kubenswrapper[4743]: I1122 09:55:50.778903 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerName="nova-api-log" containerID="cri-o://909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8" gracePeriod=30
Nov 22 09:55:50 crc kubenswrapper[4743]: I1122 09:55:50.780211 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-metadata" containerID="cri-o://e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049" gracePeriod=30
Nov 22 09:55:50 crc kubenswrapper[4743]: I1122 09:55:50.780290 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerName="nova-api-api" containerID="cri-o://114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6" gracePeriod=30
Nov 22 09:55:50 crc kubenswrapper[4743]: I1122 09:55:50.810896 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j" podStartSLOduration=3.810881105 podStartE2EDuration="3.810881105s" podCreationTimestamp="2025-11-22 09:55:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:50.809417283 +0000 UTC m=+5624.515778335" watchObservedRunningTime="2025-11-22 09:55:50.810881105 +0000 UTC m=+5624.517242157"
Nov 22 09:55:50 crc kubenswrapper[4743]: E1122 09:55:50.881794 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c056ae8_a828_4698_b855_3cb3b7c34936.slice/crio-conmon-e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode02b392d_4d2e_44a8_a746_c108ddd5c289.slice/crio-909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8.scope\": RecentStats: unable to find data in memory cache]"
Nov 22 09:55:50 crc kubenswrapper[4743]: E1122 09:55:50.937688 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 22 09:55:50 crc kubenswrapper[4743]: E1122 09:55:50.940043 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 22 09:55:50 crc kubenswrapper[4743]: E1122 09:55:50.941602 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 22 09:55:50 crc kubenswrapper[4743]: E1122 09:55:50.941658 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2b908124-d25c-461a-994b-04fd8742c0f7" containerName="nova-scheduler-scheduler"
Nov 22 09:55:51 crc kubenswrapper[4743]: I1122 09:55:51.790502 4743 generic.go:334] "Generic (PLEG): container finished" podID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerID="909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8" exitCode=143
Nov 22 09:55:51 crc kubenswrapper[4743]: I1122 09:55:51.790603 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e02b392d-4d2e-44a8-a746-c108ddd5c289","Type":"ContainerDied","Data":"909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8"}
Nov 22 09:55:51 crc kubenswrapper[4743]: I1122 09:55:51.792302 4743 generic.go:334] "Generic (PLEG): container finished" podID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerID="e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075" exitCode=143
Nov 22 09:55:51 crc kubenswrapper[4743]: I1122 09:55:51.792338 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c056ae8-a828-4698-b855-3cb3b7c34936","Type":"ContainerDied","Data":"e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075"}
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.780455 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.812268 4743 generic.go:334] "Generic (PLEG): container finished" podID="2b908124-d25c-461a-994b-04fd8742c0f7" containerID="483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac" exitCode=0
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.812311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b908124-d25c-461a-994b-04fd8742c0f7","Type":"ContainerDied","Data":"483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac"}
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.812323 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.812335 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2b908124-d25c-461a-994b-04fd8742c0f7","Type":"ContainerDied","Data":"c7edc9919be7a730c2404bb768823914935b2c18a8732b6bad1f130801b2cad5"}
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.812347 4743 scope.go:117] "RemoveContainer" containerID="483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac"
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.834446 4743 scope.go:117] "RemoveContainer" containerID="483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac"
Nov 22 09:55:53 crc kubenswrapper[4743]: E1122 09:55:53.835028 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac\": container with ID starting with 483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac not found: ID does not exist" containerID="483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac"
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.835061 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac"} err="failed to get container status \"483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac\": rpc error: code = NotFound desc = could not find container \"483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac\": container with ID starting with 483dcbef861004c3db57cbbf0460b5fb6a6bf7e5baf484459b94e5bf47b026ac not found: ID does not exist"
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.906353 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-combined-ca-bundle\") pod \"2b908124-d25c-461a-994b-04fd8742c0f7\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") "
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.906442 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-config-data\") pod \"2b908124-d25c-461a-994b-04fd8742c0f7\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") "
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.906653 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jhdw\" (UniqueName: \"kubernetes.io/projected/2b908124-d25c-461a-994b-04fd8742c0f7-kube-api-access-9jhdw\") pod \"2b908124-d25c-461a-994b-04fd8742c0f7\" (UID: \"2b908124-d25c-461a-994b-04fd8742c0f7\") "
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.910943 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b908124-d25c-461a-994b-04fd8742c0f7-kube-api-access-9jhdw" (OuterVolumeSpecName: "kube-api-access-9jhdw") pod "2b908124-d25c-461a-994b-04fd8742c0f7" (UID: "2b908124-d25c-461a-994b-04fd8742c0f7"). InnerVolumeSpecName "kube-api-access-9jhdw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.920970 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": read tcp 10.217.0.2:38756->10.217.1.66:8775: read: connection reset by peer"
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.921326 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.66:8775/\": read tcp 10.217.0.2:38748->10.217.1.66:8775: read: connection reset by peer"
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.933642 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-config-data" (OuterVolumeSpecName: "config-data") pod "2b908124-d25c-461a-994b-04fd8742c0f7" (UID: "2b908124-d25c-461a-994b-04fd8742c0f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:53 crc kubenswrapper[4743]: I1122 09:55:53.934230 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b908124-d25c-461a-994b-04fd8742c0f7" (UID: "2b908124-d25c-461a-994b-04fd8742c0f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.008763 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jhdw\" (UniqueName: \"kubernetes.io/projected/2b908124-d25c-461a-994b-04fd8742c0f7-kube-api-access-9jhdw\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.008813 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.008826 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b908124-d25c-461a-994b-04fd8742c0f7-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.171537 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.181283 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.193673 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 09:55:54 crc kubenswrapper[4743]: E1122 09:55:54.194239 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2b2d2d-594a-4b08-baf1-d021c012f86a" containerName="nova-manage"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.194266 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2b2d2d-594a-4b08-baf1-d021c012f86a" containerName="nova-manage"
Nov 22 09:55:54 crc kubenswrapper[4743]: E1122 09:55:54.194284 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b908124-d25c-461a-994b-04fd8742c0f7" containerName="nova-scheduler-scheduler"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.194293 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b908124-d25c-461a-994b-04fd8742c0f7" containerName="nova-scheduler-scheduler"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.194487 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b908124-d25c-461a-994b-04fd8742c0f7" containerName="nova-scheduler-scheduler"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.194520 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2b2d2d-594a-4b08-baf1-d021c012f86a" containerName="nova-manage"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.195264 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.198360 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.202844 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.323557 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kftdq\" (UniqueName: \"kubernetes.io/projected/7743a75b-660a-489f-a88c-4fe0e0c793e8-kube-api-access-kftdq\") pod \"nova-scheduler-0\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.323738 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.324886 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-config-data\") pod \"nova-scheduler-0\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.368724 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.431848 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.431958 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-config-data\") pod \"nova-scheduler-0\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.432135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kftdq\" (UniqueName: \"kubernetes.io/projected/7743a75b-660a-489f-a88c-4fe0e0c793e8-kube-api-access-kftdq\") pod \"nova-scheduler-0\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.436795 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.438302 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-config-data\") pod \"nova-scheduler-0\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.449074 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kftdq\" (UniqueName: \"kubernetes.io/projected/7743a75b-660a-489f-a88c-4fe0e0c793e8-kube-api-access-kftdq\") pod \"nova-scheduler-0\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.522355 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.533494 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-combined-ca-bundle\") pod \"6c056ae8-a828-4698-b855-3cb3b7c34936\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") "
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.533642 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bs47\" (UniqueName: \"kubernetes.io/projected/6c056ae8-a828-4698-b855-3cb3b7c34936-kube-api-access-9bs47\") pod \"6c056ae8-a828-4698-b855-3cb3b7c34936\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") "
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.533815 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c056ae8-a828-4698-b855-3cb3b7c34936-logs\") pod \"6c056ae8-a828-4698-b855-3cb3b7c34936\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") "
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.533958 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-config-data\") pod \"6c056ae8-a828-4698-b855-3cb3b7c34936\" (UID: \"6c056ae8-a828-4698-b855-3cb3b7c34936\") "
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.534224 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c056ae8-a828-4698-b855-3cb3b7c34936-logs" (OuterVolumeSpecName: "logs") pod "6c056ae8-a828-4698-b855-3cb3b7c34936" (UID: "6c056ae8-a828-4698-b855-3cb3b7c34936"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.534672 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c056ae8-a828-4698-b855-3cb3b7c34936-logs\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.538649 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c056ae8-a828-4698-b855-3cb3b7c34936-kube-api-access-9bs47" (OuterVolumeSpecName: "kube-api-access-9bs47") pod "6c056ae8-a828-4698-b855-3cb3b7c34936" (UID: "6c056ae8-a828-4698-b855-3cb3b7c34936"). InnerVolumeSpecName "kube-api-access-9bs47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.564302 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-config-data" (OuterVolumeSpecName: "config-data") pod "6c056ae8-a828-4698-b855-3cb3b7c34936" (UID: "6c056ae8-a828-4698-b855-3cb3b7c34936"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.566573 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c056ae8-a828-4698-b855-3cb3b7c34936" (UID: "6c056ae8-a828-4698-b855-3cb3b7c34936"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.636997 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.637039 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c056ae8-a828-4698-b855-3cb3b7c34936-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.637055 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bs47\" (UniqueName: \"kubernetes.io/projected/6c056ae8-a828-4698-b855-3cb3b7c34936-kube-api-access-9bs47\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.825985 4743 generic.go:334] "Generic (PLEG): container finished" podID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerID="e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049" exitCode=0
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.826031 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c056ae8-a828-4698-b855-3cb3b7c34936","Type":"ContainerDied","Data":"e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049"}
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.826057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6c056ae8-a828-4698-b855-3cb3b7c34936","Type":"ContainerDied","Data":"390ff8b981a24f64284f874bb50ec1d1f1b8735dfdc72d6a321024e550128341"}
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.826074 4743 scope.go:117] "RemoveContainer" containerID="e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.826975 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.867318 4743 scope.go:117] "RemoveContainer" containerID="e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.871731 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.884060 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.894058 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:54 crc kubenswrapper[4743]: E1122 09:55:54.894530 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-metadata"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.894549 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-metadata"
Nov 22 09:55:54 crc kubenswrapper[4743]: E1122 09:55:54.894573 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-log"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.894601 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-log"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.894838 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-log"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.894860 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" containerName="nova-metadata-metadata"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.897989 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.900690 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.914141 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.940216 4743 scope.go:117] "RemoveContainer" containerID="e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049"
Nov 22 09:55:54 crc kubenswrapper[4743]: E1122 09:55:54.940914 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049\": container with ID starting with e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049 not found: ID does not exist" containerID="e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.940964 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049"} err="failed to get container status \"e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049\": rpc error: code = NotFound desc = could not find container \"e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049\": container with ID starting with e7df6c16c8798a11705b8bd2ef09e403377e27bd00c04b23490d3f700f162049 not found: ID does not exist"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.940995 4743 scope.go:117] "RemoveContainer" containerID="e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075"
Nov 22 09:55:54 crc kubenswrapper[4743]: E1122 09:55:54.941505 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075\": container with ID starting with e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075 not found: ID does not exist" containerID="e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.941531 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075"} err="failed to get container status \"e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075\": rpc error: code = NotFound desc = could not find container \"e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075\": container with ID starting with e0f29fb1aa3c5d6dad58799bc223c541b9174399a5f8fc557c2fc7b319a61075 not found: ID does not exist"
Nov 22 09:55:54 crc kubenswrapper[4743]: I1122 09:55:54.965155 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Nov 22 09:55:54 crc kubenswrapper[4743]: W1122 09:55:54.968594 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7743a75b_660a_489f_a88c_4fe0e0c793e8.slice/crio-1fa56b120367daefeacd690c389b2401e5cc3bf96f08d41be555e92749b35770 WatchSource:0}: Error finding container 1fa56b120367daefeacd690c389b2401e5cc3bf96f08d41be555e92749b35770: Status 404 returned error can't find the container with id 1fa56b120367daefeacd690c389b2401e5cc3bf96f08d41be555e92749b35770
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.044216 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-config-data\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.044296 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05f87f5-3725-48c9-b295-dfc1dd42d40e-logs\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.044334 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.044514 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92k8d\" (UniqueName: \"kubernetes.io/projected/d05f87f5-3725-48c9-b295-dfc1dd42d40e-kube-api-access-92k8d\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.147025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-config-data\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.147092 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05f87f5-3725-48c9-b295-dfc1dd42d40e-logs\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.147150 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.147724 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05f87f5-3725-48c9-b295-dfc1dd42d40e-logs\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.149620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92k8d\" (UniqueName: \"kubernetes.io/projected/d05f87f5-3725-48c9-b295-dfc1dd42d40e-kube-api-access-92k8d\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.151127 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-config-data\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.151803 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.179344 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92k8d\" (UniqueName: \"kubernetes.io/projected/d05f87f5-3725-48c9-b295-dfc1dd42d40e-kube-api-access-92k8d\") pod \"nova-metadata-0\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.181878 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b908124-d25c-461a-994b-04fd8742c0f7" path="/var/lib/kubelet/pods/2b908124-d25c-461a-994b-04fd8742c0f7/volumes"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.183302 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c056ae8-a828-4698-b855-3cb3b7c34936" path="/var/lib/kubelet/pods/6c056ae8-a828-4698-b855-3cb3b7c34936/volumes"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.213150 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.750080 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.776520 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 09:55:55 crc kubenswrapper[4743]: W1122 09:55:55.785277 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd05f87f5_3725_48c9_b295_dfc1dd42d40e.slice/crio-f9a145aa44f50482354ed76b70e02f3e379e744d52a5eb7456c431356507080f WatchSource:0}: Error finding container f9a145aa44f50482354ed76b70e02f3e379e744d52a5eb7456c431356507080f: Status 404 returned error can't find the container with id f9a145aa44f50482354ed76b70e02f3e379e744d52a5eb7456c431356507080f
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.840718 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7743a75b-660a-489f-a88c-4fe0e0c793e8","Type":"ContainerStarted","Data":"f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089"}
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.840770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7743a75b-660a-489f-a88c-4fe0e0c793e8","Type":"ContainerStarted","Data":"1fa56b120367daefeacd690c389b2401e5cc3bf96f08d41be555e92749b35770"}
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.847637 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d05f87f5-3725-48c9-b295-dfc1dd42d40e","Type":"ContainerStarted","Data":"f9a145aa44f50482354ed76b70e02f3e379e744d52a5eb7456c431356507080f"}
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.850939 4743 generic.go:334] "Generic (PLEG): container finished" podID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerID="114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6" exitCode=0
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.850971 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e02b392d-4d2e-44a8-a746-c108ddd5c289","Type":"ContainerDied","Data":"114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6"}
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.850986 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e02b392d-4d2e-44a8-a746-c108ddd5c289","Type":"ContainerDied","Data":"9614281566e3e24402eeb2243f33c97f39c9de2f8da349ac59ac0680d778422a"}
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.851000 4743 scope.go:117] "RemoveContainer" containerID="114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.851145 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.864048 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-config-data\") pod \"e02b392d-4d2e-44a8-a746-c108ddd5c289\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") "
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.864528 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-combined-ca-bundle\") pod \"e02b392d-4d2e-44a8-a746-c108ddd5c289\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") "
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.864601 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7czl\" (UniqueName: \"kubernetes.io/projected/e02b392d-4d2e-44a8-a746-c108ddd5c289-kube-api-access-n7czl\") pod \"e02b392d-4d2e-44a8-a746-c108ddd5c289\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") "
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.864860 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e02b392d-4d2e-44a8-a746-c108ddd5c289-logs\") pod \"e02b392d-4d2e-44a8-a746-c108ddd5c289\" (UID: \"e02b392d-4d2e-44a8-a746-c108ddd5c289\") "
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.866303 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e02b392d-4d2e-44a8-a746-c108ddd5c289-logs" (OuterVolumeSpecName: "logs") pod "e02b392d-4d2e-44a8-a746-c108ddd5c289" (UID: "e02b392d-4d2e-44a8-a746-c108ddd5c289"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.869295 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.86927752 podStartE2EDuration="1.86927752s" podCreationTimestamp="2025-11-22 09:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:55.859133648 +0000 UTC m=+5629.565494700" watchObservedRunningTime="2025-11-22 09:55:55.86927752 +0000 UTC m=+5629.575638572"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.878771 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e02b392d-4d2e-44a8-a746-c108ddd5c289-kube-api-access-n7czl" (OuterVolumeSpecName: "kube-api-access-n7czl") pod "e02b392d-4d2e-44a8-a746-c108ddd5c289" (UID: "e02b392d-4d2e-44a8-a746-c108ddd5c289"). InnerVolumeSpecName "kube-api-access-n7czl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.883714 4743 scope.go:117] "RemoveContainer" containerID="909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.904462 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-config-data" (OuterVolumeSpecName: "config-data") pod "e02b392d-4d2e-44a8-a746-c108ddd5c289" (UID: "e02b392d-4d2e-44a8-a746-c108ddd5c289"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.906811 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e02b392d-4d2e-44a8-a746-c108ddd5c289" (UID: "e02b392d-4d2e-44a8-a746-c108ddd5c289"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.929748 4743 scope.go:117] "RemoveContainer" containerID="114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6"
Nov 22 09:55:55 crc kubenswrapper[4743]: E1122 09:55:55.930432 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6\": container with ID starting with 114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6 not found: ID does not exist" containerID="114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.930477 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6"} err="failed to get container status \"114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6\": rpc error: code = NotFound desc = could not find container \"114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6\": container with ID starting with 114976c2a3999d763c7cd3c2dc468ed61b4076dd1995d8a505a95689fbaf79a6 not found: ID does not exist"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.930512 4743 scope.go:117] "RemoveContainer" containerID="909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8"
Nov 22 09:55:55 crc kubenswrapper[4743]: E1122 09:55:55.930800 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8\": container with ID starting with 909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8 not found: ID does not exist" containerID="909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.930826 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8"} err="failed to get container status \"909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8\": rpc error: code = NotFound desc = could not find container \"909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8\": container with ID starting with 909ad31396fe18ce5a4c6c66ba4918a634b9681d7c78d63304e416e92b6a3ff8 not found: ID does not exist"
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.970106 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.970153 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e02b392d-4d2e-44a8-a746-c108ddd5c289-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.970168 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7czl\" (UniqueName: \"kubernetes.io/projected/e02b392d-4d2e-44a8-a746-c108ddd5c289-kube-api-access-n7czl\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:55 crc kubenswrapper[4743]: I1122 09:55:55.970179 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e02b392d-4d2e-44a8-a746-c108ddd5c289-logs\") on node \"crc\" DevicePath \"\""
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.189664 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.205215 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.215565 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:56 crc kubenswrapper[4743]: E1122 09:55:56.216215 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerName="nova-api-api"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.216280 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerName="nova-api-api"
Nov 22 09:55:56 crc kubenswrapper[4743]: E1122 09:55:56.216342 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerName="nova-api-log"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.216415 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerName="nova-api-log"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.216656 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerName="nova-api-api"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.216755 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" containerName="nova-api-log"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.218075 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.225421 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.227328 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.375727 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-logs\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.375842 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-config-data\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.375879 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.375919 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvg6g\" (UniqueName: \"kubernetes.io/projected/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-kube-api-access-wvg6g\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.477834 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-logs\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.478327 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-config-data\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.478358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.478391 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvg6g\" (UniqueName: \"kubernetes.io/projected/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-kube-api-access-wvg6g\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.478246 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-logs\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.481887 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.486324 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-config-data\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.496789 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvg6g\" (UniqueName: \"kubernetes.io/projected/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-kube-api-access-wvg6g\") pod \"nova-api-0\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.560030 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.860834 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d05f87f5-3725-48c9-b295-dfc1dd42d40e","Type":"ContainerStarted","Data":"b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71"}
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.860989 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d05f87f5-3725-48c9-b295-dfc1dd42d40e","Type":"ContainerStarted","Data":"bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c"}
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.874657 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 09:55:56 crc kubenswrapper[4743]: W1122 09:55:56.878359 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51355ddf_602c_4f1b_b5e7_ae859e1f1dbb.slice/crio-30119c1871eb582d544a898d8ba52581bef84a2c28bb95c7af129766c126e63f WatchSource:0}: Error finding container 30119c1871eb582d544a898d8ba52581bef84a2c28bb95c7af129766c126e63f: Status 404 returned error can't find the container with id 30119c1871eb582d544a898d8ba52581bef84a2c28bb95c7af129766c126e63f
Nov 22 09:55:56 crc kubenswrapper[4743]: I1122 09:55:56.888752 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.88873517 podStartE2EDuration="2.88873517s" podCreationTimestamp="2025-11-22 09:55:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:56.885474346 +0000 UTC m=+5630.591835398" watchObservedRunningTime="2025-11-22 09:55:56.88873517 +0000 UTC m=+5630.595096222"
Nov 22 09:55:57 crc kubenswrapper[4743]: I1122 09:55:57.166809 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e02b392d-4d2e-44a8-a746-c108ddd5c289" path="/var/lib/kubelet/pods/e02b392d-4d2e-44a8-a746-c108ddd5c289/volumes"
Nov 22 09:55:57 crc kubenswrapper[4743]: I1122 09:55:57.885817 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb","Type":"ContainerStarted","Data":"da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d"}
Nov 22 09:55:57 crc kubenswrapper[4743]: I1122 09:55:57.886141 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb","Type":"ContainerStarted","Data":"e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17"}
Nov 22 09:55:57 crc kubenswrapper[4743]: I1122 09:55:57.886154 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb","Type":"ContainerStarted","Data":"30119c1871eb582d544a898d8ba52581bef84a2c28bb95c7af129766c126e63f"}
Nov 22 09:55:57 crc kubenswrapper[4743]: I1122 09:55:57.909477 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.909458536 podStartE2EDuration="1.909458536s" podCreationTimestamp="2025-11-22 09:55:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:55:57.902455135 +0000 UTC m=+5631.608816187" watchObservedRunningTime="2025-11-22 09:55:57.909458536 +0000 UTC m=+5631.615819588"
Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.245121 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.308494 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-r9kkr"]
Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.309949 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" podUID="8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" containerName="dnsmasq-dns" containerID="cri-o://c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777" gracePeriod=10
Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.806818 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr"
Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.894710 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" containerID="c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777" exitCode=0
Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.894867 4743 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.895450 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" event={"ID":"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8","Type":"ContainerDied","Data":"c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777"} Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.895495 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-696f9966c7-r9kkr" event={"ID":"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8","Type":"ContainerDied","Data":"d119379142b5b7413d5f17a98ff245e6c9435fd1bc5b917337d957897f318119"} Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.895514 4743 scope.go:117] "RemoveContainer" containerID="c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.914938 4743 scope.go:117] "RemoveContainer" containerID="48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.924454 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88st7\" (UniqueName: \"kubernetes.io/projected/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-kube-api-access-88st7\") pod \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.924618 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-config\") pod \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.924671 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-sb\") pod \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.924699 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-nb\") pod \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.924787 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-dns-svc\") pod \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\" (UID: \"8b9591ad-2941-43f5-9cd3-f6dafa64ffb8\") " Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.930932 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-kube-api-access-88st7" (OuterVolumeSpecName: "kube-api-access-88st7") pod "8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" (UID: "8b9591ad-2941-43f5-9cd3-f6dafa64ffb8"). InnerVolumeSpecName "kube-api-access-88st7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.934902 4743 scope.go:117] "RemoveContainer" containerID="c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777" Nov 22 09:55:58 crc kubenswrapper[4743]: E1122 09:55:58.938622 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777\": container with ID starting with c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777 not found: ID does not exist" containerID="c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.938685 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777"} err="failed to get container status \"c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777\": rpc error: code = NotFound desc = could not find container \"c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777\": container with ID starting with c74d487d83c7acdf6ef63bcc01bb4089ddb0b4a205ebb62d322c308d49790777 not found: ID does not exist" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.938712 4743 scope.go:117] "RemoveContainer" containerID="48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4" Nov 22 09:55:58 crc kubenswrapper[4743]: E1122 09:55:58.939069 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4\": container with ID starting with 48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4 not found: ID does not exist" containerID="48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.939109 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4"} err="failed to get container status \"48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4\": rpc error: code = NotFound desc = could not find container \"48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4\": container with ID starting with 48b99041c8b0531892bfc4362d3e79d1beb877fd989c14dbfc20efd4d772a3d4 not found: ID does not exist" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.968708 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" (UID: "8b9591ad-2941-43f5-9cd3-f6dafa64ffb8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.973058 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-config" (OuterVolumeSpecName: "config") pod "8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" (UID: "8b9591ad-2941-43f5-9cd3-f6dafa64ffb8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.973080 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" (UID: "8b9591ad-2941-43f5-9cd3-f6dafa64ffb8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:55:58 crc kubenswrapper[4743]: I1122 09:55:58.975311 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" (UID: "8b9591ad-2941-43f5-9cd3-f6dafa64ffb8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:55:59 crc kubenswrapper[4743]: I1122 09:55:59.026407 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:59 crc kubenswrapper[4743]: I1122 09:55:59.026447 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88st7\" (UniqueName: \"kubernetes.io/projected/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-kube-api-access-88st7\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:59 crc kubenswrapper[4743]: I1122 09:55:59.026459 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:59 crc kubenswrapper[4743]: I1122 09:55:59.026556 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:59 crc kubenswrapper[4743]: I1122 09:55:59.026628 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:55:59 crc kubenswrapper[4743]: I1122 09:55:59.221682 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-r9kkr"] Nov 22 09:55:59 crc kubenswrapper[4743]: I1122 09:55:59.231230 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-696f9966c7-r9kkr"] Nov 22 09:55:59 crc kubenswrapper[4743]: I1122 09:55:59.523424 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 09:56:00 crc kubenswrapper[4743]: I1122 09:56:00.214957 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:56:00 crc kubenswrapper[4743]: I1122 09:56:00.215763 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:56:01 crc kubenswrapper[4743]: I1122 09:56:01.164794 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" path="/var/lib/kubelet/pods/8b9591ad-2941-43f5-9cd3-f6dafa64ffb8/volumes" Nov 22 09:56:04 crc kubenswrapper[4743]: I1122 09:56:04.523276 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 09:56:04 crc kubenswrapper[4743]: I1122 09:56:04.552228 4743 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 09:56:04 crc kubenswrapper[4743]: I1122 09:56:04.974205 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 09:56:05 crc kubenswrapper[4743]: I1122 09:56:05.214808 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 09:56:05 crc kubenswrapper[4743]: I1122 09:56:05.215108 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 09:56:06 crc kubenswrapper[4743]: I1122 09:56:06.298823 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:56:06 crc kubenswrapper[4743]: I1122 09:56:06.298833 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:56:06 crc kubenswrapper[4743]: I1122 09:56:06.561281 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:56:06 crc kubenswrapper[4743]: I1122 09:56:06.561401 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:56:07 crc kubenswrapper[4743]: I1122 09:56:07.645802 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:56:07 crc kubenswrapper[4743]: I1122 09:56:07.645822 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:56:15 crc kubenswrapper[4743]: I1122 09:56:15.217565 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 09:56:15 crc kubenswrapper[4743]: I1122 09:56:15.218506 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 09:56:15 crc kubenswrapper[4743]: I1122 09:56:15.220761 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 09:56:15 crc kubenswrapper[4743]: I1122 09:56:15.221113 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 09:56:16 crc kubenswrapper[4743]: I1122 09:56:16.565906 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 09:56:16 crc kubenswrapper[4743]: I1122 09:56:16.566044 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 09:56:16 crc kubenswrapper[4743]: I1122 09:56:16.566685 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Nov 22 09:56:16 crc kubenswrapper[4743]: I1122 09:56:16.566787 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 09:56:16 crc kubenswrapper[4743]: I1122 09:56:16.572034 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 09:56:16 crc kubenswrapper[4743]: I1122 09:56:16.572104 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.688522 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-9dn66"] Nov 22 09:56:26 crc kubenswrapper[4743]: E1122 09:56:26.689549 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" containerName="dnsmasq-dns" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.689566 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" containerName="dnsmasq-dns" Nov 22 09:56:26 crc kubenswrapper[4743]: E1122 09:56:26.689612 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" containerName="init" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.689620 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" containerName="init" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.689885 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9591ad-2941-43f5-9cd3-f6dafa64ffb8" containerName="dnsmasq-dns" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.690681 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9dn66" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.695627 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9dn66"] Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.767951 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4a2ede5-5abd-4899-a65b-45c0eac279c4-operator-scripts\") pod \"cinder-db-create-9dn66\" (UID: \"d4a2ede5-5abd-4899-a65b-45c0eac279c4\") " pod="openstack/cinder-db-create-9dn66" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.768178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmdg7\" (UniqueName: \"kubernetes.io/projected/d4a2ede5-5abd-4899-a65b-45c0eac279c4-kube-api-access-zmdg7\") pod \"cinder-db-create-9dn66\" (UID: \"d4a2ede5-5abd-4899-a65b-45c0eac279c4\") " pod="openstack/cinder-db-create-9dn66" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.793634 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f58b-account-create-gt4nm"] Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.795333 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f58b-account-create-gt4nm" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.797844 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.814790 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f58b-account-create-gt4nm"] Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.870125 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f87462c-1792-4b68-82f0-3bbcb314686d-operator-scripts\") pod \"cinder-f58b-account-create-gt4nm\" (UID: \"8f87462c-1792-4b68-82f0-3bbcb314686d\") " pod="openstack/cinder-f58b-account-create-gt4nm" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.870619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4a2ede5-5abd-4899-a65b-45c0eac279c4-operator-scripts\") pod \"cinder-db-create-9dn66\" (UID: \"d4a2ede5-5abd-4899-a65b-45c0eac279c4\") " pod="openstack/cinder-db-create-9dn66" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.870757 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9r7g\" (UniqueName: \"kubernetes.io/projected/8f87462c-1792-4b68-82f0-3bbcb314686d-kube-api-access-n9r7g\") pod \"cinder-f58b-account-create-gt4nm\" (UID: \"8f87462c-1792-4b68-82f0-3bbcb314686d\") " pod="openstack/cinder-f58b-account-create-gt4nm" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.870864 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmdg7\" (UniqueName: \"kubernetes.io/projected/d4a2ede5-5abd-4899-a65b-45c0eac279c4-kube-api-access-zmdg7\") pod \"cinder-db-create-9dn66\" (UID: \"d4a2ede5-5abd-4899-a65b-45c0eac279c4\") " pod="openstack/cinder-db-create-9dn66" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.871781 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4a2ede5-5abd-4899-a65b-45c0eac279c4-operator-scripts\") pod \"cinder-db-create-9dn66\" (UID: \"d4a2ede5-5abd-4899-a65b-45c0eac279c4\") " pod="openstack/cinder-db-create-9dn66" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.890290 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmdg7\" (UniqueName: \"kubernetes.io/projected/d4a2ede5-5abd-4899-a65b-45c0eac279c4-kube-api-access-zmdg7\") pod \"cinder-db-create-9dn66\" (UID: \"d4a2ede5-5abd-4899-a65b-45c0eac279c4\") " pod="openstack/cinder-db-create-9dn66" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.972942 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9r7g\" (UniqueName: \"kubernetes.io/projected/8f87462c-1792-4b68-82f0-3bbcb314686d-kube-api-access-n9r7g\") pod \"cinder-f58b-account-create-gt4nm\" (UID: \"8f87462c-1792-4b68-82f0-3bbcb314686d\") " pod="openstack/cinder-f58b-account-create-gt4nm" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.973311 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f87462c-1792-4b68-82f0-3bbcb314686d-operator-scripts\") pod \"cinder-f58b-account-create-gt4nm\" (UID: \"8f87462c-1792-4b68-82f0-3bbcb314686d\") " 
pod="openstack/cinder-f58b-account-create-gt4nm" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.974449 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f87462c-1792-4b68-82f0-3bbcb314686d-operator-scripts\") pod \"cinder-f58b-account-create-gt4nm\" (UID: \"8f87462c-1792-4b68-82f0-3bbcb314686d\") " pod="openstack/cinder-f58b-account-create-gt4nm" Nov 22 09:56:26 crc kubenswrapper[4743]: I1122 09:56:26.994055 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9r7g\" (UniqueName: \"kubernetes.io/projected/8f87462c-1792-4b68-82f0-3bbcb314686d-kube-api-access-n9r7g\") pod \"cinder-f58b-account-create-gt4nm\" (UID: \"8f87462c-1792-4b68-82f0-3bbcb314686d\") " pod="openstack/cinder-f58b-account-create-gt4nm" Nov 22 09:56:27 crc kubenswrapper[4743]: I1122 09:56:27.009134 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-9dn66" Nov 22 09:56:27 crc kubenswrapper[4743]: I1122 09:56:27.114046 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f58b-account-create-gt4nm" Nov 22 09:56:27 crc kubenswrapper[4743]: I1122 09:56:27.633116 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-9dn66"] Nov 22 09:56:27 crc kubenswrapper[4743]: I1122 09:56:27.722604 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f58b-account-create-gt4nm"] Nov 22 09:56:27 crc kubenswrapper[4743]: W1122 09:56:27.733435 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f87462c_1792_4b68_82f0_3bbcb314686d.slice/crio-3fb9d582046a6863a7682aca2a60c56e5ee890e8c3a65d798ab08fde31aa801c WatchSource:0}: Error finding container 3fb9d582046a6863a7682aca2a60c56e5ee890e8c3a65d798ab08fde31aa801c: Status 404 returned error can't find the container with id 3fb9d582046a6863a7682aca2a60c56e5ee890e8c3a65d798ab08fde31aa801c Nov 22 09:56:28 crc kubenswrapper[4743]: I1122 09:56:28.265643 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f58b-account-create-gt4nm" event={"ID":"8f87462c-1792-4b68-82f0-3bbcb314686d","Type":"ContainerStarted","Data":"50556b3083165646da7ba173acd2dd8a9feb2e072e3ec9c4233ae40e94b54641"} Nov 22 09:56:28 crc kubenswrapper[4743]: I1122 09:56:28.266116 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f58b-account-create-gt4nm" event={"ID":"8f87462c-1792-4b68-82f0-3bbcb314686d","Type":"ContainerStarted","Data":"3fb9d582046a6863a7682aca2a60c56e5ee890e8c3a65d798ab08fde31aa801c"} Nov 22 09:56:28 crc kubenswrapper[4743]: I1122 09:56:28.267342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9dn66" event={"ID":"d4a2ede5-5abd-4899-a65b-45c0eac279c4","Type":"ContainerStarted","Data":"a180f17750eabcbf6c96aaf8eb5febb2fca99436fec7bf3fc6aef77fce152b8f"} Nov 22 09:56:28 crc kubenswrapper[4743]: I1122 09:56:28.267387 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9dn66" event={"ID":"d4a2ede5-5abd-4899-a65b-45c0eac279c4","Type":"ContainerStarted","Data":"c9f5447a90b73fd809cd9c6212bac33d4c505db6823c3d7761b7853131d6a0d2"} Nov 22 09:56:28 crc kubenswrapper[4743]: I1122 09:56:28.289460 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-f58b-account-create-gt4nm" podStartSLOduration=2.289439911 
podStartE2EDuration="2.289439911s" podCreationTimestamp="2025-11-22 09:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:56:28.279770954 +0000 UTC m=+5661.986132006" watchObservedRunningTime="2025-11-22 09:56:28.289439911 +0000 UTC m=+5661.995800973" Nov 22 09:56:28 crc kubenswrapper[4743]: I1122 09:56:28.301454 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-9dn66" podStartSLOduration=2.301430196 podStartE2EDuration="2.301430196s" podCreationTimestamp="2025-11-22 09:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:56:28.294535318 +0000 UTC m=+5662.000896380" watchObservedRunningTime="2025-11-22 09:56:28.301430196 +0000 UTC m=+5662.007791268" Nov 22 09:56:29 crc kubenswrapper[4743]: I1122 09:56:29.277607 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4a2ede5-5abd-4899-a65b-45c0eac279c4" containerID="a180f17750eabcbf6c96aaf8eb5febb2fca99436fec7bf3fc6aef77fce152b8f" exitCode=0 Nov 22 09:56:29 crc kubenswrapper[4743]: I1122 09:56:29.278955 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9dn66" event={"ID":"d4a2ede5-5abd-4899-a65b-45c0eac279c4","Type":"ContainerDied","Data":"a180f17750eabcbf6c96aaf8eb5febb2fca99436fec7bf3fc6aef77fce152b8f"} Nov 22 09:56:29 crc kubenswrapper[4743]: I1122 09:56:29.279478 4743 generic.go:334] "Generic (PLEG): container finished" podID="8f87462c-1792-4b68-82f0-3bbcb314686d" containerID="50556b3083165646da7ba173acd2dd8a9feb2e072e3ec9c4233ae40e94b54641" exitCode=0 Nov 22 09:56:29 crc kubenswrapper[4743]: I1122 09:56:29.279523 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f58b-account-create-gt4nm" event={"ID":"8f87462c-1792-4b68-82f0-3bbcb314686d","Type":"ContainerDied","Data":"50556b3083165646da7ba173acd2dd8a9feb2e072e3ec9c4233ae40e94b54641"} Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.707118 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f58b-account-create-gt4nm" Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.713933 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-9dn66" Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.736222 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmdg7\" (UniqueName: \"kubernetes.io/projected/d4a2ede5-5abd-4899-a65b-45c0eac279c4-kube-api-access-zmdg7\") pod \"d4a2ede5-5abd-4899-a65b-45c0eac279c4\" (UID: \"d4a2ede5-5abd-4899-a65b-45c0eac279c4\") " Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.736336 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f87462c-1792-4b68-82f0-3bbcb314686d-operator-scripts\") pod \"8f87462c-1792-4b68-82f0-3bbcb314686d\" (UID: \"8f87462c-1792-4b68-82f0-3bbcb314686d\") " Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.736546 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9r7g\" (UniqueName: \"kubernetes.io/projected/8f87462c-1792-4b68-82f0-3bbcb314686d-kube-api-access-n9r7g\") pod \"8f87462c-1792-4b68-82f0-3bbcb314686d\" (UID: \"8f87462c-1792-4b68-82f0-3bbcb314686d\") " Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.736682 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4a2ede5-5abd-4899-a65b-45c0eac279c4-operator-scripts\") pod \"d4a2ede5-5abd-4899-a65b-45c0eac279c4\" (UID: \"d4a2ede5-5abd-4899-a65b-45c0eac279c4\") " Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.737148 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f87462c-1792-4b68-82f0-3bbcb314686d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8f87462c-1792-4b68-82f0-3bbcb314686d" (UID: "8f87462c-1792-4b68-82f0-3bbcb314686d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.737141 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4a2ede5-5abd-4899-a65b-45c0eac279c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4a2ede5-5abd-4899-a65b-45c0eac279c4" (UID: "d4a2ede5-5abd-4899-a65b-45c0eac279c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.742766 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f87462c-1792-4b68-82f0-3bbcb314686d-kube-api-access-n9r7g" (OuterVolumeSpecName: "kube-api-access-n9r7g") pod "8f87462c-1792-4b68-82f0-3bbcb314686d" (UID: "8f87462c-1792-4b68-82f0-3bbcb314686d"). InnerVolumeSpecName "kube-api-access-n9r7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.742856 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4a2ede5-5abd-4899-a65b-45c0eac279c4-kube-api-access-zmdg7" (OuterVolumeSpecName: "kube-api-access-zmdg7") pod "d4a2ede5-5abd-4899-a65b-45c0eac279c4" (UID: "d4a2ede5-5abd-4899-a65b-45c0eac279c4"). InnerVolumeSpecName "kube-api-access-zmdg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.839464 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8f87462c-1792-4b68-82f0-3bbcb314686d-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.839508 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9r7g\" (UniqueName: \"kubernetes.io/projected/8f87462c-1792-4b68-82f0-3bbcb314686d-kube-api-access-n9r7g\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.839527 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4a2ede5-5abd-4899-a65b-45c0eac279c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:30 crc kubenswrapper[4743]: I1122 09:56:30.839537 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmdg7\" (UniqueName: \"kubernetes.io/projected/d4a2ede5-5abd-4899-a65b-45c0eac279c4-kube-api-access-zmdg7\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:31 crc kubenswrapper[4743]: I1122 09:56:31.297942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f58b-account-create-gt4nm" event={"ID":"8f87462c-1792-4b68-82f0-3bbcb314686d","Type":"ContainerDied","Data":"3fb9d582046a6863a7682aca2a60c56e5ee890e8c3a65d798ab08fde31aa801c"} Nov 22 09:56:31 crc kubenswrapper[4743]: I1122 09:56:31.297976 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fb9d582046a6863a7682aca2a60c56e5ee890e8c3a65d798ab08fde31aa801c" Nov 22 09:56:31 crc kubenswrapper[4743]: I1122 09:56:31.297991 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f58b-account-create-gt4nm" Nov 22 09:56:31 crc kubenswrapper[4743]: I1122 09:56:31.304436 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-9dn66" event={"ID":"d4a2ede5-5abd-4899-a65b-45c0eac279c4","Type":"ContainerDied","Data":"c9f5447a90b73fd809cd9c6212bac33d4c505db6823c3d7761b7853131d6a0d2"} Nov 22 09:56:31 crc kubenswrapper[4743]: I1122 09:56:31.304538 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9f5447a90b73fd809cd9c6212bac33d4c505db6823c3d7761b7853131d6a0d2" Nov 22 09:56:31 crc kubenswrapper[4743]: I1122 09:56:31.304645 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-9dn66" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.017034 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fgqpl"] Nov 22 09:56:32 crc kubenswrapper[4743]: E1122 09:56:32.017747 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4a2ede5-5abd-4899-a65b-45c0eac279c4" containerName="mariadb-database-create" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.017761 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4a2ede5-5abd-4899-a65b-45c0eac279c4" containerName="mariadb-database-create" Nov 22 09:56:32 crc kubenswrapper[4743]: E1122 09:56:32.017784 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f87462c-1792-4b68-82f0-3bbcb314686d" containerName="mariadb-account-create" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.017790 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f87462c-1792-4b68-82f0-3bbcb314686d" containerName="mariadb-account-create" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.017985 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f87462c-1792-4b68-82f0-3bbcb314686d" containerName="mariadb-account-create" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.018004 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4a2ede5-5abd-4899-a65b-45c0eac279c4" containerName="mariadb-database-create" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.018624 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.026054 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7ppmg" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.026176 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.026366 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.031557 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fgqpl"] Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.058163 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-config-data\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.058239 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-db-sync-config-data\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.058275 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-scripts\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.058291 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61988351-c22b-4c17-ad07-e5fdfd3edea0-etc-machine-id\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.058334 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-combined-ca-bundle\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.058376 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktf4s\" (UniqueName: \"kubernetes.io/projected/61988351-c22b-4c17-ad07-e5fdfd3edea0-kube-api-access-ktf4s\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.168069 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-combined-ca-bundle\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.168512 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktf4s\" (UniqueName: \"kubernetes.io/projected/61988351-c22b-4c17-ad07-e5fdfd3edea0-kube-api-access-ktf4s\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.168902 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-config-data\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.168999 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-db-sync-config-data\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.169049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-scripts\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.169075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61988351-c22b-4c17-ad07-e5fdfd3edea0-etc-machine-id\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.169180 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/61988351-c22b-4c17-ad07-e5fdfd3edea0-etc-machine-id\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.173961 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-db-sync-config-data\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.174132 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-config-data\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.174573 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-combined-ca-bundle\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.184873 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-scripts\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.189135 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktf4s\" (UniqueName: \"kubernetes.io/projected/61988351-c22b-4c17-ad07-e5fdfd3edea0-kube-api-access-ktf4s\") pod \"cinder-db-sync-fgqpl\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.342791 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:32 crc kubenswrapper[4743]: I1122 09:56:32.795530 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fgqpl"] Nov 22 09:56:32 crc kubenswrapper[4743]: W1122 09:56:32.802961 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61988351_c22b_4c17_ad07_e5fdfd3edea0.slice/crio-0df9e0caa63118c603aae1a9de7f8c8b572b4a4001533cf1499875e4a2834ef9 WatchSource:0}: Error finding container 0df9e0caa63118c603aae1a9de7f8c8b572b4a4001533cf1499875e4a2834ef9: Status 404 returned error can't find the container with id 0df9e0caa63118c603aae1a9de7f8c8b572b4a4001533cf1499875e4a2834ef9 Nov 22 09:56:33 crc kubenswrapper[4743]: I1122 09:56:33.325442 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgqpl" event={"ID":"61988351-c22b-4c17-ad07-e5fdfd3edea0","Type":"ContainerStarted","Data":"0df9e0caa63118c603aae1a9de7f8c8b572b4a4001533cf1499875e4a2834ef9"} Nov 22 09:56:34 crc kubenswrapper[4743]: I1122 09:56:34.340180 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgqpl" event={"ID":"61988351-c22b-4c17-ad07-e5fdfd3edea0","Type":"ContainerStarted","Data":"0fe17c12f93974794d3e3628703864a4451a452538ac160835e8ff3c5b1a9673"} Nov 22 09:56:34 crc kubenswrapper[4743]: I1122 09:56:34.371192 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fgqpl" podStartSLOduration=3.371169107 podStartE2EDuration="3.371169107s" podCreationTimestamp="2025-11-22 09:56:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:56:34.358901274 +0000 UTC m=+5668.065262366" watchObservedRunningTime="2025-11-22 09:56:34.371169107 +0000 UTC m=+5668.077530169" Nov 22 09:56:41 crc kubenswrapper[4743]: I1122 09:56:41.417949 4743 generic.go:334] "Generic (PLEG): container finished" podID="61988351-c22b-4c17-ad07-e5fdfd3edea0" containerID="0fe17c12f93974794d3e3628703864a4451a452538ac160835e8ff3c5b1a9673" exitCode=0 Nov 22 09:56:41 crc kubenswrapper[4743]: I1122 09:56:41.418176 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgqpl" event={"ID":"61988351-c22b-4c17-ad07-e5fdfd3edea0","Type":"ContainerDied","Data":"0fe17c12f93974794d3e3628703864a4451a452538ac160835e8ff3c5b1a9673"} Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.855262 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.969448 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-combined-ca-bundle\") pod \"61988351-c22b-4c17-ad07-e5fdfd3edea0\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.970038 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktf4s\" (UniqueName: \"kubernetes.io/projected/61988351-c22b-4c17-ad07-e5fdfd3edea0-kube-api-access-ktf4s\") pod \"61988351-c22b-4c17-ad07-e5fdfd3edea0\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.970198 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61988351-c22b-4c17-ad07-e5fdfd3edea0-etc-machine-id\") pod \"61988351-c22b-4c17-ad07-e5fdfd3edea0\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.970322 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-db-sync-config-data\") pod \"61988351-c22b-4c17-ad07-e5fdfd3edea0\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.970441 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-config-data\") pod \"61988351-c22b-4c17-ad07-e5fdfd3edea0\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.970250 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61988351-c22b-4c17-ad07-e5fdfd3edea0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "61988351-c22b-4c17-ad07-e5fdfd3edea0" (UID: "61988351-c22b-4c17-ad07-e5fdfd3edea0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.970608 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-scripts\") pod \"61988351-c22b-4c17-ad07-e5fdfd3edea0\" (UID: \"61988351-c22b-4c17-ad07-e5fdfd3edea0\") " Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.971451 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/61988351-c22b-4c17-ad07-e5fdfd3edea0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.975068 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-scripts" (OuterVolumeSpecName: "scripts") pod "61988351-c22b-4c17-ad07-e5fdfd3edea0" (UID: "61988351-c22b-4c17-ad07-e5fdfd3edea0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.975144 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61988351-c22b-4c17-ad07-e5fdfd3edea0-kube-api-access-ktf4s" (OuterVolumeSpecName: "kube-api-access-ktf4s") pod "61988351-c22b-4c17-ad07-e5fdfd3edea0" (UID: "61988351-c22b-4c17-ad07-e5fdfd3edea0"). InnerVolumeSpecName "kube-api-access-ktf4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.976088 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "61988351-c22b-4c17-ad07-e5fdfd3edea0" (UID: "61988351-c22b-4c17-ad07-e5fdfd3edea0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:42 crc kubenswrapper[4743]: I1122 09:56:42.995878 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61988351-c22b-4c17-ad07-e5fdfd3edea0" (UID: "61988351-c22b-4c17-ad07-e5fdfd3edea0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.025454 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-config-data" (OuterVolumeSpecName: "config-data") pod "61988351-c22b-4c17-ad07-e5fdfd3edea0" (UID: "61988351-c22b-4c17-ad07-e5fdfd3edea0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.073642 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktf4s\" (UniqueName: \"kubernetes.io/projected/61988351-c22b-4c17-ad07-e5fdfd3edea0-kube-api-access-ktf4s\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.073847 4743 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.073944 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.074028 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.074103 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61988351-c22b-4c17-ad07-e5fdfd3edea0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.462241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fgqpl" event={"ID":"61988351-c22b-4c17-ad07-e5fdfd3edea0","Type":"ContainerDied","Data":"0df9e0caa63118c603aae1a9de7f8c8b572b4a4001533cf1499875e4a2834ef9"} Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.462294 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0df9e0caa63118c603aae1a9de7f8c8b572b4a4001533cf1499875e4a2834ef9" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.462350 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fgqpl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.800316 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-qxqsl"] Nov 22 09:56:43 crc kubenswrapper[4743]: E1122 09:56:43.801111 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61988351-c22b-4c17-ad07-e5fdfd3edea0" containerName="cinder-db-sync" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.801131 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="61988351-c22b-4c17-ad07-e5fdfd3edea0" containerName="cinder-db-sync" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.801323 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="61988351-c22b-4c17-ad07-e5fdfd3edea0" containerName="cinder-db-sync" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.803333 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.834605 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-qxqsl"] Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.896163 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-dns-svc\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.896213 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.896301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-config\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.896502 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxz7r\" (UniqueName: \"kubernetes.io/projected/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-kube-api-access-pxz7r\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.896720 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.992118 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.994225 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.996002 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.998276 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7ppmg" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.998395 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.998778 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.998831 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-config\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.998899 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxz7r\" (UniqueName: \"kubernetes.io/projected/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-kube-api-access-pxz7r\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.998978 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:43 crc kubenswrapper[4743]: I1122 09:56:43.999024 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-dns-svc\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:43.999980 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-dns-svc\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.000774 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.001131 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-config\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " 
pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.001360 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.001432 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.014120 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.032738 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxz7r\" (UniqueName: \"kubernetes.io/projected/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-kube-api-access-pxz7r\") pod \"dnsmasq-dns-58dc68b6c7-qxqsl\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.100999 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data-custom\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.101051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.101076 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-scripts\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.101160 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-logs\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.101183 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.101200 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.101242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5l8wh\" 
(UniqueName: \"kubernetes.io/projected/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-kube-api-access-5l8wh\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.143467 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.202441 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-logs\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.202798 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.202823 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.202867 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5l8wh\" (UniqueName: \"kubernetes.io/projected/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-kube-api-access-5l8wh\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.202910 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data-custom\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.202931 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.202948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-scripts\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.203217 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-logs\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.203292 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0" Nov 22 
Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.208813 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-scripts\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0"
Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.209768 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0"
Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.209879 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0"
Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.214460 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data-custom\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0"
Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.229320 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5l8wh\" (UniqueName: \"kubernetes.io/projected/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-kube-api-access-5l8wh\") pod \"cinder-api-0\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " pod="openstack/cinder-api-0"
Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.323874 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.737251 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-qxqsl"] Nov 22 09:56:44 crc kubenswrapper[4743]: I1122 09:56:44.874587 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:56:45 crc kubenswrapper[4743]: I1122 09:56:45.507668 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca","Type":"ContainerStarted","Data":"3bb2a9cc6ecb6da377e05296bfc4d27bf6f8b1fe246bc50f95517e7aa4bef2a1"} Nov 22 09:56:45 crc kubenswrapper[4743]: I1122 09:56:45.520138 4743 generic.go:334] "Generic (PLEG): container finished" podID="06eb355f-ad73-42ff-8549-9a2d8a71b5f8" containerID="628653c160e74df425da8386b159f78e50c01cfd9a66c71526c46757b2f93862" exitCode=0 Nov 22 09:56:45 crc kubenswrapper[4743]: I1122 09:56:45.520191 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" event={"ID":"06eb355f-ad73-42ff-8549-9a2d8a71b5f8","Type":"ContainerDied","Data":"628653c160e74df425da8386b159f78e50c01cfd9a66c71526c46757b2f93862"} Nov 22 09:56:45 crc kubenswrapper[4743]: I1122 09:56:45.520218 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" event={"ID":"06eb355f-ad73-42ff-8549-9a2d8a71b5f8","Type":"ContainerStarted","Data":"d248483764fc1c420e6fda4dd9b9eef1dbdd40fe72223d9986b6c5dfaa52ee7f"} Nov 22 09:56:46 crc kubenswrapper[4743]: I1122 09:56:46.530165 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca","Type":"ContainerStarted","Data":"6d7f8971e63edda8c64a98c975296a38dc72607c2037d84f94daa432eafbfc8f"} Nov 22 09:56:46 crc kubenswrapper[4743]: I1122 09:56:46.530701 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca","Type":"ContainerStarted","Data":"fd628be2e89ecbb90cfd902467d7fc2b03732c226593dcee6b465cafeaa721ec"} Nov 22 09:56:46 crc kubenswrapper[4743]: I1122 09:56:46.530740 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 22 09:56:46 crc kubenswrapper[4743]: I1122 09:56:46.534286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" event={"ID":"06eb355f-ad73-42ff-8549-9a2d8a71b5f8","Type":"ContainerStarted","Data":"286f64e0c4317765d7f018c71499c21348eb7d8936b03824d2a454948f4a7504"} Nov 22 09:56:46 crc kubenswrapper[4743]: I1122 09:56:46.534504 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 09:56:46 crc kubenswrapper[4743]: I1122 09:56:46.551533 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.5515115440000002 podStartE2EDuration="3.551511544s" podCreationTimestamp="2025-11-22 09:56:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:56:46.54684965 +0000 UTC m=+5680.253210702" watchObservedRunningTime="2025-11-22 09:56:46.551511544 +0000 UTC m=+5680.257872596" Nov 22 09:56:46 crc kubenswrapper[4743]: I1122 09:56:46.581080 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" 
Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.144749 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl"
Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.216361 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qh62j"]
Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.216896 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j" podUID="56209ff1-f81f-4fa2-be9e-d3387f03a7a7" containerName="dnsmasq-dns" containerID="cri-o://87a30abfcd59ad2cb87cb358f309afae08a4d6060569e74d50f3b883bb3f113c" gracePeriod=10
Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.622665 4743 generic.go:334] "Generic (PLEG): container finished" podID="56209ff1-f81f-4fa2-be9e-d3387f03a7a7" containerID="87a30abfcd59ad2cb87cb358f309afae08a4d6060569e74d50f3b883bb3f113c" exitCode=0
Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.622857 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j" event={"ID":"56209ff1-f81f-4fa2-be9e-d3387f03a7a7","Type":"ContainerDied","Data":"87a30abfcd59ad2cb87cb358f309afae08a4d6060569e74d50f3b883bb3f113c"}
Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.795804 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j"
Need to start a new one" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j" Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.926625 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-config\") pod \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.926875 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-sb\") pod \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.926950 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-dns-svc\") pod \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.926978 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-nb\") pod \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.927050 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkh2v\" (UniqueName: \"kubernetes.io/projected/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-kube-api-access-tkh2v\") pod \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\" (UID: \"56209ff1-f81f-4fa2-be9e-d3387f03a7a7\") " Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.946360 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-kube-api-access-tkh2v" (OuterVolumeSpecName: "kube-api-access-tkh2v") pod "56209ff1-f81f-4fa2-be9e-d3387f03a7a7" (UID: "56209ff1-f81f-4fa2-be9e-d3387f03a7a7"). InnerVolumeSpecName "kube-api-access-tkh2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.978256 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "56209ff1-f81f-4fa2-be9e-d3387f03a7a7" (UID: "56209ff1-f81f-4fa2-be9e-d3387f03a7a7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.983602 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56209ff1-f81f-4fa2-be9e-d3387f03a7a7" (UID: "56209ff1-f81f-4fa2-be9e-d3387f03a7a7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.986794 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-config" (OuterVolumeSpecName: "config") pod "56209ff1-f81f-4fa2-be9e-d3387f03a7a7" (UID: "56209ff1-f81f-4fa2-be9e-d3387f03a7a7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:56:54 crc kubenswrapper[4743]: I1122 09:56:54.987873 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56209ff1-f81f-4fa2-be9e-d3387f03a7a7" (UID: "56209ff1-f81f-4fa2-be9e-d3387f03a7a7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.029065 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-config\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.029111 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.029126 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.029139 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.029151 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkh2v\" (UniqueName: \"kubernetes.io/projected/56209ff1-f81f-4fa2-be9e-d3387f03a7a7-kube-api-access-tkh2v\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.632174 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j" event={"ID":"56209ff1-f81f-4fa2-be9e-d3387f03a7a7","Type":"ContainerDied","Data":"41b6331a27bdffea327e4ae19c1f43f29d05737e091b0bb0b5571cf898b7adba"} Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.632228 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b8d7d4fc-qh62j" Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.632235 4743 scope.go:117] "RemoveContainer" containerID="87a30abfcd59ad2cb87cb358f309afae08a4d6060569e74d50f3b883bb3f113c" Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.662943 4743 scope.go:117] "RemoveContainer" containerID="09e314644d1ac19ffd87beba67e8a4ec3b28113ddf94afd68e1f5069d3fd7a85" Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.665015 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qh62j"] Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.677132 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64b8d7d4fc-qh62j"] Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.749847 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.750355 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="f6baf6fa-4d48-46c6-92db-3512a541e3b4" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589" gracePeriod=30 Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.762730 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.762979 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7743a75b-660a-489f-a88c-4fe0e0c793e8" containerName="nova-scheduler-scheduler" containerID="cri-o://f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089" gracePeriod=30 Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.774436 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.774698 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a12092a8-67c2-479d-81de-b879afb81749" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c" gracePeriod=30 Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.783009 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.783302 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-log" containerID="cri-o://e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17" gracePeriod=30 Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.783354 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-api" containerID="cri-o://da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d" gracePeriod=30 Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.792019 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.792282 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" 
containerName="nova-metadata-log" containerID="cri-o://bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c" gracePeriod=30 Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.792443 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerName="nova-metadata-metadata" containerID="cri-o://b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71" gracePeriod=30 Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.826146 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:56:55 crc kubenswrapper[4743]: I1122 09:56:55.826343 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f7a220c5-b663-44ac-82f5-46769c94f7a3" containerName="nova-cell1-conductor-conductor" containerID="cri-o://f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c" gracePeriod=30 Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.539946 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:56 crc kubenswrapper[4743]: E1122 09:56:56.542207 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 22 09:56:56 crc kubenswrapper[4743]: E1122 09:56:56.543787 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 22 09:56:56 crc kubenswrapper[4743]: E1122 09:56:56.545281 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Nov 22 09:56:56 crc kubenswrapper[4743]: E1122 09:56:56.545349 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="f6baf6fa-4d48-46c6-92db-3512a541e3b4" containerName="nova-cell0-conductor-conductor" Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.657618 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-config-data\") pod \"a12092a8-67c2-479d-81de-b879afb81749\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.657676 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-combined-ca-bundle\") pod \"a12092a8-67c2-479d-81de-b879afb81749\" (UID: \"a12092a8-67c2-479d-81de-b879afb81749\") " Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.657796 4743 
Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.667447 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a12092a8-67c2-479d-81de-b879afb81749-kube-api-access-bl92q" (OuterVolumeSpecName: "kube-api-access-bl92q") pod "a12092a8-67c2-479d-81de-b879afb81749" (UID: "a12092a8-67c2-479d-81de-b879afb81749"). InnerVolumeSpecName "kube-api-access-bl92q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.672896 4743 generic.go:334] "Generic (PLEG): container finished" podID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerID="e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17" exitCode=143
Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.673062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb","Type":"ContainerDied","Data":"e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17"}
Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.690193 4743 generic.go:334] "Generic (PLEG): container finished" podID="a12092a8-67c2-479d-81de-b879afb81749" containerID="2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c" exitCode=0
Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.690258 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a12092a8-67c2-479d-81de-b879afb81749","Type":"ContainerDied","Data":"2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c"}
Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.690287 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a12092a8-67c2-479d-81de-b879afb81749","Type":"ContainerDied","Data":"b27d9f8ff12be71d9ebe4bd6a0ec0fa0395c97fcad92b11ff6018e6da9cf826f"}
Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.690311 4743 scope.go:117] "RemoveContainer" containerID="2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c"
Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.690422 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.699808 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a12092a8-67c2-479d-81de-b879afb81749" (UID: "a12092a8-67c2-479d-81de-b879afb81749"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.701679 4743 generic.go:334] "Generic (PLEG): container finished" podID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerID="bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c" exitCode=143 Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.701876 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d05f87f5-3725-48c9-b295-dfc1dd42d40e","Type":"ContainerDied","Data":"bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c"} Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.718889 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-config-data" (OuterVolumeSpecName: "config-data") pod "a12092a8-67c2-479d-81de-b879afb81749" (UID: "a12092a8-67c2-479d-81de-b879afb81749"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.760269 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.760310 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12092a8-67c2-479d-81de-b879afb81749-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.760328 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl92q\" (UniqueName: \"kubernetes.io/projected/a12092a8-67c2-479d-81de-b879afb81749-kube-api-access-bl92q\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.760464 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.851314 4743 scope.go:117] "RemoveContainer" containerID="2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c" Nov 22 09:56:56 crc kubenswrapper[4743]: E1122 09:56:56.852053 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c\": container with ID starting with 2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c not found: ID does not exist" containerID="2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c" Nov 22 09:56:56 crc kubenswrapper[4743]: I1122 09:56:56.852173 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c"} err="failed to get container status \"2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c\": rpc error: code = NotFound desc = could not find container \"2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c\": container with ID starting with 2e820d6567f211d111404f792dbc636b03089481532c79c6c3e8a202f3cf520c not found: ID does not exist" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.025281 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.040832 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.052375 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 09:56:57 crc kubenswrapper[4743]: E1122 09:56:57.052844 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56209ff1-f81f-4fa2-be9e-d3387f03a7a7" containerName="init" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.052862 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="56209ff1-f81f-4fa2-be9e-d3387f03a7a7" containerName="init" Nov 22 09:56:57 crc kubenswrapper[4743]: E1122 09:56:57.052903 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a12092a8-67c2-479d-81de-b879afb81749" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.052910 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a12092a8-67c2-479d-81de-b879afb81749" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 09:56:57 crc kubenswrapper[4743]: E1122 09:56:57.052925 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56209ff1-f81f-4fa2-be9e-d3387f03a7a7" containerName="dnsmasq-dns" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.052931 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="56209ff1-f81f-4fa2-be9e-d3387f03a7a7" containerName="dnsmasq-dns" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.053117 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a12092a8-67c2-479d-81de-b879afb81749" containerName="nova-cell1-novncproxy-novncproxy" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.053139 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="56209ff1-f81f-4fa2-be9e-d3387f03a7a7" containerName="dnsmasq-dns" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.053863 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.056194 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.060713 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.167784 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56209ff1-f81f-4fa2-be9e-d3387f03a7a7" path="/var/lib/kubelet/pods/56209ff1-f81f-4fa2-be9e-d3387f03a7a7/volumes" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.168024 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.168111 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.168222 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzgnk\" (UniqueName: \"kubernetes.io/projected/ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0-kube-api-access-mzgnk\") pod \"nova-cell1-novncproxy-0\" (UID: \"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.168840 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a12092a8-67c2-479d-81de-b879afb81749" path="/var/lib/kubelet/pods/a12092a8-67c2-479d-81de-b879afb81749/volumes" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.270422 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.270888 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzgnk\" (UniqueName: \"kubernetes.io/projected/ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0-kube-api-access-mzgnk\") pod \"nova-cell1-novncproxy-0\" (UID: \"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.271075 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.276710 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.282418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.292699 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzgnk\" (UniqueName: \"kubernetes.io/projected/ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0-kube-api-access-mzgnk\") pod \"nova-cell1-novncproxy-0\" (UID: \"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0\") " pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.391127 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:56:57 crc kubenswrapper[4743]: I1122 09:56:57.839129 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Nov 22 09:56:57 crc kubenswrapper[4743]: W1122 09:56:57.839732 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca4774d0_09fd_4ea1_8445_3f3d7ecdb3e0.slice/crio-c08ce3cdaefcd04b833d5d21d1afb18f65dd997f68ac673f4fd214f7e1947394 WatchSource:0}: Error finding container c08ce3cdaefcd04b833d5d21d1afb18f65dd997f68ac673f4fd214f7e1947394: Status 404 returned error can't find the container with id c08ce3cdaefcd04b833d5d21d1afb18f65dd997f68ac673f4fd214f7e1947394 Nov 22 09:56:58 crc kubenswrapper[4743]: I1122 09:56:58.721362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0","Type":"ContainerStarted","Data":"459caeaf76e9fe26516e18768ba2895aa86368f4768d391bfd759361baa5978b"} Nov 22 09:56:58 crc kubenswrapper[4743]: I1122 09:56:58.721731 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0","Type":"ContainerStarted","Data":"c08ce3cdaefcd04b833d5d21d1afb18f65dd997f68ac673f4fd214f7e1947394"} Nov 22 09:56:58 crc kubenswrapper[4743]: I1122 09:56:58.742621 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.742599698 podStartE2EDuration="1.742599698s" podCreationTimestamp="2025-11-22 09:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:56:58.737256214 +0000 UTC m=+5692.443617266" watchObservedRunningTime="2025-11-22 09:56:58.742599698 +0000 UTC m=+5692.448960750" Nov 22 09:56:58 crc kubenswrapper[4743]: I1122 09:56:58.942166 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": read tcp 10.217.0.2:46440->10.217.1.73:8774: read: connection reset by peer" Nov 22 09:56:58 crc kubenswrapper[4743]: I1122 09:56:58.942476 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.73:8774/\": read tcp 
10.217.0.2:46428->10.217.1.73:8774: read: connection reset by peer" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.305277 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.417839 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-config-data\") pod \"f7a220c5-b663-44ac-82f5-46769c94f7a3\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.417905 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-combined-ca-bundle\") pod \"f7a220c5-b663-44ac-82f5-46769c94f7a3\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.418009 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl2sk\" (UniqueName: \"kubernetes.io/projected/f7a220c5-b663-44ac-82f5-46769c94f7a3-kube-api-access-kl2sk\") pod \"f7a220c5-b663-44ac-82f5-46769c94f7a3\" (UID: \"f7a220c5-b663-44ac-82f5-46769c94f7a3\") " Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.434489 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a220c5-b663-44ac-82f5-46769c94f7a3-kube-api-access-kl2sk" (OuterVolumeSpecName: "kube-api-access-kl2sk") pod "f7a220c5-b663-44ac-82f5-46769c94f7a3" (UID: "f7a220c5-b663-44ac-82f5-46769c94f7a3"). InnerVolumeSpecName "kube-api-access-kl2sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.456855 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-config-data" (OuterVolumeSpecName: "config-data") pod "f7a220c5-b663-44ac-82f5-46769c94f7a3" (UID: "f7a220c5-b663-44ac-82f5-46769c94f7a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.464272 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7a220c5-b663-44ac-82f5-46769c94f7a3" (UID: "f7a220c5-b663-44ac-82f5-46769c94f7a3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.487449 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.496526 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.520857 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.520893 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7a220c5-b663-44ac-82f5-46769c94f7a3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.520906 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl2sk\" (UniqueName: \"kubernetes.io/projected/f7a220c5-b663-44ac-82f5-46769c94f7a3-kube-api-access-kl2sk\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.529939 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.542949 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.544907 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.544967 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7743a75b-660a-489f-a88c-4fe0e0c793e8" containerName="nova-scheduler-scheduler" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.622251 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05f87f5-3725-48c9-b295-dfc1dd42d40e-logs\") pod \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.622362 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92k8d\" (UniqueName: \"kubernetes.io/projected/d05f87f5-3725-48c9-b295-dfc1dd42d40e-kube-api-access-92k8d\") pod \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") " Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.622382 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvg6g\" (UniqueName: \"kubernetes.io/projected/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-kube-api-access-wvg6g\") pod \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") " Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.622431 
Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.622480 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-combined-ca-bundle\") pod \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\" (UID: \"d05f87f5-3725-48c9-b295-dfc1dd42d40e\") "
Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.622524 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-logs\") pod \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") "
Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.622554 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-combined-ca-bundle\") pod \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") "
Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.622641 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-config-data\") pod \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\" (UID: \"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb\") "
Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.623820 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d05f87f5-3725-48c9-b295-dfc1dd42d40e-logs" (OuterVolumeSpecName: "logs") pod "d05f87f5-3725-48c9-b295-dfc1dd42d40e" (UID: "d05f87f5-3725-48c9-b295-dfc1dd42d40e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.624051 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-logs" (OuterVolumeSpecName: "logs") pod "51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" (UID: "51355ddf-602c-4f1b-b5e7-ae859e1f1dbb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.630276 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-kube-api-access-wvg6g" (OuterVolumeSpecName: "kube-api-access-wvg6g") pod "51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" (UID: "51355ddf-602c-4f1b-b5e7-ae859e1f1dbb"). InnerVolumeSpecName "kube-api-access-wvg6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.634393 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d05f87f5-3725-48c9-b295-dfc1dd42d40e-kube-api-access-92k8d" (OuterVolumeSpecName: "kube-api-access-92k8d") pod "d05f87f5-3725-48c9-b295-dfc1dd42d40e" (UID: "d05f87f5-3725-48c9-b295-dfc1dd42d40e"). InnerVolumeSpecName "kube-api-access-92k8d". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.655994 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" (UID: "51355ddf-602c-4f1b-b5e7-ae859e1f1dbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.665728 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-config-data" (OuterVolumeSpecName: "config-data") pod "d05f87f5-3725-48c9-b295-dfc1dd42d40e" (UID: "d05f87f5-3725-48c9-b295-dfc1dd42d40e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.668979 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d05f87f5-3725-48c9-b295-dfc1dd42d40e" (UID: "d05f87f5-3725-48c9-b295-dfc1dd42d40e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.671933 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-config-data" (OuterVolumeSpecName: "config-data") pod "51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" (UID: "51355ddf-602c-4f1b-b5e7-ae859e1f1dbb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.724126 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d05f87f5-3725-48c9-b295-dfc1dd42d40e-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.724152 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92k8d\" (UniqueName: \"kubernetes.io/projected/d05f87f5-3725-48c9-b295-dfc1dd42d40e-kube-api-access-92k8d\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.724162 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvg6g\" (UniqueName: \"kubernetes.io/projected/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-kube-api-access-wvg6g\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.724171 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.724181 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d05f87f5-3725-48c9-b295-dfc1dd42d40e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.724191 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.724201 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.724213 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.735885 4743 generic.go:334] "Generic (PLEG): container finished" podID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerID="da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d" exitCode=0 Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.735988 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb","Type":"ContainerDied","Data":"da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d"} Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.736423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"51355ddf-602c-4f1b-b5e7-ae859e1f1dbb","Type":"ContainerDied","Data":"30119c1871eb582d544a898d8ba52581bef84a2c28bb95c7af129766c126e63f"} Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.736444 4743 scope.go:117] "RemoveContainer" containerID="da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.736671 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.737698 4743 generic.go:334] "Generic (PLEG): container finished" podID="f7a220c5-b663-44ac-82f5-46769c94f7a3" containerID="f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c" exitCode=0 Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.737754 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7a220c5-b663-44ac-82f5-46769c94f7a3","Type":"ContainerDied","Data":"f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c"} Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.737760 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.737785 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f7a220c5-b663-44ac-82f5-46769c94f7a3","Type":"ContainerDied","Data":"97b10309ce826eb01b89192fd2622c303b199fda0bff4c30c03631ae5dff9b4c"} Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.749021 4743 generic.go:334] "Generic (PLEG): container finished" podID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerID="b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71" exitCode=0 Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.750328 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.750631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d05f87f5-3725-48c9-b295-dfc1dd42d40e","Type":"ContainerDied","Data":"b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71"} Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.751821 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d05f87f5-3725-48c9-b295-dfc1dd42d40e","Type":"ContainerDied","Data":"f9a145aa44f50482354ed76b70e02f3e379e744d52a5eb7456c431356507080f"} Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.762518 4743 scope.go:117] "RemoveContainer" containerID="e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.819654 4743 scope.go:117] "RemoveContainer" containerID="da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d" Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.821113 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d\": container with ID starting with da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d not found: ID does not exist" containerID="da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.821157 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d"} err="failed to get container status \"da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d\": rpc error: code = NotFound desc = could not find container \"da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d\": container with ID starting with da3d23b8065fe995dcca690c68f7d6ca4baf629d9957360f5835ad67504cc56d not found: ID does not exist" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.821182 4743 scope.go:117] "RemoveContainer" containerID="e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17" Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.824394 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17\": container with ID starting with e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17 not found: ID does not exist" containerID="e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.824432 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17"} err="failed to get container status \"e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17\": rpc error: code = NotFound desc = could not find container \"e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17\": container with ID starting with e2c744606cd97a762b1920784c9b341812b658adef1f337f5669f637e6bdcc17 not found: ID does not exist" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.824457 4743 scope.go:117] "RemoveContainer" containerID="f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.837554 4743 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.851636 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.859622 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.860284 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-api" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.860377 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-api" Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.860458 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerName="nova-metadata-metadata" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.860528 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerName="nova-metadata-metadata" Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.860674 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerName="nova-metadata-log" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.863349 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerName="nova-metadata-log" Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.863599 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a220c5-b663-44ac-82f5-46769c94f7a3" containerName="nova-cell1-conductor-conductor" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.863692 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a220c5-b663-44ac-82f5-46769c94f7a3" containerName="nova-cell1-conductor-conductor" Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.863818 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-log" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.863906 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-log" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.864633 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a220c5-b663-44ac-82f5-46769c94f7a3" containerName="nova-cell1-conductor-conductor" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.868204 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerName="nova-metadata-log" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.868508 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-api" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.868659 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" containerName="nova-metadata-metadata" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.868743 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" containerName="nova-api-log" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.869709 4743 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.870827 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.873908 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.950035 4743 scope.go:117] "RemoveContainer" containerID="f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c" Nov 22 09:56:59 crc kubenswrapper[4743]: E1122 09:56:59.953808 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c\": container with ID starting with f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c not found: ID does not exist" containerID="f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.953858 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c"} err="failed to get container status \"f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c\": rpc error: code = NotFound desc = could not find container \"f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c\": container with ID starting with f7d782efcdbcb60d0507b33b65d1ae2c8eba4b3ecb6f3fa51164dc66f20ae71c not found: ID does not exist" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.953887 4743 scope.go:117] "RemoveContainer" containerID="b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71" Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.965645 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.975434 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:56:59 crc kubenswrapper[4743]: I1122 09:56:59.998651 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.006378 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.015605 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.017256 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.019466 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.020894 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.032134 4743 scope.go:117] "RemoveContainer" containerID="bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.032607 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.040369 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.040439 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.040598 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq5vn\" (UniqueName: \"kubernetes.io/projected/e9508ef1-7649-4ffd-84af-de9884f26e1c-kube-api-access-vq5vn\") pod \"nova-cell1-conductor-0\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.041762 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.047157 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.058821 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.067243 4743 scope.go:117] "RemoveContainer" containerID="b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71" Nov 22 09:57:00 crc kubenswrapper[4743]: E1122 09:57:00.068921 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71\": container with ID starting with b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71 not found: ID does not exist" containerID="b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.068997 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71"} err="failed to get container status \"b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71\": rpc error: code = NotFound desc = could not find container \"b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71\": container with ID starting with b5e873a0687ff50af343fd83c0066dab9b2a5b66aeeff598a1873fe841212a71 not found: ID does not exist" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.069034 4743 scope.go:117] "RemoveContainer" containerID="bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c" Nov 22 09:57:00 crc kubenswrapper[4743]: E1122 09:57:00.070055 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c\": container with ID starting with bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c not found: ID does not exist" containerID="bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.070099 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c"} err="failed to get container status \"bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c\": rpc error: code = NotFound desc = could not find container \"bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c\": container with ID starting with bc730c55e5b7e99c0b2eff10e3992e0761ed76074ab5a9b55a52a5e66234cb7c not found: ID does not exist" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143394 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143484 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ced675f6-5342-4162-bf69-d8250ee6ba58-logs\") pod \"nova-metadata-0\" (UID: 
\"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143502 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-logs\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143524 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-config-data\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143561 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-config-data\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143652 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143677 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq5vn\" (UniqueName: \"kubernetes.io/projected/e9508ef1-7649-4ffd-84af-de9884f26e1c-kube-api-access-vq5vn\") pod \"nova-cell1-conductor-0\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143709 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143744 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzksw\" (UniqueName: \"kubernetes.io/projected/ced675f6-5342-4162-bf69-d8250ee6ba58-kube-api-access-nzksw\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.143779 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xxvn\" (UniqueName: \"kubernetes.io/projected/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-kube-api-access-2xxvn\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.148132 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.148872 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.159364 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq5vn\" (UniqueName: \"kubernetes.io/projected/e9508ef1-7649-4ffd-84af-de9884f26e1c-kube-api-access-vq5vn\") pod \"nova-cell1-conductor-0\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") " pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.245515 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-config-data\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.245621 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.245718 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzksw\" (UniqueName: \"kubernetes.io/projected/ced675f6-5342-4162-bf69-d8250ee6ba58-kube-api-access-nzksw\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.245818 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xxvn\" (UniqueName: \"kubernetes.io/projected/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-kube-api-access-2xxvn\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.245869 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.245928 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ced675f6-5342-4162-bf69-d8250ee6ba58-logs\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.245952 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-logs\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 
crc kubenswrapper[4743]: I1122 09:57:00.245976 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-config-data\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.246804 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ced675f6-5342-4162-bf69-d8250ee6ba58-logs\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.246810 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-logs\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.249704 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-config-data\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.249775 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-config-data\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.249812 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.251407 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.251688 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.269353 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzksw\" (UniqueName: \"kubernetes.io/projected/ced675f6-5342-4162-bf69-d8250ee6ba58-kube-api-access-nzksw\") pod \"nova-metadata-0\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") " pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.269914 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xxvn\" (UniqueName: \"kubernetes.io/projected/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-kube-api-access-2xxvn\") pod \"nova-api-0\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") " pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.351398 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.363616 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.665940 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.755286 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-combined-ca-bundle\") pod \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.756007 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw8dz\" (UniqueName: \"kubernetes.io/projected/f6baf6fa-4d48-46c6-92db-3512a541e3b4-kube-api-access-lw8dz\") pod \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.756056 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-config-data\") pod \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\" (UID: \"f6baf6fa-4d48-46c6-92db-3512a541e3b4\") " Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.764155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6baf6fa-4d48-46c6-92db-3512a541e3b4-kube-api-access-lw8dz" (OuterVolumeSpecName: "kube-api-access-lw8dz") pod "f6baf6fa-4d48-46c6-92db-3512a541e3b4" (UID: "f6baf6fa-4d48-46c6-92db-3512a541e3b4"). InnerVolumeSpecName "kube-api-access-lw8dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.771751 4743 generic.go:334] "Generic (PLEG): container finished" podID="f6baf6fa-4d48-46c6-92db-3512a541e3b4" containerID="8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589" exitCode=0 Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.771801 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f6baf6fa-4d48-46c6-92db-3512a541e3b4","Type":"ContainerDied","Data":"8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589"} Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.771830 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"f6baf6fa-4d48-46c6-92db-3512a541e3b4","Type":"ContainerDied","Data":"03b329669a9f7bf0ed08a869af519b986d09053ec92dc747e0a3b76735540216"} Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.771852 4743 scope.go:117] "RemoveContainer" containerID="8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.772189 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.787696 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6baf6fa-4d48-46c6-92db-3512a541e3b4" (UID: "f6baf6fa-4d48-46c6-92db-3512a541e3b4"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.818677 4743 scope.go:117] "RemoveContainer" containerID="8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.818684 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-config-data" (OuterVolumeSpecName: "config-data") pod "f6baf6fa-4d48-46c6-92db-3512a541e3b4" (UID: "f6baf6fa-4d48-46c6-92db-3512a541e3b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:00 crc kubenswrapper[4743]: E1122 09:57:00.819075 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589\": container with ID starting with 8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589 not found: ID does not exist" containerID="8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.819119 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589"} err="failed to get container status \"8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589\": rpc error: code = NotFound desc = could not find container \"8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589\": container with ID starting with 8e90b35da369dcb27c4a13bf3779bf96ed48a97a6271c47a6e6cf0ba561a5589 not found: ID does not exist" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.860108 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.860492 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw8dz\" (UniqueName: \"kubernetes.io/projected/f6baf6fa-4d48-46c6-92db-3512a541e3b4-kube-api-access-lw8dz\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.860566 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6baf6fa-4d48-46c6-92db-3512a541e3b4-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.874033 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 09:57:00 crc kubenswrapper[4743]: I1122 09:57:00.953551 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 09:57:00 crc kubenswrapper[4743]: W1122 09:57:00.999879 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3c354a8_2d13_4d82_9bd5_1311e1fc5f86.slice/crio-c2081bc6e343bda29d9986ce0d15ed78c154f2418843d610782542d621e60c81 WatchSource:0}: Error finding container c2081bc6e343bda29d9986ce0d15ed78c154f2418843d610782542d621e60c81: Status 404 returned error can't find the container with id c2081bc6e343bda29d9986ce0d15ed78c154f2418843d610782542d621e60c81 Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.057704 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.167874 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51355ddf-602c-4f1b-b5e7-ae859e1f1dbb" path="/var/lib/kubelet/pods/51355ddf-602c-4f1b-b5e7-ae859e1f1dbb/volumes" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.168622 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d05f87f5-3725-48c9-b295-dfc1dd42d40e" path="/var/lib/kubelet/pods/d05f87f5-3725-48c9-b295-dfc1dd42d40e/volumes" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.175379 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a220c5-b663-44ac-82f5-46769c94f7a3" path="/var/lib/kubelet/pods/f7a220c5-b663-44ac-82f5-46769c94f7a3/volumes" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.176846 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.192340 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.210394 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:57:01 crc kubenswrapper[4743]: E1122 09:57:01.210907 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6baf6fa-4d48-46c6-92db-3512a541e3b4" containerName="nova-cell0-conductor-conductor" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.210935 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6baf6fa-4d48-46c6-92db-3512a541e3b4" containerName="nova-cell0-conductor-conductor" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.211173 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6baf6fa-4d48-46c6-92db-3512a541e3b4" containerName="nova-cell0-conductor-conductor" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.216684 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.218964 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.221308 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.374047 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf4lk\" (UniqueName: \"kubernetes.io/projected/375736c0-507a-4cb9-bf8d-b0827eb30630-kube-api-access-gf4lk\") pod \"nova-cell0-conductor-0\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.374270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.374348 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.476641 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf4lk\" (UniqueName: \"kubernetes.io/projected/375736c0-507a-4cb9-bf8d-b0827eb30630-kube-api-access-gf4lk\") pod \"nova-cell0-conductor-0\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.476739 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.476767 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.480654 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.480694 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.497673 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf4lk\" (UniqueName: \"kubernetes.io/projected/375736c0-507a-4cb9-bf8d-b0827eb30630-kube-api-access-gf4lk\") pod \"nova-cell0-conductor-0\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") " pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.541393 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.790286 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86","Type":"ContainerStarted","Data":"5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786"} Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.790648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86","Type":"ContainerStarted","Data":"10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b"} Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.790664 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86","Type":"ContainerStarted","Data":"c2081bc6e343bda29d9986ce0d15ed78c154f2418843d610782542d621e60c81"} Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.793541 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e9508ef1-7649-4ffd-84af-de9884f26e1c","Type":"ContainerStarted","Data":"1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932"} Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.793568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e9508ef1-7649-4ffd-84af-de9884f26e1c","Type":"ContainerStarted","Data":"4f5d04b5c6f2fb4c61438f14e416ce860762637fd805754a3ad943dfa138f3a0"} Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.793896 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.798334 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ced675f6-5342-4162-bf69-d8250ee6ba58","Type":"ContainerStarted","Data":"7987c3233e70cdcf23c8dbb58b86bb2c3daa8bc7feacdc54bb9342ae2e49bbc3"} Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.798373 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ced675f6-5342-4162-bf69-d8250ee6ba58","Type":"ContainerStarted","Data":"60f51e33a98a40374c2117fb4eaa128dc055495b07382c37e2ccde713562a10d"} Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.798388 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ced675f6-5342-4162-bf69-d8250ee6ba58","Type":"ContainerStarted","Data":"e07d632807aebf976a7a4c4ce0859a47f2995c4071fa7a6fde07ed41e11c2412"} Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.825674 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.825652999 podStartE2EDuration="2.825652999s" podCreationTimestamp="2025-11-22 09:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:57:01.812176472 +0000 UTC m=+5695.518537534" 
watchObservedRunningTime="2025-11-22 09:57:01.825652999 +0000 UTC m=+5695.532014051" Nov 22 09:57:01 crc kubenswrapper[4743]: I1122 09:57:01.856314 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.856289299 podStartE2EDuration="2.856289299s" podCreationTimestamp="2025-11-22 09:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:57:01.831639051 +0000 UTC m=+5695.538000123" watchObservedRunningTime="2025-11-22 09:57:01.856289299 +0000 UTC m=+5695.562650351" Nov 22 09:57:02 crc kubenswrapper[4743]: I1122 09:57:02.086449 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.08642863 podStartE2EDuration="3.08642863s" podCreationTimestamp="2025-11-22 09:56:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:57:01.879517026 +0000 UTC m=+5695.585878068" watchObservedRunningTime="2025-11-22 09:57:02.08642863 +0000 UTC m=+5695.792789682" Nov 22 09:57:02 crc kubenswrapper[4743]: I1122 09:57:02.087670 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 09:57:02 crc kubenswrapper[4743]: I1122 09:57:02.392266 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:57:02 crc kubenswrapper[4743]: I1122 09:57:02.820261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"375736c0-507a-4cb9-bf8d-b0827eb30630","Type":"ContainerStarted","Data":"524a484ad4860bf6973687c1a8123e31d7919c0e0bf05519bcf74918c9eae0ae"} Nov 22 09:57:02 crc kubenswrapper[4743]: I1122 09:57:02.820716 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"375736c0-507a-4cb9-bf8d-b0827eb30630","Type":"ContainerStarted","Data":"361bddd3c49bbf423431c1a60063dce1d35a2bb28c8f44310dc2a957f5dec040"} Nov 22 09:57:02 crc kubenswrapper[4743]: I1122 09:57:02.852353 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.852335436 podStartE2EDuration="1.852335436s" podCreationTimestamp="2025-11-22 09:57:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:57:02.848857186 +0000 UTC m=+5696.555218258" watchObservedRunningTime="2025-11-22 09:57:02.852335436 +0000 UTC m=+5696.558696488" Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.163076 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6baf6fa-4d48-46c6-92db-3512a541e3b4" path="/var/lib/kubelet/pods/f6baf6fa-4d48-46c6-92db-3512a541e3b4/volumes" Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.754131 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.835262 4743 generic.go:334] "Generic (PLEG): container finished" podID="7743a75b-660a-489f-a88c-4fe0e0c793e8" containerID="f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089" exitCode=0 Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.835321 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.835322 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7743a75b-660a-489f-a88c-4fe0e0c793e8","Type":"ContainerDied","Data":"f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089"} Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.835384 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7743a75b-660a-489f-a88c-4fe0e0c793e8","Type":"ContainerDied","Data":"1fa56b120367daefeacd690c389b2401e5cc3bf96f08d41be555e92749b35770"} Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.835408 4743 scope.go:117] "RemoveContainer" containerID="f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089" Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.835488 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.862314 4743 scope.go:117] "RemoveContainer" containerID="f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089" Nov 22 09:57:03 crc kubenswrapper[4743]: E1122 09:57:03.862863 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089\": container with ID starting with f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089 not found: ID does not exist" containerID="f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089" Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.862914 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089"} err="failed to get container status \"f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089\": rpc error: code = NotFound desc = could not find container \"f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089\": container with ID starting with f5f8a44aad07e89c23c0fa379b2c60f59573f940e5062df83b620d3cde663089 not found: ID does not exist" Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.922198 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-combined-ca-bundle\") pod \"7743a75b-660a-489f-a88c-4fe0e0c793e8\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.922400 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-config-data\") pod \"7743a75b-660a-489f-a88c-4fe0e0c793e8\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.922492 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kftdq\" (UniqueName: \"kubernetes.io/projected/7743a75b-660a-489f-a88c-4fe0e0c793e8-kube-api-access-kftdq\") pod \"7743a75b-660a-489f-a88c-4fe0e0c793e8\" (UID: \"7743a75b-660a-489f-a88c-4fe0e0c793e8\") " Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.944265 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7743a75b-660a-489f-a88c-4fe0e0c793e8-kube-api-access-kftdq" 
(OuterVolumeSpecName: "kube-api-access-kftdq") pod "7743a75b-660a-489f-a88c-4fe0e0c793e8" (UID: "7743a75b-660a-489f-a88c-4fe0e0c793e8"). InnerVolumeSpecName "kube-api-access-kftdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.951733 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-config-data" (OuterVolumeSpecName: "config-data") pod "7743a75b-660a-489f-a88c-4fe0e0c793e8" (UID: "7743a75b-660a-489f-a88c-4fe0e0c793e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:03 crc kubenswrapper[4743]: I1122 09:57:03.953139 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7743a75b-660a-489f-a88c-4fe0e0c793e8" (UID: "7743a75b-660a-489f-a88c-4fe0e0c793e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.024498 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.025008 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kftdq\" (UniqueName: \"kubernetes.io/projected/7743a75b-660a-489f-a88c-4fe0e0c793e8-kube-api-access-kftdq\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.025020 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7743a75b-660a-489f-a88c-4fe0e0c793e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.166806 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.176260 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.203670 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:57:04 crc kubenswrapper[4743]: E1122 09:57:04.204348 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7743a75b-660a-489f-a88c-4fe0e0c793e8" containerName="nova-scheduler-scheduler" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.204474 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7743a75b-660a-489f-a88c-4fe0e0c793e8" containerName="nova-scheduler-scheduler" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.204790 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7743a75b-660a-489f-a88c-4fe0e0c793e8" containerName="nova-scheduler-scheduler" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.205678 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.212489 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.222329 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.330227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7tdj\" (UniqueName: \"kubernetes.io/projected/9b35026a-cef7-4dd0-9446-429b448f7ed9-kube-api-access-g7tdj\") pod \"nova-scheduler-0\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.330480 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.330778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-config-data\") pod \"nova-scheduler-0\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.432948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-config-data\") pod \"nova-scheduler-0\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.433025 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7tdj\" (UniqueName: \"kubernetes.io/projected/9b35026a-cef7-4dd0-9446-429b448f7ed9-kube-api-access-g7tdj\") pod \"nova-scheduler-0\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.433049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.436931 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-config-data\") pod \"nova-scheduler-0\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.450760 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.451045 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7tdj\" (UniqueName: 
\"kubernetes.io/projected/9b35026a-cef7-4dd0-9446-429b448f7ed9-kube-api-access-g7tdj\") pod \"nova-scheduler-0\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.528593 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 09:57:04 crc kubenswrapper[4743]: I1122 09:57:04.962532 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 09:57:05 crc kubenswrapper[4743]: I1122 09:57:05.161983 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7743a75b-660a-489f-a88c-4fe0e0c793e8" path="/var/lib/kubelet/pods/7743a75b-660a-489f-a88c-4fe0e0c793e8/volumes" Nov 22 09:57:05 crc kubenswrapper[4743]: I1122 09:57:05.364204 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:57:05 crc kubenswrapper[4743]: I1122 09:57:05.364648 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 09:57:05 crc kubenswrapper[4743]: I1122 09:57:05.854041 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b35026a-cef7-4dd0-9446-429b448f7ed9","Type":"ContainerStarted","Data":"73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635"} Nov 22 09:57:05 crc kubenswrapper[4743]: I1122 09:57:05.854086 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b35026a-cef7-4dd0-9446-429b448f7ed9","Type":"ContainerStarted","Data":"ca885b9877e2acb7dc9fdb15427eebd92441fc456b271973a7b5edec901980c0"} Nov 22 09:57:05 crc kubenswrapper[4743]: I1122 09:57:05.895120 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.895100628 podStartE2EDuration="1.895100628s" podCreationTimestamp="2025-11-22 09:57:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:57:05.868199005 +0000 UTC m=+5699.574560067" watchObservedRunningTime="2025-11-22 09:57:05.895100628 +0000 UTC m=+5699.601461680" Nov 22 09:57:07 crc kubenswrapper[4743]: I1122 09:57:07.392213 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:57:07 crc kubenswrapper[4743]: I1122 09:57:07.403466 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:57:07 crc kubenswrapper[4743]: I1122 09:57:07.895897 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Nov 22 09:57:09 crc kubenswrapper[4743]: I1122 09:57:09.528867 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 09:57:10 crc kubenswrapper[4743]: I1122 09:57:10.279036 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 22 09:57:10 crc kubenswrapper[4743]: I1122 09:57:10.351838 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:57:10 crc kubenswrapper[4743]: I1122 09:57:10.351889 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 09:57:10 crc kubenswrapper[4743]: I1122 09:57:10.363879 4743 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 09:57:10 crc kubenswrapper[4743]: I1122 09:57:10.364277 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 09:57:11 crc kubenswrapper[4743]: I1122 09:57:11.517808 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:57:11 crc kubenswrapper[4743]: I1122 09:57:11.517869 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:57:11 crc kubenswrapper[4743]: I1122 09:57:11.517894 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:57:11 crc kubenswrapper[4743]: I1122 09:57:11.517835 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 09:57:11 crc kubenswrapper[4743]: I1122 09:57:11.570042 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 22 09:57:12 crc kubenswrapper[4743]: E1122 09:57:12.002570 4743 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.245:39902->38.102.83.245:33143: write tcp 38.102.83.245:39902->38.102.83.245:33143: write: connection reset by peer Nov 22 09:57:14 crc kubenswrapper[4743]: I1122 09:57:14.529368 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 09:57:14 crc kubenswrapper[4743]: I1122 09:57:14.553276 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 09:57:14 crc kubenswrapper[4743]: I1122 09:57:14.974891 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.047056 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.048861 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.050981 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.063046 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.133684 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-scripts\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.133737 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efd57402-07ca-47bd-9f61-8228d5b3ff4c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.133766 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.133940 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.134125 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh65l\" (UniqueName: \"kubernetes.io/projected/efd57402-07ca-47bd-9f61-8228d5b3ff4c-kube-api-access-vh65l\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.134156 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.235976 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-scripts\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.236040 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efd57402-07ca-47bd-9f61-8228d5b3ff4c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.236062 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.236168 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.236212 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh65l\" (UniqueName: \"kubernetes.io/projected/efd57402-07ca-47bd-9f61-8228d5b3ff4c-kube-api-access-vh65l\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.236228 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.236889 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efd57402-07ca-47bd-9f61-8228d5b3ff4c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.250788 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.263704 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.273346 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-scripts\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.273471 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.276229 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh65l\" (UniqueName: \"kubernetes.io/projected/efd57402-07ca-47bd-9f61-8228d5b3ff4c-kube-api-access-vh65l\") pod \"cinder-scheduler-0\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " 
pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.382361 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.813513 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:57:15 crc kubenswrapper[4743]: I1122 09:57:15.998882 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"efd57402-07ca-47bd-9f61-8228d5b3ff4c","Type":"ContainerStarted","Data":"8ceb241a27d25be66d1432bb71311c93b95667dbd13b84d3c366126311b42595"} Nov 22 09:57:16 crc kubenswrapper[4743]: I1122 09:57:16.695628 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:57:16 crc kubenswrapper[4743]: I1122 09:57:16.696327 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerName="cinder-api-log" containerID="cri-o://fd628be2e89ecbb90cfd902467d7fc2b03732c226593dcee6b465cafeaa721ec" gracePeriod=30 Nov 22 09:57:16 crc kubenswrapper[4743]: I1122 09:57:16.696730 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerName="cinder-api" containerID="cri-o://6d7f8971e63edda8c64a98c975296a38dc72607c2037d84f94daa432eafbfc8f" gracePeriod=30 Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.020395 4743 generic.go:334] "Generic (PLEG): container finished" podID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerID="fd628be2e89ecbb90cfd902467d7fc2b03732c226593dcee6b465cafeaa721ec" exitCode=143 Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.020487 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca","Type":"ContainerDied","Data":"fd628be2e89ecbb90cfd902467d7fc2b03732c226593dcee6b465cafeaa721ec"} Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.023586 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"efd57402-07ca-47bd-9f61-8228d5b3ff4c","Type":"ContainerStarted","Data":"dfc85cadbbd916b7c1aadb3efb8ad370f614798cd8fe55e684ddf0642b67538b"} Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.138902 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.141429 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.144243 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.149667 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.296628 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.296732 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.296752 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.296827 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.296908 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.296960 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-run\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.296992 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.297073 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 
09:57:17.297092 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.297124 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.297180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.297239 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.297270 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.297288 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.297309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfgdz\" (UniqueName: \"kubernetes.io/projected/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-kube-api-access-pfgdz\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.297363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399364 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399394 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399432 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399463 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399515 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfgdz\" (UniqueName: \"kubernetes.io/projected/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-kube-api-access-pfgdz\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399542 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399563 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399633 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399659 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399698 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399769 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-run\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399790 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.399906 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.400854 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.400966 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-dev\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.401011 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.401157 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.401211 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.401245 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-run\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.401445 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.401474 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.401666 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-sys\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.406305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.407381 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.408801 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.409374 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " 
pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.415210 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.416491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfgdz\" (UniqueName: \"kubernetes.io/projected/d2fab6eb-6c8a-405d-9bb4-b393c1706e4b-kube-api-access-pfgdz\") pod \"cinder-volume-volume1-0\" (UID: \"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b\") " pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.511654 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.708478 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.711399 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.717375 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.721324 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.813549 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-etc-nvme\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.813735 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.813793 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7z96\" (UniqueName: \"kubernetes.io/projected/49819854-1fda-4d24-b2fc-43443fb9c1ef-kube-api-access-k7z96\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.813854 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.813944 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-lib-modules\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: 
I1122 09:57:17.813983 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.814009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-dev\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.814050 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-run\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.814072 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-config-data-custom\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.814114 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49819854-1fda-4d24-b2fc-43443fb9c1ef-ceph\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.814162 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.814187 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-sys\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.814225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.814251 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.814275 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-scripts\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.818253 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-config-data\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920097 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920141 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920160 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-scripts\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920218 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-config-data\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-etc-nvme\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920324 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920377 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-etc-nvme\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920460 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7z96\" (UniqueName: \"kubernetes.io/projected/49819854-1fda-4d24-b2fc-43443fb9c1ef-kube-api-access-k7z96\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920531 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920554 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-lib-modules\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920590 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920605 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920642 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-dev\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920618 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-dev\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920668 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920656 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920711 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" 
(UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-lib-modules\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920758 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-run\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-run\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920812 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-config-data-custom\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920847 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49819854-1fda-4d24-b2fc-43443fb9c1ef-ceph\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920927 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.920950 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-sys\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.921106 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49819854-1fda-4d24-b2fc-43443fb9c1ef-sys\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.925687 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49819854-1fda-4d24-b2fc-43443fb9c1ef-ceph\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.925817 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.926420 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-config-data-custom\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.926764 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-config-data\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.933002 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49819854-1fda-4d24-b2fc-43443fb9c1ef-scripts\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:17 crc kubenswrapper[4743]: I1122 09:57:17.949430 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7z96\" (UniqueName: \"kubernetes.io/projected/49819854-1fda-4d24-b2fc-43443fb9c1ef-kube-api-access-k7z96\") pod \"cinder-backup-0\" (UID: \"49819854-1fda-4d24-b2fc-43443fb9c1ef\") " pod="openstack/cinder-backup-0" Nov 22 09:57:18 crc kubenswrapper[4743]: I1122 09:57:18.033268 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"efd57402-07ca-47bd-9f61-8228d5b3ff4c","Type":"ContainerStarted","Data":"82d24f02b577255c5bde1a13bc50c12544aba7acf65f94aae2726367c1b40b96"} Nov 22 09:57:18 crc kubenswrapper[4743]: I1122 09:57:18.054065 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Nov 22 09:57:18 crc kubenswrapper[4743]: I1122 09:57:18.094133 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.094115741 podStartE2EDuration="3.094115741s" podCreationTimestamp="2025-11-22 09:57:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:57:18.053793813 +0000 UTC m=+5711.760154865" watchObservedRunningTime="2025-11-22 09:57:18.094115741 +0000 UTC m=+5711.800476793" Nov 22 09:57:18 crc kubenswrapper[4743]: I1122 09:57:18.103339 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Nov 22 09:57:18 crc kubenswrapper[4743]: W1122 09:57:18.115300 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2fab6eb_6c8a_405d_9bb4_b393c1706e4b.slice/crio-882398b3a449d2acf95b47424182c113f5d14c9c262d15ff882a0965863d6bf4 WatchSource:0}: Error finding container 882398b3a449d2acf95b47424182c113f5d14c9c262d15ff882a0965863d6bf4: Status 404 returned error can't find the container with id 882398b3a449d2acf95b47424182c113f5d14c9c262d15ff882a0965863d6bf4 Nov 22 09:57:18 crc kubenswrapper[4743]: I1122 09:57:18.120247 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 09:57:18 crc kubenswrapper[4743]: I1122 09:57:18.644986 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Nov 22 09:57:18 crc kubenswrapper[4743]: W1122 09:57:18.651849 4743 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49819854_1fda_4d24_b2fc_43443fb9c1ef.slice/crio-05a3646a38c64e049cd791f5745c8013559e8250e5d7eafd4f679d4c5df961f1 WatchSource:0}: Error finding container 05a3646a38c64e049cd791f5745c8013559e8250e5d7eafd4f679d4c5df961f1: Status 404 returned error can't find the container with id 05a3646a38c64e049cd791f5745c8013559e8250e5d7eafd4f679d4c5df961f1 Nov 22 09:57:19 crc kubenswrapper[4743]: I1122 09:57:19.044026 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b","Type":"ContainerStarted","Data":"882398b3a449d2acf95b47424182c113f5d14c9c262d15ff882a0965863d6bf4"} Nov 22 09:57:19 crc kubenswrapper[4743]: I1122 09:57:19.046368 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"49819854-1fda-4d24-b2fc-43443fb9c1ef","Type":"ContainerStarted","Data":"05a3646a38c64e049cd791f5745c8013559e8250e5d7eafd4f679d4c5df961f1"} Nov 22 09:57:19 crc kubenswrapper[4743]: I1122 09:57:19.834522 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.78:8776/healthcheck\": read tcp 10.217.0.2:59226->10.217.1.78:8776: read: connection reset by peer" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.069496 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b","Type":"ContainerStarted","Data":"8bd4c775ff4d9fafc438057069999000edfeb0725e61c30c0001abce400128f7"} Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.072645 4743 generic.go:334] "Generic (PLEG): container finished" podID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerID="6d7f8971e63edda8c64a98c975296a38dc72607c2037d84f94daa432eafbfc8f" exitCode=0 Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.072689 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca","Type":"ContainerDied","Data":"6d7f8971e63edda8c64a98c975296a38dc72607c2037d84f94daa432eafbfc8f"} Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.356828 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.357226 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.357620 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.357653 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.360328 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.363248 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.370058 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.370207 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.375906 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.378188 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.383371 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.746250 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.884737 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-etc-machine-id\") pod \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.885183 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data-custom\") pod \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.885226 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data\") pod \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.885276 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-scripts\") pod \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.885302 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-logs\") pod \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.885326 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-combined-ca-bundle\") pod \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.885467 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5l8wh\" (UniqueName: \"kubernetes.io/projected/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-kube-api-access-5l8wh\") pod \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\" (UID: \"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca\") " Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.886264 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-logs" (OuterVolumeSpecName: "logs") pod "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" (UID: "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.886623 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" (UID: "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.890713 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-kube-api-access-5l8wh" (OuterVolumeSpecName: "kube-api-access-5l8wh") pod "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" (UID: "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca"). InnerVolumeSpecName "kube-api-access-5l8wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.891509 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" (UID: "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.893304 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-scripts" (OuterVolumeSpecName: "scripts") pod "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" (UID: "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.922735 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" (UID: "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.946465 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data" (OuterVolumeSpecName: "config-data") pod "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" (UID: "f54cb1ce-6eb2-466a-901c-9a9d2b5256ca"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.988466 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-logs\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.988522 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.988540 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5l8wh\" (UniqueName: \"kubernetes.io/projected/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-kube-api-access-5l8wh\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.988553 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.988564 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.988601 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:20 crc kubenswrapper[4743]: I1122 09:57:20.988615 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.087720 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"d2fab6eb-6c8a-405d-9bb4-b393c1706e4b","Type":"ContainerStarted","Data":"bfe821c86e0821bfd7f4c2087a2758f0cabe92a910d64db9e15b768f4fe900bc"} Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.090151 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.090356 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f54cb1ce-6eb2-466a-901c-9a9d2b5256ca","Type":"ContainerDied","Data":"3bb2a9cc6ecb6da377e05296bfc4d27bf6f8b1fe246bc50f95517e7aa4bef2a1"} Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.090434 4743 scope.go:117] "RemoveContainer" containerID="6d7f8971e63edda8c64a98c975296a38dc72607c2037d84f94daa432eafbfc8f" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.120647 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.715134785 podStartE2EDuration="4.120630937s" podCreationTimestamp="2025-11-22 09:57:17 +0000 UTC" firstStartedPulling="2025-11-22 09:57:18.119897762 +0000 UTC m=+5711.826258814" lastFinishedPulling="2025-11-22 09:57:19.525393914 +0000 UTC m=+5713.231754966" observedRunningTime="2025-11-22 09:57:21.108378045 +0000 UTC m=+5714.814739097" watchObservedRunningTime="2025-11-22 09:57:21.120630937 +0000 UTC m=+5714.826991989" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.139233 4743 scope.go:117] "RemoveContainer" containerID="fd628be2e89ecbb90cfd902467d7fc2b03732c226593dcee6b465cafeaa721ec" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.145330 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.172226 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.172704 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:57:21 crc kubenswrapper[4743]: E1122 09:57:21.172962 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerName="cinder-api" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.172979 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerName="cinder-api" Nov 22 09:57:21 crc kubenswrapper[4743]: E1122 09:57:21.172990 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerName="cinder-api-log" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.172996 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerName="cinder-api-log" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.173164 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerName="cinder-api" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.173184 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" containerName="cinder-api-log" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.176342 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.178444 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.181403 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.293460 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-config-data\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.293531 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-scripts\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.293588 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-config-data-custom\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.293755 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dss6p\" (UniqueName: \"kubernetes.io/projected/d12c6f64-316e-4bd4-bdd3-5644106566a0-kube-api-access-dss6p\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.293863 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.294030 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12c6f64-316e-4bd4-bdd3-5644106566a0-logs\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.294087 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d12c6f64-316e-4bd4-bdd3-5644106566a0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.396447 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-scripts\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.396539 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-config-data-custom\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.396606 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dss6p\" (UniqueName: \"kubernetes.io/projected/d12c6f64-316e-4bd4-bdd3-5644106566a0-kube-api-access-dss6p\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.396646 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.396708 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12c6f64-316e-4bd4-bdd3-5644106566a0-logs\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.396738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d12c6f64-316e-4bd4-bdd3-5644106566a0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.396784 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-config-data\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.397069 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d12c6f64-316e-4bd4-bdd3-5644106566a0-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.397373 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d12c6f64-316e-4bd4-bdd3-5644106566a0-logs\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.400913 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-config-data-custom\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.401445 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-config-data\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.402421 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.402996 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d12c6f64-316e-4bd4-bdd3-5644106566a0-scripts\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.412201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dss6p\" (UniqueName: \"kubernetes.io/projected/d12c6f64-316e-4bd4-bdd3-5644106566a0-kube-api-access-dss6p\") pod \"cinder-api-0\" (UID: \"d12c6f64-316e-4bd4-bdd3-5644106566a0\") " pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.490916 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Nov 22 09:57:21 crc kubenswrapper[4743]: I1122 09:57:21.998318 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Nov 22 09:57:22 crc kubenswrapper[4743]: I1122 09:57:22.106251 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"49819854-1fda-4d24-b2fc-43443fb9c1ef","Type":"ContainerStarted","Data":"d7cdabb9ebce24bf7caba5ab22b75c9fb83f5c450502c9913b3847816a09a9dc"} Nov 22 09:57:22 crc kubenswrapper[4743]: I1122 09:57:22.106722 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"49819854-1fda-4d24-b2fc-43443fb9c1ef","Type":"ContainerStarted","Data":"874050c2a1ea26b8ee182be063750ce751087e5e78b252ba2fb1c593e4bae843"} Nov 22 09:57:22 crc kubenswrapper[4743]: I1122 09:57:22.116213 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d12c6f64-316e-4bd4-bdd3-5644106566a0","Type":"ContainerStarted","Data":"7dc8d1be7df93f7abd0a95f3e28ea1f834e8e0dede077fa0219d14e6dacc1560"} Nov 22 09:57:22 crc kubenswrapper[4743]: I1122 09:57:22.131570 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.041006339 podStartE2EDuration="5.131553302s" podCreationTimestamp="2025-11-22 09:57:17 +0000 UTC" firstStartedPulling="2025-11-22 09:57:18.654313047 +0000 UTC m=+5712.360674099" lastFinishedPulling="2025-11-22 09:57:20.74486001 +0000 UTC m=+5714.451221062" observedRunningTime="2025-11-22 09:57:22.130614525 +0000 UTC m=+5715.836975587" watchObservedRunningTime="2025-11-22 09:57:22.131553302 +0000 UTC m=+5715.837914354" Nov 22 09:57:22 crc kubenswrapper[4743]: I1122 09:57:22.512429 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:23 crc kubenswrapper[4743]: I1122 09:57:23.055247 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Nov 22 09:57:23 crc kubenswrapper[4743]: I1122 09:57:23.142873 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d12c6f64-316e-4bd4-bdd3-5644106566a0","Type":"ContainerStarted","Data":"1d421d3a7a0ea75ab6d696c698a90d6edaae004479e0875ecf0988b885ffd6ce"} Nov 22 09:57:23 crc kubenswrapper[4743]: I1122 09:57:23.166808 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54cb1ce-6eb2-466a-901c-9a9d2b5256ca" 
path="/var/lib/kubelet/pods/f54cb1ce-6eb2-466a-901c-9a9d2b5256ca/volumes" Nov 22 09:57:24 crc kubenswrapper[4743]: I1122 09:57:24.150801 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d12c6f64-316e-4bd4-bdd3-5644106566a0","Type":"ContainerStarted","Data":"6bfbfb7902e90c6ccafc26ea61236146ef2bbfa02369022cf02114ffa3b066ce"} Nov 22 09:57:24 crc kubenswrapper[4743]: I1122 09:57:24.151264 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Nov 22 09:57:24 crc kubenswrapper[4743]: I1122 09:57:24.177359 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.17733776 podStartE2EDuration="3.17733776s" podCreationTimestamp="2025-11-22 09:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:57:24.166533659 +0000 UTC m=+5717.872894711" watchObservedRunningTime="2025-11-22 09:57:24.17733776 +0000 UTC m=+5717.883698812" Nov 22 09:57:25 crc kubenswrapper[4743]: I1122 09:57:25.575805 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 09:57:25 crc kubenswrapper[4743]: I1122 09:57:25.633903 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:57:26 crc kubenswrapper[4743]: I1122 09:57:26.172893 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" containerName="cinder-scheduler" containerID="cri-o://dfc85cadbbd916b7c1aadb3efb8ad370f614798cd8fe55e684ddf0642b67538b" gracePeriod=30 Nov 22 09:57:26 crc kubenswrapper[4743]: I1122 09:57:26.172970 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" containerName="probe" containerID="cri-o://82d24f02b577255c5bde1a13bc50c12544aba7acf65f94aae2726367c1b40b96" gracePeriod=30 Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.185604 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"efd57402-07ca-47bd-9f61-8228d5b3ff4c","Type":"ContainerDied","Data":"82d24f02b577255c5bde1a13bc50c12544aba7acf65f94aae2726367c1b40b96"} Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.185656 4743 generic.go:334] "Generic (PLEG): container finished" podID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" containerID="82d24f02b577255c5bde1a13bc50c12544aba7acf65f94aae2726367c1b40b96" exitCode=0 Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.186108 4743 generic.go:334] "Generic (PLEG): container finished" podID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" containerID="dfc85cadbbd916b7c1aadb3efb8ad370f614798cd8fe55e684ddf0642b67538b" exitCode=0 Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.186130 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"efd57402-07ca-47bd-9f61-8228d5b3ff4c","Type":"ContainerDied","Data":"dfc85cadbbd916b7c1aadb3efb8ad370f614798cd8fe55e684ddf0642b67538b"} Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.186145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"efd57402-07ca-47bd-9f61-8228d5b3ff4c","Type":"ContainerDied","Data":"8ceb241a27d25be66d1432bb71311c93b95667dbd13b84d3c366126311b42595"} Nov 22 09:57:27 crc 
kubenswrapper[4743]: I1122 09:57:27.186156 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ceb241a27d25be66d1432bb71311c93b95667dbd13b84d3c366126311b42595" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.227541 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.345892 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-scripts\") pod \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.345937 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efd57402-07ca-47bd-9f61-8228d5b3ff4c-etc-machine-id\") pod \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.346152 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/efd57402-07ca-47bd-9f61-8228d5b3ff4c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "efd57402-07ca-47bd-9f61-8228d5b3ff4c" (UID: "efd57402-07ca-47bd-9f61-8228d5b3ff4c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.346183 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data-custom\") pod \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.346223 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-combined-ca-bundle\") pod \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.346253 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data\") pod \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.346275 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh65l\" (UniqueName: \"kubernetes.io/projected/efd57402-07ca-47bd-9f61-8228d5b3ff4c-kube-api-access-vh65l\") pod \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\" (UID: \"efd57402-07ca-47bd-9f61-8228d5b3ff4c\") " Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.346827 4743 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/efd57402-07ca-47bd-9f61-8228d5b3ff4c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.352834 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd57402-07ca-47bd-9f61-8228d5b3ff4c-kube-api-access-vh65l" (OuterVolumeSpecName: "kube-api-access-vh65l") pod "efd57402-07ca-47bd-9f61-8228d5b3ff4c" (UID: 
"efd57402-07ca-47bd-9f61-8228d5b3ff4c"). InnerVolumeSpecName "kube-api-access-vh65l". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.352831 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-scripts" (OuterVolumeSpecName: "scripts") pod "efd57402-07ca-47bd-9f61-8228d5b3ff4c" (UID: "efd57402-07ca-47bd-9f61-8228d5b3ff4c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.353175 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "efd57402-07ca-47bd-9f61-8228d5b3ff4c" (UID: "efd57402-07ca-47bd-9f61-8228d5b3ff4c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.403707 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efd57402-07ca-47bd-9f61-8228d5b3ff4c" (UID: "efd57402-07ca-47bd-9f61-8228d5b3ff4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.442814 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data" (OuterVolumeSpecName: "config-data") pod "efd57402-07ca-47bd-9f61-8228d5b3ff4c" (UID: "efd57402-07ca-47bd-9f61-8228d5b3ff4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.449163 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data-custom\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.449204 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.449219 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.449231 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh65l\" (UniqueName: \"kubernetes.io/projected/efd57402-07ca-47bd-9f61-8228d5b3ff4c-kube-api-access-vh65l\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.449245 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efd57402-07ca-47bd-9f61-8228d5b3ff4c-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:57:27 crc kubenswrapper[4743]: I1122 09:57:27.743197 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.196330 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.236731 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.245032 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.268490 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:57:28 crc kubenswrapper[4743]: E1122 09:57:28.268884 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" containerName="probe" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.268903 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" containerName="probe" Nov 22 09:57:28 crc kubenswrapper[4743]: E1122 09:57:28.268923 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" containerName="cinder-scheduler" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.268930 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" containerName="cinder-scheduler" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.269269 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" containerName="probe" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.269312 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" containerName="cinder-scheduler" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.270173 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.278673 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.286113 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.287127 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.470717 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.471461 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68lz6\" (UniqueName: \"kubernetes.io/projected/b1d2762c-ee40-42c3-84a9-5057136d2208-kube-api-access-68lz6\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.471954 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.472101 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.472230 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.472797 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d2762c-ee40-42c3-84a9-5057136d2208-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.573620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68lz6\" (UniqueName: \"kubernetes.io/projected/b1d2762c-ee40-42c3-84a9-5057136d2208-kube-api-access-68lz6\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.573670 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-combined-ca-bundle\") pod \"cinder-scheduler-0\" 
(UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.573720 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.573753 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.573779 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d2762c-ee40-42c3-84a9-5057136d2208-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.573806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.575671 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b1d2762c-ee40-42c3-84a9-5057136d2208-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.578791 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.580617 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-config-data\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.581025 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-scripts\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.585084 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b1d2762c-ee40-42c3-84a9-5057136d2208-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.593658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68lz6\" (UniqueName: 
\"kubernetes.io/projected/b1d2762c-ee40-42c3-84a9-5057136d2208-kube-api-access-68lz6\") pod \"cinder-scheduler-0\" (UID: \"b1d2762c-ee40-42c3-84a9-5057136d2208\") " pod="openstack/cinder-scheduler-0" Nov 22 09:57:28 crc kubenswrapper[4743]: I1122 09:57:28.891297 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Nov 22 09:57:29 crc kubenswrapper[4743]: I1122 09:57:29.162364 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efd57402-07ca-47bd-9f61-8228d5b3ff4c" path="/var/lib/kubelet/pods/efd57402-07ca-47bd-9f61-8228d5b3ff4c/volumes" Nov 22 09:57:29 crc kubenswrapper[4743]: I1122 09:57:29.318083 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Nov 22 09:57:30 crc kubenswrapper[4743]: I1122 09:57:30.215824 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d2762c-ee40-42c3-84a9-5057136d2208","Type":"ContainerStarted","Data":"e145742dd439148ae8df73aa73d4f8c3e5b361238c2fa52d157a23ac7d50f855"} Nov 22 09:57:30 crc kubenswrapper[4743]: I1122 09:57:30.216621 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d2762c-ee40-42c3-84a9-5057136d2208","Type":"ContainerStarted","Data":"f5787c1cb7ac31f020577389ff8cb869a9d0bb1e9740c839b3b9f24b9e92fad7"} Nov 22 09:57:31 crc kubenswrapper[4743]: I1122 09:57:31.227813 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b1d2762c-ee40-42c3-84a9-5057136d2208","Type":"ContainerStarted","Data":"3365fc03e7848dc28b37cba702c304d9c114c4c6efdeeb9e41cef31d7ae8deed"} Nov 22 09:57:31 crc kubenswrapper[4743]: I1122 09:57:31.247505 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:57:31 crc kubenswrapper[4743]: I1122 09:57:31.247637 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:57:33 crc kubenswrapper[4743]: I1122 09:57:33.386698 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Nov 22 09:57:33 crc kubenswrapper[4743]: I1122 09:57:33.425940 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.425915963 podStartE2EDuration="5.425915963s" podCreationTimestamp="2025-11-22 09:57:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:57:31.259694374 +0000 UTC m=+5724.966055426" watchObservedRunningTime="2025-11-22 09:57:33.425915963 +0000 UTC m=+5727.132277015" Nov 22 09:57:33 crc kubenswrapper[4743]: I1122 09:57:33.891799 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Nov 22 09:57:39 crc kubenswrapper[4743]: I1122 09:57:39.083098 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Nov 22 09:58:01 crc kubenswrapper[4743]: 
I1122 09:58:01.241308 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:58:01 crc kubenswrapper[4743]: I1122 09:58:01.242027 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:58:31 crc kubenswrapper[4743]: I1122 09:58:31.241437 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 09:58:31 crc kubenswrapper[4743]: I1122 09:58:31.242180 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 09:58:31 crc kubenswrapper[4743]: I1122 09:58:31.242248 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 09:58:31 crc kubenswrapper[4743]: I1122 09:58:31.243073 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 09:58:31 crc kubenswrapper[4743]: I1122 09:58:31.243145 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" gracePeriod=600 Nov 22 09:58:31 crc kubenswrapper[4743]: E1122 09:58:31.365967 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:58:31 crc kubenswrapper[4743]: I1122 09:58:31.788443 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" exitCode=0 Nov 22 09:58:31 crc kubenswrapper[4743]: I1122 09:58:31.788507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" 
event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68"} Nov 22 09:58:31 crc kubenswrapper[4743]: I1122 09:58:31.788647 4743 scope.go:117] "RemoveContainer" containerID="3eb3f31edfdd1f055cdfe5d03bc7d720df5251d68525b808fed42d018024ae04" Nov 22 09:58:31 crc kubenswrapper[4743]: I1122 09:58:31.789433 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 09:58:31 crc kubenswrapper[4743]: E1122 09:58:31.789894 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:58:47 crc kubenswrapper[4743]: I1122 09:58:47.166754 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 09:58:47 crc kubenswrapper[4743]: E1122 09:58:47.167427 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:58:58 crc kubenswrapper[4743]: I1122 09:58:58.089369 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2ac5-account-create-9fhb7"] Nov 22 09:58:58 crc kubenswrapper[4743]: I1122 09:58:58.105199 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2ac5-account-create-9fhb7"] Nov 22 09:58:58 crc kubenswrapper[4743]: I1122 09:58:58.114956 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-796kq"] Nov 22 09:58:58 crc kubenswrapper[4743]: I1122 09:58:58.124828 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-796kq"] Nov 22 09:58:58 crc kubenswrapper[4743]: I1122 09:58:58.153360 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 09:58:58 crc kubenswrapper[4743]: E1122 09:58:58.153888 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:58:59 crc kubenswrapper[4743]: I1122 09:58:59.165819 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="329ae2e9-ea30-483f-ab8d-c35659e1fc6d" path="/var/lib/kubelet/pods/329ae2e9-ea30-483f-ab8d-c35659e1fc6d/volumes" Nov 22 09:58:59 crc kubenswrapper[4743]: I1122 09:58:59.167772 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff060041-1da3-4a68-8ec6-87d2e80387d1" path="/var/lib/kubelet/pods/ff060041-1da3-4a68-8ec6-87d2e80387d1/volumes" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 
09:59:02.154561 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g66xn"] Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.158974 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.180399 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g66xn"] Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.244430 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-catalog-content\") pod \"redhat-marketplace-g66xn\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.244512 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-utilities\") pod \"redhat-marketplace-g66xn\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.244808 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slxsc\" (UniqueName: \"kubernetes.io/projected/e194e95d-2240-4b6f-880e-799dc7b059f6-kube-api-access-slxsc\") pod \"redhat-marketplace-g66xn\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.346045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slxsc\" (UniqueName: \"kubernetes.io/projected/e194e95d-2240-4b6f-880e-799dc7b059f6-kube-api-access-slxsc\") pod \"redhat-marketplace-g66xn\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.346304 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-catalog-content\") pod \"redhat-marketplace-g66xn\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.346418 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-utilities\") pod \"redhat-marketplace-g66xn\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.346940 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-utilities\") pod \"redhat-marketplace-g66xn\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.347049 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-catalog-content\") pod 
\"redhat-marketplace-g66xn\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.368813 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slxsc\" (UniqueName: \"kubernetes.io/projected/e194e95d-2240-4b6f-880e-799dc7b059f6-kube-api-access-slxsc\") pod \"redhat-marketplace-g66xn\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.484376 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:02 crc kubenswrapper[4743]: I1122 09:59:02.970512 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g66xn"] Nov 22 09:59:02 crc kubenswrapper[4743]: W1122 09:59:02.974102 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode194e95d_2240_4b6f_880e_799dc7b059f6.slice/crio-70a9a481ee5658e27d80032b4d831598034b42ecb3f34f9ceb3bfbdb502bb807 WatchSource:0}: Error finding container 70a9a481ee5658e27d80032b4d831598034b42ecb3f34f9ceb3bfbdb502bb807: Status 404 returned error can't find the container with id 70a9a481ee5658e27d80032b4d831598034b42ecb3f34f9ceb3bfbdb502bb807 Nov 22 09:59:03 crc kubenswrapper[4743]: I1122 09:59:03.167116 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g66xn" event={"ID":"e194e95d-2240-4b6f-880e-799dc7b059f6","Type":"ContainerStarted","Data":"70a9a481ee5658e27d80032b4d831598034b42ecb3f34f9ceb3bfbdb502bb807"} Nov 22 09:59:04 crc kubenswrapper[4743]: I1122 09:59:04.180514 4743 generic.go:334] "Generic (PLEG): container finished" podID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerID="161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c" exitCode=0 Nov 22 09:59:04 crc kubenswrapper[4743]: I1122 09:59:04.180595 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g66xn" event={"ID":"e194e95d-2240-4b6f-880e-799dc7b059f6","Type":"ContainerDied","Data":"161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c"} Nov 22 09:59:07 crc kubenswrapper[4743]: I1122 09:59:07.209018 4743 generic.go:334] "Generic (PLEG): container finished" podID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerID="67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc" exitCode=0 Nov 22 09:59:07 crc kubenswrapper[4743]: I1122 09:59:07.209254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g66xn" event={"ID":"e194e95d-2240-4b6f-880e-799dc7b059f6","Type":"ContainerDied","Data":"67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc"} Nov 22 09:59:09 crc kubenswrapper[4743]: I1122 09:59:09.032540 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-zcjpk"] Nov 22 09:59:09 crc kubenswrapper[4743]: I1122 09:59:09.040612 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-zcjpk"] Nov 22 09:59:09 crc kubenswrapper[4743]: I1122 09:59:09.172754 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5267038d-28f8-49a0-b908-677a4f11e493" path="/var/lib/kubelet/pods/5267038d-28f8-49a0-b908-677a4f11e493/volumes" Nov 22 09:59:09 crc kubenswrapper[4743]: I1122 09:59:09.231569 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g66xn" event={"ID":"e194e95d-2240-4b6f-880e-799dc7b059f6","Type":"ContainerStarted","Data":"ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b"} Nov 22 09:59:11 crc kubenswrapper[4743]: I1122 09:59:11.165557 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 09:59:11 crc kubenswrapper[4743]: E1122 09:59:11.166573 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:59:12 crc kubenswrapper[4743]: I1122 09:59:12.484744 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:12 crc kubenswrapper[4743]: I1122 09:59:12.485049 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:12 crc kubenswrapper[4743]: I1122 09:59:12.546477 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:12 crc kubenswrapper[4743]: I1122 09:59:12.564852 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g66xn" podStartSLOduration=6.306620355 podStartE2EDuration="10.564833979s" podCreationTimestamp="2025-11-22 09:59:02 +0000 UTC" firstStartedPulling="2025-11-22 09:59:04.185112228 +0000 UTC m=+5817.891473280" lastFinishedPulling="2025-11-22 09:59:08.443325852 +0000 UTC m=+5822.149686904" observedRunningTime="2025-11-22 09:59:09.254325683 +0000 UTC m=+5822.960686745" watchObservedRunningTime="2025-11-22 09:59:12.564833979 +0000 UTC m=+5826.271195031" Nov 22 09:59:13 crc kubenswrapper[4743]: I1122 09:59:13.318110 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:13 crc kubenswrapper[4743]: I1122 09:59:13.367815 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g66xn"] Nov 22 09:59:15 crc kubenswrapper[4743]: I1122 09:59:15.290327 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g66xn" podUID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerName="registry-server" containerID="cri-o://ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b" gracePeriod=2 Nov 22 09:59:15 crc kubenswrapper[4743]: I1122 09:59:15.850226 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:15 crc kubenswrapper[4743]: I1122 09:59:15.957719 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-catalog-content\") pod \"e194e95d-2240-4b6f-880e-799dc7b059f6\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " Nov 22 09:59:15 crc kubenswrapper[4743]: I1122 09:59:15.957803 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-utilities\") pod \"e194e95d-2240-4b6f-880e-799dc7b059f6\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " Nov 22 09:59:15 crc kubenswrapper[4743]: I1122 09:59:15.957928 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slxsc\" (UniqueName: \"kubernetes.io/projected/e194e95d-2240-4b6f-880e-799dc7b059f6-kube-api-access-slxsc\") pod \"e194e95d-2240-4b6f-880e-799dc7b059f6\" (UID: \"e194e95d-2240-4b6f-880e-799dc7b059f6\") " Nov 22 09:59:15 crc kubenswrapper[4743]: I1122 09:59:15.959667 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-utilities" (OuterVolumeSpecName: "utilities") pod "e194e95d-2240-4b6f-880e-799dc7b059f6" (UID: "e194e95d-2240-4b6f-880e-799dc7b059f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:59:15 crc kubenswrapper[4743]: I1122 09:59:15.985073 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e194e95d-2240-4b6f-880e-799dc7b059f6-kube-api-access-slxsc" (OuterVolumeSpecName: "kube-api-access-slxsc") pod "e194e95d-2240-4b6f-880e-799dc7b059f6" (UID: "e194e95d-2240-4b6f-880e-799dc7b059f6"). InnerVolumeSpecName "kube-api-access-slxsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.018398 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e194e95d-2240-4b6f-880e-799dc7b059f6" (UID: "e194e95d-2240-4b6f-880e-799dc7b059f6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.059488 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.059527 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e194e95d-2240-4b6f-880e-799dc7b059f6-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.059538 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slxsc\" (UniqueName: \"kubernetes.io/projected/e194e95d-2240-4b6f-880e-799dc7b059f6-kube-api-access-slxsc\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.304194 4743 generic.go:334] "Generic (PLEG): container finished" podID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerID="ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b" exitCode=0 Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.304241 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g66xn" event={"ID":"e194e95d-2240-4b6f-880e-799dc7b059f6","Type":"ContainerDied","Data":"ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b"} Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.304282 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g66xn" event={"ID":"e194e95d-2240-4b6f-880e-799dc7b059f6","Type":"ContainerDied","Data":"70a9a481ee5658e27d80032b4d831598034b42ecb3f34f9ceb3bfbdb502bb807"} Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.304290 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g66xn" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.304304 4743 scope.go:117] "RemoveContainer" containerID="ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.328377 4743 scope.go:117] "RemoveContainer" containerID="67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.345806 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g66xn"] Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.353935 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g66xn"] Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.377766 4743 scope.go:117] "RemoveContainer" containerID="161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.394507 4743 scope.go:117] "RemoveContainer" containerID="ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b" Nov 22 09:59:16 crc kubenswrapper[4743]: E1122 09:59:16.395000 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b\": container with ID starting with ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b not found: ID does not exist" containerID="ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.395045 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b"} err="failed to get container status \"ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b\": rpc error: code = NotFound desc = could not find container \"ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b\": container with ID starting with ffd566123d937b1d42c749d75b63a9ed7487f78f000c6a621330b1e781002e3b not found: ID does not exist" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.395072 4743 scope.go:117] "RemoveContainer" containerID="67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc" Nov 22 09:59:16 crc kubenswrapper[4743]: E1122 09:59:16.395344 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc\": container with ID starting with 67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc not found: ID does not exist" containerID="67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.395375 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc"} err="failed to get container status \"67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc\": rpc error: code = NotFound desc = could not find container \"67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc\": container with ID starting with 67d407778b983926d7d82f8ca1a7138f7cbc915a14f041d1026ee0186a9191cc not found: ID does not exist" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.395403 4743 scope.go:117] "RemoveContainer" 
containerID="161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c" Nov 22 09:59:16 crc kubenswrapper[4743]: E1122 09:59:16.395799 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c\": container with ID starting with 161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c not found: ID does not exist" containerID="161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c" Nov 22 09:59:16 crc kubenswrapper[4743]: I1122 09:59:16.395820 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c"} err="failed to get container status \"161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c\": rpc error: code = NotFound desc = could not find container \"161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c\": container with ID starting with 161f53205dd1f419db5bcc6465b15625de7be986d3f03c4eea6a0f9d7c22c23c not found: ID does not exist" Nov 22 09:59:17 crc kubenswrapper[4743]: I1122 09:59:17.167316 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e194e95d-2240-4b6f-880e-799dc7b059f6" path="/var/lib/kubelet/pods/e194e95d-2240-4b6f-880e-799dc7b059f6/volumes" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.257380 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lzbkj"] Nov 22 09:59:20 crc kubenswrapper[4743]: E1122 09:59:20.258283 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerName="registry-server" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.258302 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerName="registry-server" Nov 22 09:59:20 crc kubenswrapper[4743]: E1122 09:59:20.258316 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerName="extract-utilities" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.258322 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerName="extract-utilities" Nov 22 09:59:20 crc kubenswrapper[4743]: E1122 09:59:20.258348 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerName="extract-content" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.258355 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerName="extract-content" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.258545 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e194e95d-2240-4b6f-880e-799dc7b059f6" containerName="registry-server" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.259391 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.261917 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-7v5wr" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.261969 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.281445 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5jq6t"] Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.283638 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.293721 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lzbkj"] Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.306860 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5jq6t"] Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.439782 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-var-log\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.439829 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt72q\" (UniqueName: \"kubernetes.io/projected/005d696c-bc18-45cc-bcd9-8d22455874e7-kube-api-access-kt72q\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.439958 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-scripts\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.439984 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-var-lib\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.440011 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-var-run-ovn\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.440033 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dtkr\" (UniqueName: \"kubernetes.io/projected/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-kube-api-access-2dtkr\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.440069 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-etc-ovs\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.440085 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-var-run\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.440102 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-var-log-ovn\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.440116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005d696c-bc18-45cc-bcd9-8d22455874e7-scripts\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.440196 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-var-run\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.541965 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-scripts\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542029 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-var-lib\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-var-run-ovn\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542080 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dtkr\" (UniqueName: \"kubernetes.io/projected/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-kube-api-access-2dtkr\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542101 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-etc-ovs\") pod 
\"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542121 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-var-run\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-var-log-ovn\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542158 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005d696c-bc18-45cc-bcd9-8d22455874e7-scripts\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542181 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-var-run\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542249 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-var-log\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542264 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt72q\" (UniqueName: \"kubernetes.io/projected/005d696c-bc18-45cc-bcd9-8d22455874e7-kube-api-access-kt72q\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542400 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-var-lib\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-var-run\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542517 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-var-run-ovn\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-var-log-ovn\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542859 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-etc-ovs\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542924 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-var-run\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.542990 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/005d696c-bc18-45cc-bcd9-8d22455874e7-var-log\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.543986 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-scripts\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.544428 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/005d696c-bc18-45cc-bcd9-8d22455874e7-scripts\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.567224 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dtkr\" (UniqueName: \"kubernetes.io/projected/5bb3a009-b9ed-4054-ac5f-c7bd866f9634-kube-api-access-2dtkr\") pod \"ovn-controller-lzbkj\" (UID: \"5bb3a009-b9ed-4054-ac5f-c7bd866f9634\") " pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.571118 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt72q\" (UniqueName: \"kubernetes.io/projected/005d696c-bc18-45cc-bcd9-8d22455874e7-kube-api-access-kt72q\") pod \"ovn-controller-ovs-5jq6t\" (UID: \"005d696c-bc18-45cc-bcd9-8d22455874e7\") " pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.582085 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:20 crc kubenswrapper[4743]: I1122 09:59:20.621786 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.077509 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lzbkj"] Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.351775 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzbkj" event={"ID":"5bb3a009-b9ed-4054-ac5f-c7bd866f9634","Type":"ContainerStarted","Data":"2610676da7797b91a3d1078c68620622e14f261f2dd7ddd82ab3a9920d0576e3"} Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.482940 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5jq6t"] Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.595981 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-l4tln"] Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.597658 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.600548 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.605757 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l4tln"] Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.764613 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/580be048-ba5a-4927-bd45-28d898c01ca1-ovs-rundir\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.764929 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9phhr\" (UniqueName: \"kubernetes.io/projected/580be048-ba5a-4927-bd45-28d898c01ca1-kube-api-access-9phhr\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.764972 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/580be048-ba5a-4927-bd45-28d898c01ca1-ovn-rundir\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.765006 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/580be048-ba5a-4927-bd45-28d898c01ca1-config\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.866824 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/580be048-ba5a-4927-bd45-28d898c01ca1-ovs-rundir\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.867185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/580be048-ba5a-4927-bd45-28d898c01ca1-ovs-rundir\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.867274 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9phhr\" (UniqueName: \"kubernetes.io/projected/580be048-ba5a-4927-bd45-28d898c01ca1-kube-api-access-9phhr\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.867311 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/580be048-ba5a-4927-bd45-28d898c01ca1-ovn-rundir\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.867341 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/580be048-ba5a-4927-bd45-28d898c01ca1-config\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.867510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/580be048-ba5a-4927-bd45-28d898c01ca1-ovn-rundir\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.868276 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/580be048-ba5a-4927-bd45-28d898c01ca1-config\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.888962 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9phhr\" (UniqueName: \"kubernetes.io/projected/580be048-ba5a-4927-bd45-28d898c01ca1-kube-api-access-9phhr\") pod \"ovn-controller-metrics-l4tln\" (UID: \"580be048-ba5a-4927-bd45-28d898c01ca1\") " pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:21 crc kubenswrapper[4743]: I1122 09:59:21.939401 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-l4tln" Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.039793 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vqf7n"] Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.049456 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vqf7n"] Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.363355 4743 generic.go:334] "Generic (PLEG): container finished" podID="005d696c-bc18-45cc-bcd9-8d22455874e7" containerID="f622661d51342d087fc0301e813177efb2ed842cab4bf00078a1ee3823c0f6b7" exitCode=0 Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.363810 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5jq6t" event={"ID":"005d696c-bc18-45cc-bcd9-8d22455874e7","Type":"ContainerDied","Data":"f622661d51342d087fc0301e813177efb2ed842cab4bf00078a1ee3823c0f6b7"} Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.363841 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5jq6t" event={"ID":"005d696c-bc18-45cc-bcd9-8d22455874e7","Type":"ContainerStarted","Data":"e6230ab5f84eecd0cf41ce95069ec4de1fa958845a4ea5de49e3f5ca2b3c9d7e"} Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.368376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzbkj" event={"ID":"5bb3a009-b9ed-4054-ac5f-c7bd866f9634","Type":"ContainerStarted","Data":"cc595693fda99076716caa92af7cbf30812d41056d970a028935d9c6c3b3800e"} Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.369221 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lzbkj" Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.381265 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-nm4ks"] Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.382927 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-nm4ks" Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.388509 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-nm4ks"] Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.478601 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfdnv\" (UniqueName: \"kubernetes.io/projected/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-kube-api-access-rfdnv\") pod \"octavia-db-create-nm4ks\" (UID: \"0e2d3a4e-8b93-40b6-80fd-79dc6c707264\") " pod="openstack/octavia-db-create-nm4ks" Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.478937 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-operator-scripts\") pod \"octavia-db-create-nm4ks\" (UID: \"0e2d3a4e-8b93-40b6-80fd-79dc6c707264\") " pod="openstack/octavia-db-create-nm4ks" Nov 22 09:59:22 crc kubenswrapper[4743]: W1122 09:59:22.524912 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod580be048_ba5a_4927_bd45_28d898c01ca1.slice/crio-73c25b3f6b4b7f50b4b15f1d7ffaae2d2075fd42eaaccc5cb408303604d003bf WatchSource:0}: Error finding container 73c25b3f6b4b7f50b4b15f1d7ffaae2d2075fd42eaaccc5cb408303604d003bf: Status 404 returned error can't find the container with id 73c25b3f6b4b7f50b4b15f1d7ffaae2d2075fd42eaaccc5cb408303604d003bf Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.539176 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l4tln"] Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.540519 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lzbkj" podStartSLOduration=2.5404990720000002 podStartE2EDuration="2.540499072s" podCreationTimestamp="2025-11-22 09:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:59:22.509688276 +0000 UTC m=+5836.216049328" watchObservedRunningTime="2025-11-22 09:59:22.540499072 +0000 UTC m=+5836.246860134" Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.581773 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfdnv\" (UniqueName: \"kubernetes.io/projected/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-kube-api-access-rfdnv\") pod \"octavia-db-create-nm4ks\" (UID: \"0e2d3a4e-8b93-40b6-80fd-79dc6c707264\") " pod="openstack/octavia-db-create-nm4ks" Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.581828 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-operator-scripts\") pod \"octavia-db-create-nm4ks\" (UID: \"0e2d3a4e-8b93-40b6-80fd-79dc6c707264\") " pod="openstack/octavia-db-create-nm4ks" Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.582547 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-operator-scripts\") pod \"octavia-db-create-nm4ks\" (UID: \"0e2d3a4e-8b93-40b6-80fd-79dc6c707264\") " pod="openstack/octavia-db-create-nm4ks" Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.606627 4743 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfdnv\" (UniqueName: \"kubernetes.io/projected/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-kube-api-access-rfdnv\") pod \"octavia-db-create-nm4ks\" (UID: \"0e2d3a4e-8b93-40b6-80fd-79dc6c707264\") " pod="openstack/octavia-db-create-nm4ks" Nov 22 09:59:22 crc kubenswrapper[4743]: I1122 09:59:22.779141 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-nm4ks" Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.166348 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="478564b9-9f44-489e-bacd-c9bb128f8e28" path="/var/lib/kubelet/pods/478564b9-9f44-489e-bacd-c9bb128f8e28/volumes" Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.308043 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-nm4ks"] Nov 22 09:59:23 crc kubenswrapper[4743]: W1122 09:59:23.315787 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e2d3a4e_8b93_40b6_80fd_79dc6c707264.slice/crio-ca718f59c3a6e0eb588a06a0a49043ed0b47fb69616eb6ff78a96779df06d955 WatchSource:0}: Error finding container ca718f59c3a6e0eb588a06a0a49043ed0b47fb69616eb6ff78a96779df06d955: Status 404 returned error can't find the container with id ca718f59c3a6e0eb588a06a0a49043ed0b47fb69616eb6ff78a96779df06d955 Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.379262 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-nm4ks" event={"ID":"0e2d3a4e-8b93-40b6-80fd-79dc6c707264","Type":"ContainerStarted","Data":"ca718f59c3a6e0eb588a06a0a49043ed0b47fb69616eb6ff78a96779df06d955"} Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.381531 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l4tln" event={"ID":"580be048-ba5a-4927-bd45-28d898c01ca1","Type":"ContainerStarted","Data":"b6784f8ff2998c587d68f9d731397a913e98ca1c4805488ae0cf2df2a46542b7"} Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.381558 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l4tln" event={"ID":"580be048-ba5a-4927-bd45-28d898c01ca1","Type":"ContainerStarted","Data":"73c25b3f6b4b7f50b4b15f1d7ffaae2d2075fd42eaaccc5cb408303604d003bf"} Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.385724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5jq6t" event={"ID":"005d696c-bc18-45cc-bcd9-8d22455874e7","Type":"ContainerStarted","Data":"415e5b6a7b6ee09c75de5be5d32a0f115c3f4d55e9db50b11457748a4de893a1"} Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.385764 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5jq6t" event={"ID":"005d696c-bc18-45cc-bcd9-8d22455874e7","Type":"ContainerStarted","Data":"35f1478ec514611f48d1cc9dd08152038af625bec4f820c1ea8393eb03d1c187"} Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.385943 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.408102 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-l4tln" podStartSLOduration=2.4080827879999998 podStartE2EDuration="2.408082788s" podCreationTimestamp="2025-11-22 09:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:59:23.401153139 +0000 UTC m=+5837.107514201" watchObservedRunningTime="2025-11-22 09:59:23.408082788 +0000 UTC m=+5837.114443830" Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.431766 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5jq6t" podStartSLOduration=3.431749269 podStartE2EDuration="3.431749269s" podCreationTimestamp="2025-11-22 09:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:59:23.424579632 +0000 UTC m=+5837.130940684" watchObservedRunningTime="2025-11-22 09:59:23.431749269 +0000 UTC m=+5837.138110321" Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.749565 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-7645-account-create-d5rhk"] Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.750758 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-7645-account-create-d5rhk" Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.753082 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.759678 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-7645-account-create-d5rhk"] Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.903430 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsv9w\" (UniqueName: \"kubernetes.io/projected/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-kube-api-access-hsv9w\") pod \"octavia-7645-account-create-d5rhk\" (UID: \"8fd0777e-d097-4e1a-b0ca-953d57f46f0f\") " pod="openstack/octavia-7645-account-create-d5rhk" Nov 22 09:59:23 crc kubenswrapper[4743]: I1122 09:59:23.903559 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-operator-scripts\") pod \"octavia-7645-account-create-d5rhk\" (UID: \"8fd0777e-d097-4e1a-b0ca-953d57f46f0f\") " pod="openstack/octavia-7645-account-create-d5rhk" Nov 22 09:59:24 crc kubenswrapper[4743]: I1122 09:59:24.005676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-operator-scripts\") pod \"octavia-7645-account-create-d5rhk\" (UID: \"8fd0777e-d097-4e1a-b0ca-953d57f46f0f\") " pod="openstack/octavia-7645-account-create-d5rhk" Nov 22 09:59:24 crc kubenswrapper[4743]: I1122 09:59:24.006183 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsv9w\" (UniqueName: \"kubernetes.io/projected/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-kube-api-access-hsv9w\") pod \"octavia-7645-account-create-d5rhk\" (UID: \"8fd0777e-d097-4e1a-b0ca-953d57f46f0f\") " pod="openstack/octavia-7645-account-create-d5rhk" Nov 22 09:59:24 crc kubenswrapper[4743]: I1122 09:59:24.006729 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-operator-scripts\") pod \"octavia-7645-account-create-d5rhk\" (UID: \"8fd0777e-d097-4e1a-b0ca-953d57f46f0f\") " pod="openstack/octavia-7645-account-create-d5rhk" Nov 22 09:59:24 crc 
Nov 22 09:59:24 crc kubenswrapper[4743]: I1122 09:59:24.033528 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsv9w\" (UniqueName: \"kubernetes.io/projected/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-kube-api-access-hsv9w\") pod \"octavia-7645-account-create-d5rhk\" (UID: \"8fd0777e-d097-4e1a-b0ca-953d57f46f0f\") " pod="openstack/octavia-7645-account-create-d5rhk" Nov 22 09:59:24 crc kubenswrapper[4743]: I1122 09:59:24.078522 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-7645-account-create-d5rhk" Nov 22 09:59:24 crc kubenswrapper[4743]: I1122 09:59:24.396848 4743 generic.go:334] "Generic (PLEG): container finished" podID="0e2d3a4e-8b93-40b6-80fd-79dc6c707264" containerID="a395112858fa8c07826ad2df2a9b95961ec3008005cc1ec208b942e9be38365e" exitCode=0 Nov 22 09:59:24 crc kubenswrapper[4743]: I1122 09:59:24.397011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-nm4ks" event={"ID":"0e2d3a4e-8b93-40b6-80fd-79dc6c707264","Type":"ContainerDied","Data":"a395112858fa8c07826ad2df2a9b95961ec3008005cc1ec208b942e9be38365e"} Nov 22 09:59:24 crc kubenswrapper[4743]: I1122 09:59:24.398556 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:24 crc kubenswrapper[4743]: I1122 09:59:24.548030 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-7645-account-create-d5rhk"] Nov 22 09:59:25 crc kubenswrapper[4743]: I1122 09:59:25.412101 4743 generic.go:334] "Generic (PLEG): container finished" podID="8fd0777e-d097-4e1a-b0ca-953d57f46f0f" containerID="0db14e04cac86c24013d603638b4f73e7a9c7cc262e9e6d3384046a2491e5a77" exitCode=0 Nov 22 09:59:25 crc kubenswrapper[4743]: I1122 09:59:25.412182 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-7645-account-create-d5rhk" event={"ID":"8fd0777e-d097-4e1a-b0ca-953d57f46f0f","Type":"ContainerDied","Data":"0db14e04cac86c24013d603638b4f73e7a9c7cc262e9e6d3384046a2491e5a77"} Nov 22 09:59:25 crc kubenswrapper[4743]: I1122 09:59:25.412269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-7645-account-create-d5rhk" event={"ID":"8fd0777e-d097-4e1a-b0ca-953d57f46f0f","Type":"ContainerStarted","Data":"dda2250659cbc7a54bd803c8b8d79a72c8065a05eb4de1abbec224c8b743550f"} Nov 22 09:59:25 crc kubenswrapper[4743]: I1122 09:59:25.759013 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-nm4ks" Nov 22 09:59:25 crc kubenswrapper[4743]: I1122 09:59:25.854232 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-operator-scripts\") pod \"0e2d3a4e-8b93-40b6-80fd-79dc6c707264\" (UID: \"0e2d3a4e-8b93-40b6-80fd-79dc6c707264\") " Nov 22 09:59:25 crc kubenswrapper[4743]: I1122 09:59:25.855123 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e2d3a4e-8b93-40b6-80fd-79dc6c707264" (UID: "0e2d3a4e-8b93-40b6-80fd-79dc6c707264"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:59:25 crc kubenswrapper[4743]: I1122 09:59:25.956719 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfdnv\" (UniqueName: \"kubernetes.io/projected/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-kube-api-access-rfdnv\") pod \"0e2d3a4e-8b93-40b6-80fd-79dc6c707264\" (UID: \"0e2d3a4e-8b93-40b6-80fd-79dc6c707264\") " Nov 22 09:59:25 crc kubenswrapper[4743]: I1122 09:59:25.957145 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:25 crc kubenswrapper[4743]: I1122 09:59:25.962091 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-kube-api-access-rfdnv" (OuterVolumeSpecName: "kube-api-access-rfdnv") pod "0e2d3a4e-8b93-40b6-80fd-79dc6c707264" (UID: "0e2d3a4e-8b93-40b6-80fd-79dc6c707264"). InnerVolumeSpecName "kube-api-access-rfdnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:59:26 crc kubenswrapper[4743]: I1122 09:59:26.058328 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfdnv\" (UniqueName: \"kubernetes.io/projected/0e2d3a4e-8b93-40b6-80fd-79dc6c707264-kube-api-access-rfdnv\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:26 crc kubenswrapper[4743]: I1122 09:59:26.151580 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 09:59:26 crc kubenswrapper[4743]: E1122 09:59:26.152081 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:59:26 crc kubenswrapper[4743]: I1122 09:59:26.423240 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-nm4ks" event={"ID":"0e2d3a4e-8b93-40b6-80fd-79dc6c707264","Type":"ContainerDied","Data":"ca718f59c3a6e0eb588a06a0a49043ed0b47fb69616eb6ff78a96779df06d955"} Nov 22 09:59:26 crc kubenswrapper[4743]: I1122 09:59:26.423290 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca718f59c3a6e0eb588a06a0a49043ed0b47fb69616eb6ff78a96779df06d955" Nov 22 09:59:26 crc kubenswrapper[4743]: I1122 09:59:26.423299 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-nm4ks" Nov 22 09:59:26 crc kubenswrapper[4743]: I1122 09:59:26.774438 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-7645-account-create-d5rhk" Nov 22 09:59:26 crc kubenswrapper[4743]: I1122 09:59:26.974466 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-operator-scripts\") pod \"8fd0777e-d097-4e1a-b0ca-953d57f46f0f\" (UID: \"8fd0777e-d097-4e1a-b0ca-953d57f46f0f\") " Nov 22 09:59:26 crc kubenswrapper[4743]: I1122 09:59:26.974789 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsv9w\" (UniqueName: \"kubernetes.io/projected/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-kube-api-access-hsv9w\") pod \"8fd0777e-d097-4e1a-b0ca-953d57f46f0f\" (UID: \"8fd0777e-d097-4e1a-b0ca-953d57f46f0f\") " Nov 22 09:59:26 crc kubenswrapper[4743]: I1122 09:59:26.975454 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fd0777e-d097-4e1a-b0ca-953d57f46f0f" (UID: "8fd0777e-d097-4e1a-b0ca-953d57f46f0f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:59:26 crc kubenswrapper[4743]: I1122 09:59:26.980053 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-kube-api-access-hsv9w" (OuterVolumeSpecName: "kube-api-access-hsv9w") pod "8fd0777e-d097-4e1a-b0ca-953d57f46f0f" (UID: "8fd0777e-d097-4e1a-b0ca-953d57f46f0f"). InnerVolumeSpecName "kube-api-access-hsv9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:59:27 crc kubenswrapper[4743]: I1122 09:59:27.077179 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsv9w\" (UniqueName: \"kubernetes.io/projected/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-kube-api-access-hsv9w\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:27 crc kubenswrapper[4743]: I1122 09:59:27.077226 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fd0777e-d097-4e1a-b0ca-953d57f46f0f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:27 crc kubenswrapper[4743]: I1122 09:59:27.434090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-7645-account-create-d5rhk" event={"ID":"8fd0777e-d097-4e1a-b0ca-953d57f46f0f","Type":"ContainerDied","Data":"dda2250659cbc7a54bd803c8b8d79a72c8065a05eb4de1abbec224c8b743550f"} Nov 22 09:59:27 crc kubenswrapper[4743]: I1122 09:59:27.434450 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda2250659cbc7a54bd803c8b8d79a72c8065a05eb4de1abbec224c8b743550f" Nov 22 09:59:27 crc kubenswrapper[4743]: I1122 09:59:27.434138 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-7645-account-create-d5rhk" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.361003 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-dc4b9"] Nov 22 09:59:29 crc kubenswrapper[4743]: E1122 09:59:29.361729 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd0777e-d097-4e1a-b0ca-953d57f46f0f" containerName="mariadb-account-create" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.361740 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd0777e-d097-4e1a-b0ca-953d57f46f0f" containerName="mariadb-account-create" Nov 22 09:59:29 crc kubenswrapper[4743]: E1122 09:59:29.361758 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e2d3a4e-8b93-40b6-80fd-79dc6c707264" containerName="mariadb-database-create" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.361764 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e2d3a4e-8b93-40b6-80fd-79dc6c707264" containerName="mariadb-database-create" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.361939 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd0777e-d097-4e1a-b0ca-953d57f46f0f" containerName="mariadb-account-create" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.361967 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e2d3a4e-8b93-40b6-80fd-79dc6c707264" containerName="mariadb-database-create" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.362600 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-dc4b9" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.374165 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-dc4b9"] Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.531808 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvjlk\" (UniqueName: \"kubernetes.io/projected/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-kube-api-access-nvjlk\") pod \"octavia-persistence-db-create-dc4b9\" (UID: \"e9ef38b4-5852-4537-94c3-f2cc93dbb21f\") " pod="openstack/octavia-persistence-db-create-dc4b9" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.531898 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-operator-scripts\") pod \"octavia-persistence-db-create-dc4b9\" (UID: \"e9ef38b4-5852-4537-94c3-f2cc93dbb21f\") " pod="openstack/octavia-persistence-db-create-dc4b9" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.633758 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvjlk\" (UniqueName: \"kubernetes.io/projected/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-kube-api-access-nvjlk\") pod \"octavia-persistence-db-create-dc4b9\" (UID: \"e9ef38b4-5852-4537-94c3-f2cc93dbb21f\") " pod="openstack/octavia-persistence-db-create-dc4b9" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.633885 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-operator-scripts\") pod \"octavia-persistence-db-create-dc4b9\" (UID: \"e9ef38b4-5852-4537-94c3-f2cc93dbb21f\") " pod="openstack/octavia-persistence-db-create-dc4b9" Nov 22 09:59:29 
crc kubenswrapper[4743]: I1122 09:59:29.636767 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-operator-scripts\") pod \"octavia-persistence-db-create-dc4b9\" (UID: \"e9ef38b4-5852-4537-94c3-f2cc93dbb21f\") " pod="openstack/octavia-persistence-db-create-dc4b9" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.660425 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvjlk\" (UniqueName: \"kubernetes.io/projected/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-kube-api-access-nvjlk\") pod \"octavia-persistence-db-create-dc4b9\" (UID: \"e9ef38b4-5852-4537-94c3-f2cc93dbb21f\") " pod="openstack/octavia-persistence-db-create-dc4b9" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.741386 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-dc4b9" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.873884 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-70ee-account-create-g8k5f"] Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.875296 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70ee-account-create-g8k5f" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.888044 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Nov 22 09:59:29 crc kubenswrapper[4743]: I1122 09:59:29.913201 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-70ee-account-create-g8k5f"] Nov 22 09:59:30 crc kubenswrapper[4743]: I1122 09:59:30.041719 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwmwg\" (UniqueName: \"kubernetes.io/projected/77960df9-960d-47e5-851f-4f6a5df2384c-kube-api-access-dwmwg\") pod \"octavia-70ee-account-create-g8k5f\" (UID: \"77960df9-960d-47e5-851f-4f6a5df2384c\") " pod="openstack/octavia-70ee-account-create-g8k5f" Nov 22 09:59:30 crc kubenswrapper[4743]: I1122 09:59:30.042205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77960df9-960d-47e5-851f-4f6a5df2384c-operator-scripts\") pod \"octavia-70ee-account-create-g8k5f\" (UID: \"77960df9-960d-47e5-851f-4f6a5df2384c\") " pod="openstack/octavia-70ee-account-create-g8k5f" Nov 22 09:59:30 crc kubenswrapper[4743]: I1122 09:59:30.144358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwmwg\" (UniqueName: \"kubernetes.io/projected/77960df9-960d-47e5-851f-4f6a5df2384c-kube-api-access-dwmwg\") pod \"octavia-70ee-account-create-g8k5f\" (UID: \"77960df9-960d-47e5-851f-4f6a5df2384c\") " pod="openstack/octavia-70ee-account-create-g8k5f" Nov 22 09:59:30 crc kubenswrapper[4743]: I1122 09:59:30.144453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77960df9-960d-47e5-851f-4f6a5df2384c-operator-scripts\") pod \"octavia-70ee-account-create-g8k5f\" (UID: \"77960df9-960d-47e5-851f-4f6a5df2384c\") " pod="openstack/octavia-70ee-account-create-g8k5f" Nov 22 09:59:30 crc kubenswrapper[4743]: I1122 09:59:30.145365 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/77960df9-960d-47e5-851f-4f6a5df2384c-operator-scripts\") pod \"octavia-70ee-account-create-g8k5f\" (UID: \"77960df9-960d-47e5-851f-4f6a5df2384c\") " pod="openstack/octavia-70ee-account-create-g8k5f" Nov 22 09:59:30 crc kubenswrapper[4743]: I1122 09:59:30.177966 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwmwg\" (UniqueName: \"kubernetes.io/projected/77960df9-960d-47e5-851f-4f6a5df2384c-kube-api-access-dwmwg\") pod \"octavia-70ee-account-create-g8k5f\" (UID: \"77960df9-960d-47e5-851f-4f6a5df2384c\") " pod="openstack/octavia-70ee-account-create-g8k5f" Nov 22 09:59:30 crc kubenswrapper[4743]: I1122 09:59:30.217702 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70ee-account-create-g8k5f" Nov 22 09:59:30 crc kubenswrapper[4743]: I1122 09:59:30.260984 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-dc4b9"] Nov 22 09:59:30 crc kubenswrapper[4743]: I1122 09:59:30.461492 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-dc4b9" event={"ID":"e9ef38b4-5852-4537-94c3-f2cc93dbb21f","Type":"ContainerStarted","Data":"785c5cf44c3bc862c7b8f1737f77f3df464a99168d992f6f8e647218876e2f51"} Nov 22 09:59:30 crc kubenswrapper[4743]: I1122 09:59:30.684401 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-70ee-account-create-g8k5f"] Nov 22 09:59:30 crc kubenswrapper[4743]: W1122 09:59:30.690530 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77960df9_960d_47e5_851f_4f6a5df2384c.slice/crio-c4c9e594e46c900ce6d89089bae66bb91598bcf933da4db287cd4382d3aece84 WatchSource:0}: Error finding container c4c9e594e46c900ce6d89089bae66bb91598bcf933da4db287cd4382d3aece84: Status 404 returned error can't find the container with id c4c9e594e46c900ce6d89089bae66bb91598bcf933da4db287cd4382d3aece84 Nov 22 09:59:31 crc kubenswrapper[4743]: I1122 09:59:31.474008 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-dc4b9" event={"ID":"e9ef38b4-5852-4537-94c3-f2cc93dbb21f","Type":"ContainerStarted","Data":"631616f9386105f81aba653a922adaef78b96dfb7ce2edbda255565489ba1c32"} Nov 22 09:59:31 crc kubenswrapper[4743]: I1122 09:59:31.478012 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-70ee-account-create-g8k5f" event={"ID":"77960df9-960d-47e5-851f-4f6a5df2384c","Type":"ContainerStarted","Data":"d8d0062a96c27d5409fcbf7ad8b8abe495b2c91bc13e7275d05ec09424884a52"} Nov 22 09:59:31 crc kubenswrapper[4743]: I1122 09:59:31.478042 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-70ee-account-create-g8k5f" event={"ID":"77960df9-960d-47e5-851f-4f6a5df2384c","Type":"ContainerStarted","Data":"c4c9e594e46c900ce6d89089bae66bb91598bcf933da4db287cd4382d3aece84"} Nov 22 09:59:31 crc kubenswrapper[4743]: I1122 09:59:31.526959 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-70ee-account-create-g8k5f" podStartSLOduration=2.526931453 podStartE2EDuration="2.526931453s" podCreationTimestamp="2025-11-22 09:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:59:31.514065404 +0000 UTC m=+5845.220426466" watchObservedRunningTime="2025-11-22 09:59:31.526931453 +0000 UTC 
m=+5845.233292515" Nov 22 09:59:32 crc kubenswrapper[4743]: I1122 09:59:32.486963 4743 generic.go:334] "Generic (PLEG): container finished" podID="e9ef38b4-5852-4537-94c3-f2cc93dbb21f" containerID="631616f9386105f81aba653a922adaef78b96dfb7ce2edbda255565489ba1c32" exitCode=0 Nov 22 09:59:32 crc kubenswrapper[4743]: I1122 09:59:32.487069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-dc4b9" event={"ID":"e9ef38b4-5852-4537-94c3-f2cc93dbb21f","Type":"ContainerDied","Data":"631616f9386105f81aba653a922adaef78b96dfb7ce2edbda255565489ba1c32"} Nov 22 09:59:32 crc kubenswrapper[4743]: I1122 09:59:32.488734 4743 generic.go:334] "Generic (PLEG): container finished" podID="77960df9-960d-47e5-851f-4f6a5df2384c" containerID="d8d0062a96c27d5409fcbf7ad8b8abe495b2c91bc13e7275d05ec09424884a52" exitCode=0 Nov 22 09:59:32 crc kubenswrapper[4743]: I1122 09:59:32.488770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-70ee-account-create-g8k5f" event={"ID":"77960df9-960d-47e5-851f-4f6a5df2384c","Type":"ContainerDied","Data":"d8d0062a96c27d5409fcbf7ad8b8abe495b2c91bc13e7275d05ec09424884a52"} Nov 22 09:59:32 crc kubenswrapper[4743]: I1122 09:59:32.856706 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-dc4b9" Nov 22 09:59:32 crc kubenswrapper[4743]: I1122 09:59:32.998168 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-operator-scripts\") pod \"e9ef38b4-5852-4537-94c3-f2cc93dbb21f\" (UID: \"e9ef38b4-5852-4537-94c3-f2cc93dbb21f\") " Nov 22 09:59:32 crc kubenswrapper[4743]: I1122 09:59:32.998348 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvjlk\" (UniqueName: \"kubernetes.io/projected/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-kube-api-access-nvjlk\") pod \"e9ef38b4-5852-4537-94c3-f2cc93dbb21f\" (UID: \"e9ef38b4-5852-4537-94c3-f2cc93dbb21f\") " Nov 22 09:59:32 crc kubenswrapper[4743]: I1122 09:59:32.998752 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9ef38b4-5852-4537-94c3-f2cc93dbb21f" (UID: "e9ef38b4-5852-4537-94c3-f2cc93dbb21f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.005699 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-kube-api-access-nvjlk" (OuterVolumeSpecName: "kube-api-access-nvjlk") pod "e9ef38b4-5852-4537-94c3-f2cc93dbb21f" (UID: "e9ef38b4-5852-4537-94c3-f2cc93dbb21f"). InnerVolumeSpecName "kube-api-access-nvjlk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.101105 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvjlk\" (UniqueName: \"kubernetes.io/projected/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-kube-api-access-nvjlk\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.101143 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9ef38b4-5852-4537-94c3-f2cc93dbb21f-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.273118 4743 scope.go:117] "RemoveContainer" containerID="e1d1c9ad481460496e72f548e011da607d3c4ba5806b5842a2e2f1cb2e868cad" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.303451 4743 scope.go:117] "RemoveContainer" containerID="9840829eda763bffcc00181d2397c321389a4a4e85764eb6bdd4164c5141ef04" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.352070 4743 scope.go:117] "RemoveContainer" containerID="139fe17e6226ce28ea3b7c40c59f999dd29e99fb60f95ed5dfbd1359b0fd74b5" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.393448 4743 scope.go:117] "RemoveContainer" containerID="700120de0c73ba804db0c2824490ffe15dae21c5db97102a4323d0d0c4eadc59" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.512563 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-dc4b9" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.512573 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-dc4b9" event={"ID":"e9ef38b4-5852-4537-94c3-f2cc93dbb21f","Type":"ContainerDied","Data":"785c5cf44c3bc862c7b8f1737f77f3df464a99168d992f6f8e647218876e2f51"} Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.512642 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="785c5cf44c3bc862c7b8f1737f77f3df464a99168d992f6f8e647218876e2f51" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.757461 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70ee-account-create-g8k5f" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.916616 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77960df9-960d-47e5-851f-4f6a5df2384c-operator-scripts\") pod \"77960df9-960d-47e5-851f-4f6a5df2384c\" (UID: \"77960df9-960d-47e5-851f-4f6a5df2384c\") " Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.917186 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwmwg\" (UniqueName: \"kubernetes.io/projected/77960df9-960d-47e5-851f-4f6a5df2384c-kube-api-access-dwmwg\") pod \"77960df9-960d-47e5-851f-4f6a5df2384c\" (UID: \"77960df9-960d-47e5-851f-4f6a5df2384c\") " Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.917621 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77960df9-960d-47e5-851f-4f6a5df2384c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77960df9-960d-47e5-851f-4f6a5df2384c" (UID: "77960df9-960d-47e5-851f-4f6a5df2384c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 09:59:33 crc kubenswrapper[4743]: I1122 09:59:33.927805 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77960df9-960d-47e5-851f-4f6a5df2384c-kube-api-access-dwmwg" (OuterVolumeSpecName: "kube-api-access-dwmwg") pod "77960df9-960d-47e5-851f-4f6a5df2384c" (UID: "77960df9-960d-47e5-851f-4f6a5df2384c"). InnerVolumeSpecName "kube-api-access-dwmwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 09:59:34 crc kubenswrapper[4743]: I1122 09:59:34.019418 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77960df9-960d-47e5-851f-4f6a5df2384c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:34 crc kubenswrapper[4743]: I1122 09:59:34.019452 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwmwg\" (UniqueName: \"kubernetes.io/projected/77960df9-960d-47e5-851f-4f6a5df2384c-kube-api-access-dwmwg\") on node \"crc\" DevicePath \"\"" Nov 22 09:59:34 crc kubenswrapper[4743]: I1122 09:59:34.522254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-70ee-account-create-g8k5f" event={"ID":"77960df9-960d-47e5-851f-4f6a5df2384c","Type":"ContainerDied","Data":"c4c9e594e46c900ce6d89089bae66bb91598bcf933da4db287cd4382d3aece84"} Nov 22 09:59:34 crc kubenswrapper[4743]: I1122 09:59:34.522313 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70ee-account-create-g8k5f" Nov 22 09:59:34 crc kubenswrapper[4743]: I1122 09:59:34.522336 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4c9e594e46c900ce6d89089bae66bb91598bcf933da4db287cd4382d3aece84" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.479206 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-69f8b8c646-hc89r"] Nov 22 09:59:35 crc kubenswrapper[4743]: E1122 09:59:35.479980 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77960df9-960d-47e5-851f-4f6a5df2384c" containerName="mariadb-account-create" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.479996 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="77960df9-960d-47e5-851f-4f6a5df2384c" containerName="mariadb-account-create" Nov 22 09:59:35 crc kubenswrapper[4743]: E1122 09:59:35.480021 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ef38b4-5852-4537-94c3-f2cc93dbb21f" containerName="mariadb-database-create" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.480027 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ef38b4-5852-4537-94c3-f2cc93dbb21f" containerName="mariadb-database-create" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.480196 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="77960df9-960d-47e5-851f-4f6a5df2384c" containerName="mariadb-account-create" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.480220 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ef38b4-5852-4537-94c3-f2cc93dbb21f" containerName="mariadb-database-create" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.481565 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.485352 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.485357 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-b75d5" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.487874 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-69f8b8c646-hc89r"] Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.501252 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.648480 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-config-data\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.648556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-combined-ca-bundle\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.648687 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-scripts\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.648705 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-config-data-merged\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.648723 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-octavia-run\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.750158 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-scripts\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.750211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-config-data-merged\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc 
kubenswrapper[4743]: I1122 09:59:35.750237 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-octavia-run\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.750287 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-config-data\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.750324 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-combined-ca-bundle\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.750823 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-config-data-merged\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.750886 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-octavia-run\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.755027 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-scripts\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.757057 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-config-data\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.761614 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3ee158-5b93-4fa3-b8ef-13f9e0f19747-combined-ca-bundle\") pod \"octavia-api-69f8b8c646-hc89r\" (UID: \"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747\") " pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:35 crc kubenswrapper[4743]: I1122 09:59:35.806457 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:36 crc kubenswrapper[4743]: I1122 09:59:36.330894 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-69f8b8c646-hc89r"] Nov 22 09:59:36 crc kubenswrapper[4743]: I1122 09:59:36.538808 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-69f8b8c646-hc89r" event={"ID":"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747","Type":"ContainerStarted","Data":"056378e08fc35093291c7bc762c22c032e355bd11474a97756bfdb9ec8a73d25"} Nov 22 09:59:38 crc kubenswrapper[4743]: I1122 09:59:38.153413 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 09:59:38 crc kubenswrapper[4743]: E1122 09:59:38.154493 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:59:45 crc kubenswrapper[4743]: I1122 09:59:45.621448 4743 generic.go:334] "Generic (PLEG): container finished" podID="ed3ee158-5b93-4fa3-b8ef-13f9e0f19747" containerID="c649dc1c06b08e66bd112aca37e7c69932e0d0fd929f1f4cf091f34fa6c8659b" exitCode=0 Nov 22 09:59:45 crc kubenswrapper[4743]: I1122 09:59:45.621521 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-69f8b8c646-hc89r" event={"ID":"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747","Type":"ContainerDied","Data":"c649dc1c06b08e66bd112aca37e7c69932e0d0fd929f1f4cf091f34fa6c8659b"} Nov 22 09:59:46 crc kubenswrapper[4743]: I1122 09:59:46.631789 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-69f8b8c646-hc89r" event={"ID":"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747","Type":"ContainerStarted","Data":"e080d811e7ed604a923f3478907e4c2351023f330c633b6185c9fa381eb4b3af"} Nov 22 09:59:46 crc kubenswrapper[4743]: I1122 09:59:46.632964 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-69f8b8c646-hc89r" event={"ID":"ed3ee158-5b93-4fa3-b8ef-13f9e0f19747","Type":"ContainerStarted","Data":"574f29b0bca1db5abeada4cbbef9d131d49a2bf906ec2cdb32c1dd779855d373"} Nov 22 09:59:46 crc kubenswrapper[4743]: I1122 09:59:46.633095 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:46 crc kubenswrapper[4743]: I1122 09:59:46.633227 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:46 crc kubenswrapper[4743]: I1122 09:59:46.657057 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-69f8b8c646-hc89r" podStartSLOduration=3.057184884 podStartE2EDuration="11.657037889s" podCreationTimestamp="2025-11-22 09:59:35 +0000 UTC" firstStartedPulling="2025-11-22 09:59:36.34529849 +0000 UTC m=+5850.051659542" lastFinishedPulling="2025-11-22 09:59:44.945151505 +0000 UTC m=+5858.651512547" observedRunningTime="2025-11-22 09:59:46.653159558 +0000 UTC m=+5860.359520610" watchObservedRunningTime="2025-11-22 09:59:46.657037889 +0000 UTC m=+5860.363398961" Nov 22 09:59:50 crc kubenswrapper[4743]: I1122 09:59:50.152796 4743 scope.go:117] "RemoveContainer" 
containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 09:59:50 crc kubenswrapper[4743]: E1122 09:59:50.155216 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.064596 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-446fq"] Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.066527 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.069506 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.069781 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.069936 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.104676 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-446fq"] Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.171230 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-config-data-merged\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.171289 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-hm-ports\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.171328 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-scripts\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.171440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-config-data\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.273507 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-config-data-merged\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 
09:59:52.273605 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-hm-ports\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.273647 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-scripts\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.273663 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-config-data\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.275316 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-config-data-merged\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.275921 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-hm-ports\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.281776 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-config-data\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.282866 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748-scripts\") pod \"octavia-rsyslog-446fq\" (UID: \"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748\") " pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.391108 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-446fq" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.813502 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-686fm"] Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.815329 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-686fm" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.818613 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.830943 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-686fm"] Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.883109 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/634fdefe-cd9c-4a8a-8605-568f6a38be5e-amphora-image\") pod \"octavia-image-upload-59f8cff499-686fm\" (UID: \"634fdefe-cd9c-4a8a-8605-568f6a38be5e\") " pod="openstack/octavia-image-upload-59f8cff499-686fm" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.883210 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/634fdefe-cd9c-4a8a-8605-568f6a38be5e-httpd-config\") pod \"octavia-image-upload-59f8cff499-686fm\" (UID: \"634fdefe-cd9c-4a8a-8605-568f6a38be5e\") " pod="openstack/octavia-image-upload-59f8cff499-686fm" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.971074 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-446fq"] Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.985730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/634fdefe-cd9c-4a8a-8605-568f6a38be5e-amphora-image\") pod \"octavia-image-upload-59f8cff499-686fm\" (UID: \"634fdefe-cd9c-4a8a-8605-568f6a38be5e\") " pod="openstack/octavia-image-upload-59f8cff499-686fm" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.985836 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/634fdefe-cd9c-4a8a-8605-568f6a38be5e-httpd-config\") pod \"octavia-image-upload-59f8cff499-686fm\" (UID: \"634fdefe-cd9c-4a8a-8605-568f6a38be5e\") " pod="openstack/octavia-image-upload-59f8cff499-686fm" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.988791 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/634fdefe-cd9c-4a8a-8605-568f6a38be5e-amphora-image\") pod \"octavia-image-upload-59f8cff499-686fm\" (UID: \"634fdefe-cd9c-4a8a-8605-568f6a38be5e\") " pod="openstack/octavia-image-upload-59f8cff499-686fm" Nov 22 09:59:52 crc kubenswrapper[4743]: I1122 09:59:52.997477 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/634fdefe-cd9c-4a8a-8605-568f6a38be5e-httpd-config\") pod \"octavia-image-upload-59f8cff499-686fm\" (UID: \"634fdefe-cd9c-4a8a-8605-568f6a38be5e\") " pod="openstack/octavia-image-upload-59f8cff499-686fm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.150534 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-686fm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.415965 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-chxkm"] Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.441368 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.444600 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.460166 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-chxkm"] Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.496379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-combined-ca-bundle\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.496462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data-merged\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.496519 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-scripts\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.496542 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.599528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-combined-ca-bundle\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.599947 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data-merged\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.599991 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-scripts\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.600019 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.602606 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data-merged\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.605817 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-combined-ca-bundle\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.606054 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-scripts\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.608867 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data\") pod \"octavia-db-sync-chxkm\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:53 crc kubenswrapper[4743]: W1122 09:59:53.714385 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod634fdefe_cd9c_4a8a_8605_568f6a38be5e.slice/crio-e03817eb96e170dba744efdd1bc5d90c98544e01307edab5fbc243c253006963 WatchSource:0}: Error finding container e03817eb96e170dba744efdd1bc5d90c98544e01307edab5fbc243c253006963: Status 404 returned error can't find the container with id e03817eb96e170dba744efdd1bc5d90c98544e01307edab5fbc243c253006963 Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.717001 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-686fm"] Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.724254 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-446fq" event={"ID":"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748","Type":"ContainerStarted","Data":"4de5d6c7af8e70f4e7f5f0667bf5e747eddf2395b1f60c930e8a355087092f99"} Nov 22 09:59:53 crc kubenswrapper[4743]: I1122 09:59:53.768250 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-chxkm" Nov 22 09:59:54 crc kubenswrapper[4743]: I1122 09:59:54.216528 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-chxkm"] Nov 22 09:59:54 crc kubenswrapper[4743]: W1122 09:59:54.358989 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8a318d4_4c4c_4125_81e5_346f22bf3073.slice/crio-70ed5591f7189688acff4dffa86c9f207cb0d5331c7390b1f4bd812692915c29 WatchSource:0}: Error finding container 70ed5591f7189688acff4dffa86c9f207cb0d5331c7390b1f4bd812692915c29: Status 404 returned error can't find the container with id 70ed5591f7189688acff4dffa86c9f207cb0d5331c7390b1f4bd812692915c29 Nov 22 09:59:54 crc kubenswrapper[4743]: I1122 09:59:54.737745 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-686fm" event={"ID":"634fdefe-cd9c-4a8a-8605-568f6a38be5e","Type":"ContainerStarted","Data":"e03817eb96e170dba744efdd1bc5d90c98544e01307edab5fbc243c253006963"} Nov 22 09:59:54 crc kubenswrapper[4743]: I1122 09:59:54.742216 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-chxkm" event={"ID":"b8a318d4-4c4c-4125-81e5-346f22bf3073","Type":"ContainerStarted","Data":"da6e43ebf416def8a950513c7fe5d6fdb93879cdda990d1e87185bab101f98bf"} Nov 22 09:59:54 crc kubenswrapper[4743]: I1122 09:59:54.742271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-chxkm" event={"ID":"b8a318d4-4c4c-4125-81e5-346f22bf3073","Type":"ContainerStarted","Data":"70ed5591f7189688acff4dffa86c9f207cb0d5331c7390b1f4bd812692915c29"} Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.261000 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.307556 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-69f8b8c646-hc89r" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.676522 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lzbkj" podUID="5bb3a009-b9ed-4054-ac5f-c7bd866f9634" containerName="ovn-controller" probeResult="failure" output=< Nov 22 09:59:55 crc kubenswrapper[4743]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Nov 22 09:59:55 crc kubenswrapper[4743]: > Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.682672 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.682725 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5jq6t" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.754381 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-446fq" event={"ID":"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748","Type":"ContainerStarted","Data":"d94e8490f95433fdce0f1bc74e80168b74a9439244285e48e427b4e11ee8fb3b"} Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.761228 4743 generic.go:334] "Generic (PLEG): container finished" podID="b8a318d4-4c4c-4125-81e5-346f22bf3073" containerID="da6e43ebf416def8a950513c7fe5d6fdb93879cdda990d1e87185bab101f98bf" exitCode=0 Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.761341 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-db-sync-chxkm" event={"ID":"b8a318d4-4c4c-4125-81e5-346f22bf3073","Type":"ContainerDied","Data":"da6e43ebf416def8a950513c7fe5d6fdb93879cdda990d1e87185bab101f98bf"} Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.781499 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lzbkj-config-j87p4"] Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.782750 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.796841 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.829626 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lzbkj-config-j87p4"] Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.845995 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gb5h\" (UniqueName: \"kubernetes.io/projected/90cceaba-39ed-4439-a80e-8938c227a8b0-kube-api-access-7gb5h\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.846400 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-scripts\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.846433 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.846599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-log-ovn\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.846637 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run-ovn\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.846677 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-additional-scripts\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.948287 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-log-ovn\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.948362 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run-ovn\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.948402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-additional-scripts\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.948446 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gb5h\" (UniqueName: \"kubernetes.io/projected/90cceaba-39ed-4439-a80e-8938c227a8b0-kube-api-access-7gb5h\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.948474 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-scripts\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.948492 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.948896 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-log-ovn\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.948963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run-ovn\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.949147 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.949995 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-additional-scripts\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.953230 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-scripts\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:55 crc kubenswrapper[4743]: I1122 09:59:55.967445 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gb5h\" (UniqueName: \"kubernetes.io/projected/90cceaba-39ed-4439-a80e-8938c227a8b0-kube-api-access-7gb5h\") pod \"ovn-controller-lzbkj-config-j87p4\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:56 crc kubenswrapper[4743]: I1122 09:59:56.223225 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 09:59:56 crc kubenswrapper[4743]: I1122 09:59:56.689880 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lzbkj-config-j87p4"] Nov 22 09:59:56 crc kubenswrapper[4743]: W1122 09:59:56.703167 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90cceaba_39ed_4439_a80e_8938c227a8b0.slice/crio-8a4a353756d52b379e563d8ca15bd13b0b23905bf49c048a49e516dd8c8b3f3e WatchSource:0}: Error finding container 8a4a353756d52b379e563d8ca15bd13b0b23905bf49c048a49e516dd8c8b3f3e: Status 404 returned error can't find the container with id 8a4a353756d52b379e563d8ca15bd13b0b23905bf49c048a49e516dd8c8b3f3e Nov 22 09:59:56 crc kubenswrapper[4743]: I1122 09:59:56.780862 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-chxkm" event={"ID":"b8a318d4-4c4c-4125-81e5-346f22bf3073","Type":"ContainerStarted","Data":"59e93eea519761ca46145223b879769b5b62892442c960ff2624fb0357997e6e"} Nov 22 09:59:56 crc kubenswrapper[4743]: I1122 09:59:56.784704 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzbkj-config-j87p4" event={"ID":"90cceaba-39ed-4439-a80e-8938c227a8b0","Type":"ContainerStarted","Data":"8a4a353756d52b379e563d8ca15bd13b0b23905bf49c048a49e516dd8c8b3f3e"} Nov 22 09:59:56 crc kubenswrapper[4743]: I1122 09:59:56.804892 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-chxkm" podStartSLOduration=3.80486106 podStartE2EDuration="3.80486106s" podCreationTimestamp="2025-11-22 09:59:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 09:59:56.798211169 +0000 UTC m=+5870.504572221" watchObservedRunningTime="2025-11-22 09:59:56.80486106 +0000 UTC m=+5870.511222112" Nov 22 09:59:58 crc kubenswrapper[4743]: I1122 09:59:58.804169 4743 generic.go:334] "Generic (PLEG): container finished" podID="a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748" containerID="d94e8490f95433fdce0f1bc74e80168b74a9439244285e48e427b4e11ee8fb3b" exitCode=0 Nov 22 09:59:58 crc kubenswrapper[4743]: I1122 09:59:58.804271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-446fq" 
event={"ID":"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748","Type":"ContainerDied","Data":"d94e8490f95433fdce0f1bc74e80168b74a9439244285e48e427b4e11ee8fb3b"} Nov 22 09:59:58 crc kubenswrapper[4743]: I1122 09:59:58.807313 4743 generic.go:334] "Generic (PLEG): container finished" podID="90cceaba-39ed-4439-a80e-8938c227a8b0" containerID="41d0b842029ab69023af12098c71b7d6acb9d47dab746e781b16764d3f1d21c1" exitCode=0 Nov 22 09:59:58 crc kubenswrapper[4743]: I1122 09:59:58.807355 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzbkj-config-j87p4" event={"ID":"90cceaba-39ed-4439-a80e-8938c227a8b0","Type":"ContainerDied","Data":"41d0b842029ab69023af12098c71b7d6acb9d47dab746e781b16764d3f1d21c1"} Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.145781 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t"] Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.160152 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.163019 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.163535 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.171567 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t"] Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.242930 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f767578d-897b-492f-afa7-e61b6134690d-config-volume\") pod \"collect-profiles-29396760-j8p6t\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.243074 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rt9r\" (UniqueName: \"kubernetes.io/projected/f767578d-897b-492f-afa7-e61b6134690d-kube-api-access-8rt9r\") pod \"collect-profiles-29396760-j8p6t\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.243133 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f767578d-897b-492f-afa7-e61b6134690d-secret-volume\") pod \"collect-profiles-29396760-j8p6t\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.344786 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rt9r\" (UniqueName: \"kubernetes.io/projected/f767578d-897b-492f-afa7-e61b6134690d-kube-api-access-8rt9r\") pod \"collect-profiles-29396760-j8p6t\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.344862 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f767578d-897b-492f-afa7-e61b6134690d-secret-volume\") pod \"collect-profiles-29396760-j8p6t\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.344940 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f767578d-897b-492f-afa7-e61b6134690d-config-volume\") pod \"collect-profiles-29396760-j8p6t\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.345937 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f767578d-897b-492f-afa7-e61b6134690d-config-volume\") pod \"collect-profiles-29396760-j8p6t\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.353480 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f767578d-897b-492f-afa7-e61b6134690d-secret-volume\") pod \"collect-profiles-29396760-j8p6t\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.361458 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rt9r\" (UniqueName: \"kubernetes.io/projected/f767578d-897b-492f-afa7-e61b6134690d-kube-api-access-8rt9r\") pod \"collect-profiles-29396760-j8p6t\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.479251 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:00 crc kubenswrapper[4743]: I1122 10:00:00.623792 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lzbkj" Nov 22 10:00:01 crc kubenswrapper[4743]: I1122 10:00:01.840684 4743 generic.go:334] "Generic (PLEG): container finished" podID="b8a318d4-4c4c-4125-81e5-346f22bf3073" containerID="59e93eea519761ca46145223b879769b5b62892442c960ff2624fb0357997e6e" exitCode=0 Nov 22 10:00:01 crc kubenswrapper[4743]: I1122 10:00:01.840733 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-chxkm" event={"ID":"b8a318d4-4c4c-4125-81e5-346f22bf3073","Type":"ContainerDied","Data":"59e93eea519761ca46145223b879769b5b62892442c960ff2624fb0357997e6e"} Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.334272 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-chxkm" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.403607 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-scripts\") pod \"b8a318d4-4c4c-4125-81e5-346f22bf3073\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.403725 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-combined-ca-bundle\") pod \"b8a318d4-4c4c-4125-81e5-346f22bf3073\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.404150 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data-merged\") pod \"b8a318d4-4c4c-4125-81e5-346f22bf3073\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.404223 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data\") pod \"b8a318d4-4c4c-4125-81e5-346f22bf3073\" (UID: \"b8a318d4-4c4c-4125-81e5-346f22bf3073\") " Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.413245 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data" (OuterVolumeSpecName: "config-data") pod "b8a318d4-4c4c-4125-81e5-346f22bf3073" (UID: "b8a318d4-4c4c-4125-81e5-346f22bf3073"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.413549 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-scripts" (OuterVolumeSpecName: "scripts") pod "b8a318d4-4c4c-4125-81e5-346f22bf3073" (UID: "b8a318d4-4c4c-4125-81e5-346f22bf3073"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.432658 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "b8a318d4-4c4c-4125-81e5-346f22bf3073" (UID: "b8a318d4-4c4c-4125-81e5-346f22bf3073"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.452429 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8a318d4-4c4c-4125-81e5-346f22bf3073" (UID: "b8a318d4-4c4c-4125-81e5-346f22bf3073"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.506977 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.507014 4743 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data-merged\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.507023 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.507033 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b8a318d4-4c4c-4125-81e5-346f22bf3073-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.543542 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.608312 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-log-ovn\") pod \"90cceaba-39ed-4439-a80e-8938c227a8b0\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.608449 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "90cceaba-39ed-4439-a80e-8938c227a8b0" (UID: "90cceaba-39ed-4439-a80e-8938c227a8b0"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.608550 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-scripts\") pod \"90cceaba-39ed-4439-a80e-8938c227a8b0\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.608612 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run\") pod \"90cceaba-39ed-4439-a80e-8938c227a8b0\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.608682 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-additional-scripts\") pod \"90cceaba-39ed-4439-a80e-8938c227a8b0\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.608702 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gb5h\" (UniqueName: \"kubernetes.io/projected/90cceaba-39ed-4439-a80e-8938c227a8b0-kube-api-access-7gb5h\") pod \"90cceaba-39ed-4439-a80e-8938c227a8b0\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.608744 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run-ovn\") pod \"90cceaba-39ed-4439-a80e-8938c227a8b0\" (UID: \"90cceaba-39ed-4439-a80e-8938c227a8b0\") " Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.608798 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run" (OuterVolumeSpecName: "var-run") pod "90cceaba-39ed-4439-a80e-8938c227a8b0" (UID: "90cceaba-39ed-4439-a80e-8938c227a8b0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.609112 4743 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.609129 4743 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.609169 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "90cceaba-39ed-4439-a80e-8938c227a8b0" (UID: "90cceaba-39ed-4439-a80e-8938c227a8b0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.609351 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "90cceaba-39ed-4439-a80e-8938c227a8b0" (UID: "90cceaba-39ed-4439-a80e-8938c227a8b0"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.609613 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-scripts" (OuterVolumeSpecName: "scripts") pod "90cceaba-39ed-4439-a80e-8938c227a8b0" (UID: "90cceaba-39ed-4439-a80e-8938c227a8b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.612655 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cceaba-39ed-4439-a80e-8938c227a8b0-kube-api-access-7gb5h" (OuterVolumeSpecName: "kube-api-access-7gb5h") pod "90cceaba-39ed-4439-a80e-8938c227a8b0" (UID: "90cceaba-39ed-4439-a80e-8938c227a8b0"). InnerVolumeSpecName "kube-api-access-7gb5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.711302 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.711340 4743 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/90cceaba-39ed-4439-a80e-8938c227a8b0-additional-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.711355 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gb5h\" (UniqueName: \"kubernetes.io/projected/90cceaba-39ed-4439-a80e-8938c227a8b0-kube-api-access-7gb5h\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.711367 4743 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/90cceaba-39ed-4439-a80e-8938c227a8b0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.874916 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-chxkm" event={"ID":"b8a318d4-4c4c-4125-81e5-346f22bf3073","Type":"ContainerDied","Data":"70ed5591f7189688acff4dffa86c9f207cb0d5331c7390b1f4bd812692915c29"} Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.875851 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70ed5591f7189688acff4dffa86c9f207cb0d5331c7390b1f4bd812692915c29" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.875453 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-chxkm" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.884948 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lzbkj-config-j87p4" event={"ID":"90cceaba-39ed-4439-a80e-8938c227a8b0","Type":"ContainerDied","Data":"8a4a353756d52b379e563d8ca15bd13b0b23905bf49c048a49e516dd8c8b3f3e"} Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.884988 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a4a353756d52b379e563d8ca15bd13b0b23905bf49c048a49e516dd8c8b3f3e" Nov 22 10:00:03 crc kubenswrapper[4743]: I1122 10:00:03.884999 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lzbkj-config-j87p4" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.077903 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t"] Nov 22 10:00:04 crc kubenswrapper[4743]: W1122 10:00:04.091218 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf767578d_897b_492f_afa7_e61b6134690d.slice/crio-dbe23b7422249ca248370edd835c389ce84ff3b8245de1449b9e793080d0a6eb WatchSource:0}: Error finding container dbe23b7422249ca248370edd835c389ce84ff3b8245de1449b9e793080d0a6eb: Status 404 returned error can't find the container with id dbe23b7422249ca248370edd835c389ce84ff3b8245de1449b9e793080d0a6eb Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.151719 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:00:04 crc kubenswrapper[4743]: E1122 10:00:04.152025 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.175711 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zkxbc"] Nov 22 10:00:04 crc kubenswrapper[4743]: E1122 10:00:04.181283 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cceaba-39ed-4439-a80e-8938c227a8b0" containerName="ovn-config" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.181831 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cceaba-39ed-4439-a80e-8938c227a8b0" containerName="ovn-config" Nov 22 10:00:04 crc kubenswrapper[4743]: E1122 10:00:04.181926 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a318d4-4c4c-4125-81e5-346f22bf3073" containerName="init" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.181937 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a318d4-4c4c-4125-81e5-346f22bf3073" containerName="init" Nov 22 10:00:04 crc kubenswrapper[4743]: E1122 10:00:04.182005 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a318d4-4c4c-4125-81e5-346f22bf3073" containerName="octavia-db-sync" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.182016 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a318d4-4c4c-4125-81e5-346f22bf3073" containerName="octavia-db-sync" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.183932 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a318d4-4c4c-4125-81e5-346f22bf3073" containerName="octavia-db-sync" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.184093 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cceaba-39ed-4439-a80e-8938c227a8b0" containerName="ovn-config" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.190762 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.229268 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-utilities\") pod \"community-operators-zkxbc\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") " pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.231193 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8vnf\" (UniqueName: \"kubernetes.io/projected/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-kube-api-access-z8vnf\") pod \"community-operators-zkxbc\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") " pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.231472 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-catalog-content\") pod \"community-operators-zkxbc\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") " pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.235594 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zkxbc"] Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.337132 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-catalog-content\") pod \"community-operators-zkxbc\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") " pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.337208 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-utilities\") pod \"community-operators-zkxbc\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") " pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.337265 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8vnf\" (UniqueName: \"kubernetes.io/projected/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-kube-api-access-z8vnf\") pod \"community-operators-zkxbc\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") " pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.338022 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-catalog-content\") pod \"community-operators-zkxbc\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") " pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.338278 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-utilities\") pod \"community-operators-zkxbc\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") " pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.362001 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z8vnf\" (UniqueName: \"kubernetes.io/projected/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-kube-api-access-z8vnf\") pod \"community-operators-zkxbc\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") " pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.537657 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.638649 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lzbkj-config-j87p4"] Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.653094 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lzbkj-config-j87p4"] Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.893938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-686fm" event={"ID":"634fdefe-cd9c-4a8a-8605-568f6a38be5e","Type":"ContainerStarted","Data":"627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179"} Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.895934 4743 generic.go:334] "Generic (PLEG): container finished" podID="f767578d-897b-492f-afa7-e61b6134690d" containerID="af41ec3c5c6d477bd4ccb2d43edda1d8a6b077978a23a6398f02bb0ebacbfb34" exitCode=0 Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.895979 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" event={"ID":"f767578d-897b-492f-afa7-e61b6134690d","Type":"ContainerDied","Data":"af41ec3c5c6d477bd4ccb2d43edda1d8a6b077978a23a6398f02bb0ebacbfb34"} Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.895995 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" event={"ID":"f767578d-897b-492f-afa7-e61b6134690d","Type":"ContainerStarted","Data":"dbe23b7422249ca248370edd835c389ce84ff3b8245de1449b9e793080d0a6eb"} Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.898351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-446fq" event={"ID":"a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748","Type":"ContainerStarted","Data":"a3801c56e5917cac9efa3d3b9c51661c3c0f02e105f3bdeb28d9be7510434962"} Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.898667 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-446fq" Nov 22 10:00:04 crc kubenswrapper[4743]: I1122 10:00:04.932096 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-446fq" podStartSLOduration=1.904038834 podStartE2EDuration="12.932075804s" podCreationTimestamp="2025-11-22 09:59:52 +0000 UTC" firstStartedPulling="2025-11-22 09:59:52.976531947 +0000 UTC m=+5866.682892999" lastFinishedPulling="2025-11-22 10:00:04.004568917 +0000 UTC m=+5877.710929969" observedRunningTime="2025-11-22 10:00:04.930514089 +0000 UTC m=+5878.636875141" watchObservedRunningTime="2025-11-22 10:00:04.932075804 +0000 UTC m=+5878.638436856" Nov 22 10:00:05 crc kubenswrapper[4743]: I1122 10:00:05.092211 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zkxbc"] Nov 22 10:00:05 crc kubenswrapper[4743]: I1122 10:00:05.168063 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cceaba-39ed-4439-a80e-8938c227a8b0" 
path="/var/lib/kubelet/pods/90cceaba-39ed-4439-a80e-8938c227a8b0/volumes" Nov 22 10:00:05 crc kubenswrapper[4743]: I1122 10:00:05.909836 4743 generic.go:334] "Generic (PLEG): container finished" podID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerID="563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657" exitCode=0 Nov 22 10:00:05 crc kubenswrapper[4743]: I1122 10:00:05.909900 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkxbc" event={"ID":"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf","Type":"ContainerDied","Data":"563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657"} Nov 22 10:00:05 crc kubenswrapper[4743]: I1122 10:00:05.910181 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkxbc" event={"ID":"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf","Type":"ContainerStarted","Data":"299b3bef47fcf0350cb5dc9f6f537a7bd40ea1232bf759f716436befd7f241eb"} Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.274639 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.378286 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f767578d-897b-492f-afa7-e61b6134690d-config-volume\") pod \"f767578d-897b-492f-afa7-e61b6134690d\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.378476 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f767578d-897b-492f-afa7-e61b6134690d-secret-volume\") pod \"f767578d-897b-492f-afa7-e61b6134690d\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.378533 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rt9r\" (UniqueName: \"kubernetes.io/projected/f767578d-897b-492f-afa7-e61b6134690d-kube-api-access-8rt9r\") pod \"f767578d-897b-492f-afa7-e61b6134690d\" (UID: \"f767578d-897b-492f-afa7-e61b6134690d\") " Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.379058 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f767578d-897b-492f-afa7-e61b6134690d-config-volume" (OuterVolumeSpecName: "config-volume") pod "f767578d-897b-492f-afa7-e61b6134690d" (UID: "f767578d-897b-492f-afa7-e61b6134690d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.384630 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f767578d-897b-492f-afa7-e61b6134690d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f767578d-897b-492f-afa7-e61b6134690d" (UID: "f767578d-897b-492f-afa7-e61b6134690d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.384847 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f767578d-897b-492f-afa7-e61b6134690d-kube-api-access-8rt9r" (OuterVolumeSpecName: "kube-api-access-8rt9r") pod "f767578d-897b-492f-afa7-e61b6134690d" (UID: "f767578d-897b-492f-afa7-e61b6134690d"). InnerVolumeSpecName "kube-api-access-8rt9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.481070 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f767578d-897b-492f-afa7-e61b6134690d-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.481413 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rt9r\" (UniqueName: \"kubernetes.io/projected/f767578d-897b-492f-afa7-e61b6134690d-kube-api-access-8rt9r\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.481424 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f767578d-897b-492f-afa7-e61b6134690d-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.919506 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" event={"ID":"f767578d-897b-492f-afa7-e61b6134690d","Type":"ContainerDied","Data":"dbe23b7422249ca248370edd835c389ce84ff3b8245de1449b9e793080d0a6eb"} Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.919548 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbe23b7422249ca248370edd835c389ce84ff3b8245de1449b9e793080d0a6eb" Nov 22 10:00:06 crc kubenswrapper[4743]: I1122 10:00:06.919627 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t" Nov 22 10:00:07 crc kubenswrapper[4743]: I1122 10:00:07.340196 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"] Nov 22 10:00:07 crc kubenswrapper[4743]: I1122 10:00:07.348474 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396715-ntpmc"] Nov 22 10:00:09 crc kubenswrapper[4743]: I1122 10:00:09.165334 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa98797-b1ed-4fbd-9168-3cc290092457" path="/var/lib/kubelet/pods/1aa98797-b1ed-4fbd-9168-3cc290092457/volumes" Nov 22 10:00:09 crc kubenswrapper[4743]: I1122 10:00:09.951387 4743 generic.go:334] "Generic (PLEG): container finished" podID="634fdefe-cd9c-4a8a-8605-568f6a38be5e" containerID="627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179" exitCode=0 Nov 22 10:00:09 crc kubenswrapper[4743]: I1122 10:00:09.951440 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-686fm" event={"ID":"634fdefe-cd9c-4a8a-8605-568f6a38be5e","Type":"ContainerDied","Data":"627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179"} Nov 22 10:00:10 crc kubenswrapper[4743]: I1122 10:00:10.962328 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkxbc" event={"ID":"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf","Type":"ContainerStarted","Data":"1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e"} Nov 22 10:00:11 crc kubenswrapper[4743]: I1122 10:00:11.978096 4743 generic.go:334] "Generic (PLEG): container finished" podID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerID="1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e" exitCode=0 Nov 22 10:00:11 crc kubenswrapper[4743]: I1122 10:00:11.978140 4743 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-zkxbc" event={"ID":"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf","Type":"ContainerDied","Data":"1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e"} Nov 22 10:00:12 crc kubenswrapper[4743]: I1122 10:00:12.988648 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-686fm" event={"ID":"634fdefe-cd9c-4a8a-8605-568f6a38be5e","Type":"ContainerStarted","Data":"075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b"} Nov 22 10:00:12 crc kubenswrapper[4743]: I1122 10:00:12.991447 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkxbc" event={"ID":"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf","Type":"ContainerStarted","Data":"ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025"} Nov 22 10:00:13 crc kubenswrapper[4743]: I1122 10:00:13.032101 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zkxbc" podStartSLOduration=2.5774402480000003 podStartE2EDuration="9.032084588s" podCreationTimestamp="2025-11-22 10:00:04 +0000 UTC" firstStartedPulling="2025-11-22 10:00:05.912463412 +0000 UTC m=+5879.618824464" lastFinishedPulling="2025-11-22 10:00:12.367107752 +0000 UTC m=+5886.073468804" observedRunningTime="2025-11-22 10:00:13.029243566 +0000 UTC m=+5886.735604628" watchObservedRunningTime="2025-11-22 10:00:13.032084588 +0000 UTC m=+5886.738445640" Nov 22 10:00:13 crc kubenswrapper[4743]: I1122 10:00:13.035914 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-686fm" podStartSLOduration=2.851906818 podStartE2EDuration="21.035894417s" podCreationTimestamp="2025-11-22 09:59:52 +0000 UTC" firstStartedPulling="2025-11-22 09:59:53.718288788 +0000 UTC m=+5867.424649840" lastFinishedPulling="2025-11-22 10:00:11.902276387 +0000 UTC m=+5885.608637439" observedRunningTime="2025-11-22 10:00:13.009348005 +0000 UTC m=+5886.715709057" watchObservedRunningTime="2025-11-22 10:00:13.035894417 +0000 UTC m=+5886.742255469" Nov 22 10:00:14 crc kubenswrapper[4743]: I1122 10:00:14.539094 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:14 crc kubenswrapper[4743]: I1122 10:00:14.539225 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:14 crc kubenswrapper[4743]: I1122 10:00:14.591716 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:19 crc kubenswrapper[4743]: I1122 10:00:19.152474 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:00:19 crc kubenswrapper[4743]: E1122 10:00:19.153491 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:00:22 crc kubenswrapper[4743]: I1122 10:00:22.423704 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-446fq" Nov 
Nov 22 10:00:24 crc kubenswrapper[4743]: I1122 10:00:24.586854 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zkxbc"
Nov 22 10:00:24 crc kubenswrapper[4743]: I1122 10:00:24.651373 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zkxbc"]
Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.140711 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zkxbc" podUID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerName="registry-server" containerID="cri-o://ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025" gracePeriod=2
Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.756634 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zkxbc"
Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.846876 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8vnf\" (UniqueName: \"kubernetes.io/projected/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-kube-api-access-z8vnf\") pod \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") "
Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.846992 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-catalog-content\") pod \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") "
Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.847125 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-utilities\") pod \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\" (UID: \"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf\") "
Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.848088 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-utilities" (OuterVolumeSpecName: "utilities") pod "7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" (UID: "7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.856982 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-kube-api-access-z8vnf" (OuterVolumeSpecName: "kube-api-access-z8vnf") pod "7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" (UID: "7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf"). InnerVolumeSpecName "kube-api-access-z8vnf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.897877 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" (UID: "7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.950010 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8vnf\" (UniqueName: \"kubernetes.io/projected/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-kube-api-access-z8vnf\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.950215 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:25 crc kubenswrapper[4743]: I1122 10:00:25.950349 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.152792 4743 generic.go:334] "Generic (PLEG): container finished" podID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerID="ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025" exitCode=0 Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.152828 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkxbc" event={"ID":"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf","Type":"ContainerDied","Data":"ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025"} Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.152870 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zkxbc" event={"ID":"7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf","Type":"ContainerDied","Data":"299b3bef47fcf0350cb5dc9f6f537a7bd40ea1232bf759f716436befd7f241eb"} Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.152889 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zkxbc" Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.152898 4743 scope.go:117] "RemoveContainer" containerID="ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025" Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.174065 4743 scope.go:117] "RemoveContainer" containerID="1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e" Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.195247 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zkxbc"] Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.206746 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zkxbc"] Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.211020 4743 scope.go:117] "RemoveContainer" containerID="563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657" Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.261003 4743 scope.go:117] "RemoveContainer" containerID="ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025" Nov 22 10:00:26 crc kubenswrapper[4743]: E1122 10:00:26.261488 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025\": container with ID starting with ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025 not found: ID does not exist" containerID="ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025" Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.261646 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025"} err="failed to get container status \"ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025\": rpc error: code = NotFound desc = could not find container \"ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025\": container with ID starting with ca14c84eab4ab1a19f57cafe43ad5027854b4271af0fe85a58c559829cb70025 not found: ID does not exist" Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.261748 4743 scope.go:117] "RemoveContainer" containerID="1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e" Nov 22 10:00:26 crc kubenswrapper[4743]: E1122 10:00:26.262276 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e\": container with ID starting with 1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e not found: ID does not exist" containerID="1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e" Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.262327 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e"} err="failed to get container status \"1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e\": rpc error: code = NotFound desc = could not find container \"1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e\": container with ID starting with 1a8572ad9e19583d0eb5908fff0992f099022cbfc46600045fe42744a042415e not found: ID does not exist" Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.262354 4743 scope.go:117] "RemoveContainer" 
containerID="563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657" Nov 22 10:00:26 crc kubenswrapper[4743]: E1122 10:00:26.262734 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657\": container with ID starting with 563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657 not found: ID does not exist" containerID="563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657" Nov 22 10:00:26 crc kubenswrapper[4743]: I1122 10:00:26.262826 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657"} err="failed to get container status \"563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657\": rpc error: code = NotFound desc = could not find container \"563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657\": container with ID starting with 563a511fd3ab8542a795a60bc362e4dc9976497a9c5fc5122ee4d35d2ab79657 not found: ID does not exist" Nov 22 10:00:27 crc kubenswrapper[4743]: I1122 10:00:27.164551 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" path="/var/lib/kubelet/pods/7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf/volumes" Nov 22 10:00:30 crc kubenswrapper[4743]: I1122 10:00:30.151922 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:00:30 crc kubenswrapper[4743]: E1122 10:00:30.152617 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:00:32 crc kubenswrapper[4743]: I1122 10:00:32.522724 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-686fm"] Nov 22 10:00:32 crc kubenswrapper[4743]: I1122 10:00:32.523259 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-686fm" podUID="634fdefe-cd9c-4a8a-8605-568f6a38be5e" containerName="octavia-amphora-httpd" containerID="cri-o://075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b" gracePeriod=30 Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.164871 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-686fm" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.200212 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/634fdefe-cd9c-4a8a-8605-568f6a38be5e-httpd-config\") pod \"634fdefe-cd9c-4a8a-8605-568f6a38be5e\" (UID: \"634fdefe-cd9c-4a8a-8605-568f6a38be5e\") " Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.200280 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/634fdefe-cd9c-4a8a-8605-568f6a38be5e-amphora-image\") pod \"634fdefe-cd9c-4a8a-8605-568f6a38be5e\" (UID: \"634fdefe-cd9c-4a8a-8605-568f6a38be5e\") " Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.223737 4743 generic.go:334] "Generic (PLEG): container finished" podID="634fdefe-cd9c-4a8a-8605-568f6a38be5e" containerID="075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b" exitCode=0 Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.223796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-686fm" event={"ID":"634fdefe-cd9c-4a8a-8605-568f6a38be5e","Type":"ContainerDied","Data":"075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b"} Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.223827 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-686fm" event={"ID":"634fdefe-cd9c-4a8a-8605-568f6a38be5e","Type":"ContainerDied","Data":"e03817eb96e170dba744efdd1bc5d90c98544e01307edab5fbc243c253006963"} Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.223849 4743 scope.go:117] "RemoveContainer" containerID="075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.223998 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-686fm" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.231606 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634fdefe-cd9c-4a8a-8605-568f6a38be5e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "634fdefe-cd9c-4a8a-8605-568f6a38be5e" (UID: "634fdefe-cd9c-4a8a-8605-568f6a38be5e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.279467 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/634fdefe-cd9c-4a8a-8605-568f6a38be5e-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "634fdefe-cd9c-4a8a-8605-568f6a38be5e" (UID: "634fdefe-cd9c-4a8a-8605-568f6a38be5e"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.305564 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/634fdefe-cd9c-4a8a-8605-568f6a38be5e-httpd-config\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.305622 4743 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/634fdefe-cd9c-4a8a-8605-568f6a38be5e-amphora-image\") on node \"crc\" DevicePath \"\"" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.325659 4743 scope.go:117] "RemoveContainer" containerID="627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.344371 4743 scope.go:117] "RemoveContainer" containerID="075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b" Nov 22 10:00:33 crc kubenswrapper[4743]: E1122 10:00:33.344782 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b\": container with ID starting with 075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b not found: ID does not exist" containerID="075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.344841 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b"} err="failed to get container status \"075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b\": rpc error: code = NotFound desc = could not find container \"075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b\": container with ID starting with 075907e46075f013c3939fa5f82fd2fa7435d89807d88036d1e1d61b95f2e94b not found: ID does not exist" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.344874 4743 scope.go:117] "RemoveContainer" containerID="627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179" Nov 22 10:00:33 crc kubenswrapper[4743]: E1122 10:00:33.345256 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179\": container with ID starting with 627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179 not found: ID does not exist" containerID="627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.345288 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179"} err="failed to get container status \"627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179\": rpc error: code = NotFound desc = could not find container \"627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179\": container with ID starting with 627061f3c21a1490004cc1536ff3b1deb1f2be169d1a6298b537a20e01ef3179 not found: ID does not exist" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.523422 4743 scope.go:117] "RemoveContainer" containerID="baa90a54d8dc526a42ac9b3ab529b3f056512058a59e155799a246a1c012e83d" Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.559749 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/octavia-image-upload-59f8cff499-686fm"] Nov 22 10:00:33 crc kubenswrapper[4743]: I1122 10:00:33.568390 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-686fm"] Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.173338 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="634fdefe-cd9c-4a8a-8605-568f6a38be5e" path="/var/lib/kubelet/pods/634fdefe-cd9c-4a8a-8605-568f6a38be5e/volumes" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.737416 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-zwj9g"] Nov 22 10:00:35 crc kubenswrapper[4743]: E1122 10:00:35.738061 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerName="registry-server" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.738082 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerName="registry-server" Nov 22 10:00:35 crc kubenswrapper[4743]: E1122 10:00:35.738107 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634fdefe-cd9c-4a8a-8605-568f6a38be5e" containerName="octavia-amphora-httpd" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.738116 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="634fdefe-cd9c-4a8a-8605-568f6a38be5e" containerName="octavia-amphora-httpd" Nov 22 10:00:35 crc kubenswrapper[4743]: E1122 10:00:35.738139 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerName="extract-content" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.738146 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerName="extract-content" Nov 22 10:00:35 crc kubenswrapper[4743]: E1122 10:00:35.738164 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634fdefe-cd9c-4a8a-8605-568f6a38be5e" containerName="init" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.738170 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="634fdefe-cd9c-4a8a-8605-568f6a38be5e" containerName="init" Nov 22 10:00:35 crc kubenswrapper[4743]: E1122 10:00:35.738178 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f767578d-897b-492f-afa7-e61b6134690d" containerName="collect-profiles" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.738185 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f767578d-897b-492f-afa7-e61b6134690d" containerName="collect-profiles" Nov 22 10:00:35 crc kubenswrapper[4743]: E1122 10:00:35.738213 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerName="extract-utilities" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.738220 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerName="extract-utilities" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.738436 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="634fdefe-cd9c-4a8a-8605-568f6a38be5e" containerName="octavia-amphora-httpd" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.738708 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f767578d-897b-492f-afa7-e61b6134690d" containerName="collect-profiles" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.738719 4743 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="7c0fd44d-3bcf-4f2e-8993-8c74012ae9cf" containerName="registry-server" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.740140 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-zwj9g" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.746159 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.754048 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/52777522-005a-4fa2-97dd-3b3c26efc6f9-httpd-config\") pod \"octavia-image-upload-59f8cff499-zwj9g\" (UID: \"52777522-005a-4fa2-97dd-3b3c26efc6f9\") " pod="openstack/octavia-image-upload-59f8cff499-zwj9g" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.754197 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/52777522-005a-4fa2-97dd-3b3c26efc6f9-amphora-image\") pod \"octavia-image-upload-59f8cff499-zwj9g\" (UID: \"52777522-005a-4fa2-97dd-3b3c26efc6f9\") " pod="openstack/octavia-image-upload-59f8cff499-zwj9g" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.757266 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-zwj9g"] Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.857211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/52777522-005a-4fa2-97dd-3b3c26efc6f9-amphora-image\") pod \"octavia-image-upload-59f8cff499-zwj9g\" (UID: \"52777522-005a-4fa2-97dd-3b3c26efc6f9\") " pod="openstack/octavia-image-upload-59f8cff499-zwj9g" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.857728 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/52777522-005a-4fa2-97dd-3b3c26efc6f9-amphora-image\") pod \"octavia-image-upload-59f8cff499-zwj9g\" (UID: \"52777522-005a-4fa2-97dd-3b3c26efc6f9\") " pod="openstack/octavia-image-upload-59f8cff499-zwj9g" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.857984 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/52777522-005a-4fa2-97dd-3b3c26efc6f9-httpd-config\") pod \"octavia-image-upload-59f8cff499-zwj9g\" (UID: \"52777522-005a-4fa2-97dd-3b3c26efc6f9\") " pod="openstack/octavia-image-upload-59f8cff499-zwj9g" Nov 22 10:00:35 crc kubenswrapper[4743]: I1122 10:00:35.866809 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/52777522-005a-4fa2-97dd-3b3c26efc6f9-httpd-config\") pod \"octavia-image-upload-59f8cff499-zwj9g\" (UID: \"52777522-005a-4fa2-97dd-3b3c26efc6f9\") " pod="openstack/octavia-image-upload-59f8cff499-zwj9g" Nov 22 10:00:36 crc kubenswrapper[4743]: I1122 10:00:36.071600 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-zwj9g" Nov 22 10:00:36 crc kubenswrapper[4743]: I1122 10:00:36.571486 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-zwj9g"] Nov 22 10:00:37 crc kubenswrapper[4743]: I1122 10:00:37.326011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-zwj9g" event={"ID":"52777522-005a-4fa2-97dd-3b3c26efc6f9","Type":"ContainerStarted","Data":"1a4fe1ca4bbdc95ecc268a8e26832e38ff8ddf83524288d2c581ccdaf1248b06"} Nov 22 10:00:38 crc kubenswrapper[4743]: I1122 10:00:38.359556 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-zwj9g" event={"ID":"52777522-005a-4fa2-97dd-3b3c26efc6f9","Type":"ContainerStarted","Data":"3ca28549042f6c69e0e303cc62daaa3d0632e3ea3e1babd432ec6cf1d253dacd"} Nov 22 10:00:39 crc kubenswrapper[4743]: I1122 10:00:39.373907 4743 generic.go:334] "Generic (PLEG): container finished" podID="52777522-005a-4fa2-97dd-3b3c26efc6f9" containerID="3ca28549042f6c69e0e303cc62daaa3d0632e3ea3e1babd432ec6cf1d253dacd" exitCode=0 Nov 22 10:00:39 crc kubenswrapper[4743]: I1122 10:00:39.373953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-zwj9g" event={"ID":"52777522-005a-4fa2-97dd-3b3c26efc6f9","Type":"ContainerDied","Data":"3ca28549042f6c69e0e303cc62daaa3d0632e3ea3e1babd432ec6cf1d253dacd"} Nov 22 10:00:41 crc kubenswrapper[4743]: I1122 10:00:41.393235 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-zwj9g" event={"ID":"52777522-005a-4fa2-97dd-3b3c26efc6f9","Type":"ContainerStarted","Data":"76d9aff741dc9fe1ce0a9665920896957ee9e2f7400dd9e60a4243b898a72bbc"} Nov 22 10:00:41 crc kubenswrapper[4743]: I1122 10:00:41.410036 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-zwj9g" podStartSLOduration=2.263706253 podStartE2EDuration="6.410016721s" podCreationTimestamp="2025-11-22 10:00:35 +0000 UTC" firstStartedPulling="2025-11-22 10:00:36.571999139 +0000 UTC m=+5910.278360191" lastFinishedPulling="2025-11-22 10:00:40.718309607 +0000 UTC m=+5914.424670659" observedRunningTime="2025-11-22 10:00:41.404034829 +0000 UTC m=+5915.110395891" watchObservedRunningTime="2025-11-22 10:00:41.410016721 +0000 UTC m=+5915.116377773" Nov 22 10:00:44 crc kubenswrapper[4743]: I1122 10:00:44.152017 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:00:44 crc kubenswrapper[4743]: E1122 10:00:44.153065 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:00:55 crc kubenswrapper[4743]: I1122 10:00:55.904438 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-hb4xf"] Nov 22 10:00:55 crc kubenswrapper[4743]: I1122 10:00:55.906871 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:55 crc kubenswrapper[4743]: I1122 10:00:55.914144 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Nov 22 10:00:55 crc kubenswrapper[4743]: I1122 10:00:55.915075 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Nov 22 10:00:55 crc kubenswrapper[4743]: I1122 10:00:55.927931 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Nov 22 10:00:55 crc kubenswrapper[4743]: I1122 10:00:55.949408 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-hb4xf"] Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.021599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-scripts\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.021677 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-config-data\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.021829 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/48db469e-30ab-4c16-9720-4c7d33df686f-config-data-merged\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.021930 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-combined-ca-bundle\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.021979 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/48db469e-30ab-4c16-9720-4c7d33df686f-hm-ports\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.022015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-amphora-certs\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.123867 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-amphora-certs\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 
crc kubenswrapper[4743]: I1122 10:00:56.124236 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-scripts\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.124385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-config-data\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.124569 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/48db469e-30ab-4c16-9720-4c7d33df686f-config-data-merged\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.124805 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-combined-ca-bundle\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.124948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/48db469e-30ab-4c16-9720-4c7d33df686f-hm-ports\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.125436 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/48db469e-30ab-4c16-9720-4c7d33df686f-config-data-merged\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.125863 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/48db469e-30ab-4c16-9720-4c7d33df686f-hm-ports\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.130086 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-amphora-certs\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.130824 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-combined-ca-bundle\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.144280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-config-data\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.145197 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48db469e-30ab-4c16-9720-4c7d33df686f-scripts\") pod \"octavia-healthmanager-hb4xf\" (UID: \"48db469e-30ab-4c16-9720-4c7d33df686f\") " pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.241275 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:00:56 crc kubenswrapper[4743]: I1122 10:00:56.734842 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-hb4xf"] Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.563059 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hb4xf" event={"ID":"48db469e-30ab-4c16-9720-4c7d33df686f","Type":"ContainerStarted","Data":"51fbd1762b03462361a5ec9f6b3dbbe5a194e11bea406ceb9f115aa0fc864098"} Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.563423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hb4xf" event={"ID":"48db469e-30ab-4c16-9720-4c7d33df686f","Type":"ContainerStarted","Data":"c9180db205277b7e0a690c126ee4d2c93e95b17996b35c14b6e2884933356a93"} Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.738863 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-r45h9"] Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.740723 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.743398 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.748868 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-r45h9"] Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.756027 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.855774 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-scripts\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.855855 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-amphora-certs\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.856002 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/dea004b0-e9a6-4823-8692-af0a4c143d7d-hm-ports\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.856223 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dea004b0-e9a6-4823-8692-af0a4c143d7d-config-data-merged\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.856391 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-config-data\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.856419 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-combined-ca-bundle\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.958818 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-config-data\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.959131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-combined-ca-bundle\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.959201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-scripts\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.959287 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-amphora-certs\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.959315 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/dea004b0-e9a6-4823-8692-af0a4c143d7d-hm-ports\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.959404 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dea004b0-e9a6-4823-8692-af0a4c143d7d-config-data-merged\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.960164 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/dea004b0-e9a6-4823-8692-af0a4c143d7d-config-data-merged\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.960702 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/dea004b0-e9a6-4823-8692-af0a4c143d7d-hm-ports\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.965421 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-amphora-certs\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.965643 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-combined-ca-bundle\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.965748 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-config-data\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " 
pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:57 crc kubenswrapper[4743]: I1122 10:00:57.969443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dea004b0-e9a6-4823-8692-af0a4c143d7d-scripts\") pod \"octavia-housekeeping-r45h9\" (UID: \"dea004b0-e9a6-4823-8692-af0a4c143d7d\") " pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:58 crc kubenswrapper[4743]: I1122 10:00:58.057871 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-r45h9" Nov 22 10:00:58 crc kubenswrapper[4743]: I1122 10:00:58.662216 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-r45h9"] Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.152166 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:00:59 crc kubenswrapper[4743]: E1122 10:00:59.153528 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.374806 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-5b2dc"] Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.376866 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.379479 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.379508 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.384871 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-5b2dc"] Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.491175 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9d8f4208-5b31-406d-a7cd-813b92c49e16-config-data-merged\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.491301 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9d8f4208-5b31-406d-a7cd-813b92c49e16-hm-ports\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.491371 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-config-data\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.491415 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-amphora-certs\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.491457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-scripts\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.491518 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-combined-ca-bundle\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.593475 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-combined-ca-bundle\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.593700 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9d8f4208-5b31-406d-a7cd-813b92c49e16-config-data-merged\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.593797 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9d8f4208-5b31-406d-a7cd-813b92c49e16-hm-ports\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.593841 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-config-data\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.593876 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-amphora-certs\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.593909 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-scripts\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.595016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9d8f4208-5b31-406d-a7cd-813b92c49e16-config-data-merged\") pod \"octavia-worker-5b2dc\" (UID: 
\"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.595216 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9d8f4208-5b31-406d-a7cd-813b92c49e16-hm-ports\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.598179 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-r45h9" event={"ID":"dea004b0-e9a6-4823-8692-af0a4c143d7d","Type":"ContainerStarted","Data":"782d1460617b7721a9cd82252d2166b2446f37bafa64ca7a5bedb547524977ef"} Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.600690 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-amphora-certs\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.601046 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-scripts\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.601524 4743 generic.go:334] "Generic (PLEG): container finished" podID="48db469e-30ab-4c16-9720-4c7d33df686f" containerID="51fbd1762b03462361a5ec9f6b3dbbe5a194e11bea406ceb9f115aa0fc864098" exitCode=0 Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.601617 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hb4xf" event={"ID":"48db469e-30ab-4c16-9720-4c7d33df686f","Type":"ContainerDied","Data":"51fbd1762b03462361a5ec9f6b3dbbe5a194e11bea406ceb9f115aa0fc864098"} Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.605195 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-config-data\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.606372 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d8f4208-5b31-406d-a7cd-813b92c49e16-combined-ca-bundle\") pod \"octavia-worker-5b2dc\" (UID: \"9d8f4208-5b31-406d-a7cd-813b92c49e16\") " pod="openstack/octavia-worker-5b2dc" Nov 22 10:00:59 crc kubenswrapper[4743]: I1122 10:00:59.705117 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-5b2dc" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.132254 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29396761-5jwng"] Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.133601 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.146237 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29396761-5jwng"] Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.207180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6cm9\" (UniqueName: \"kubernetes.io/projected/882fc21a-125e-4e4c-816d-d273f8bc6078-kube-api-access-w6cm9\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.207246 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-combined-ca-bundle\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.207338 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-fernet-keys\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.207391 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-config-data\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.309849 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-fernet-keys\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.309942 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-config-data\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.310070 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6cm9\" (UniqueName: \"kubernetes.io/projected/882fc21a-125e-4e4c-816d-d273f8bc6078-kube-api-access-w6cm9\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.310100 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-combined-ca-bundle\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.327715 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-combined-ca-bundle\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.328138 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-fernet-keys\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.328623 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-config-data\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.365078 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6cm9\" (UniqueName: \"kubernetes.io/projected/882fc21a-125e-4e4c-816d-d273f8bc6078-kube-api-access-w6cm9\") pod \"keystone-cron-29396761-5jwng\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") " pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.457199 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396761-5jwng" Nov 22 10:01:00 crc kubenswrapper[4743]: I1122 10:01:00.907252 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-5b2dc"] Nov 22 10:01:00 crc kubenswrapper[4743]: W1122 10:01:00.918046 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d8f4208_5b31_406d_a7cd_813b92c49e16.slice/crio-c1f259d16e3ebd05510beb68d04de1fece8ee8f62cbf9699ae0dde948ec69268 WatchSource:0}: Error finding container c1f259d16e3ebd05510beb68d04de1fece8ee8f62cbf9699ae0dde948ec69268: Status 404 returned error can't find the container with id c1f259d16e3ebd05510beb68d04de1fece8ee8f62cbf9699ae0dde948ec69268 Nov 22 10:01:01 crc kubenswrapper[4743]: I1122 10:01:01.089776 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29396761-5jwng"] Nov 22 10:01:01 crc kubenswrapper[4743]: W1122 10:01:01.093866 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod882fc21a_125e_4e4c_816d_d273f8bc6078.slice/crio-8ac17eb72d94170554a3f51a8f7aa36b18542ce854202cd468e867e0f39e66a2 WatchSource:0}: Error finding container 8ac17eb72d94170554a3f51a8f7aa36b18542ce854202cd468e867e0f39e66a2: Status 404 returned error can't find the container with id 8ac17eb72d94170554a3f51a8f7aa36b18542ce854202cd468e867e0f39e66a2 Nov 22 10:01:01 crc kubenswrapper[4743]: I1122 10:01:01.620670 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-r45h9" event={"ID":"dea004b0-e9a6-4823-8692-af0a4c143d7d","Type":"ContainerStarted","Data":"5d3c4c80415a89e1f4e51921533ab62157498ac8a659ec3e3f1028dfebe7cfa9"} Nov 22 10:01:01 crc kubenswrapper[4743]: I1122 10:01:01.622469 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5b2dc" 
event={"ID":"9d8f4208-5b31-406d-a7cd-813b92c49e16","Type":"ContainerStarted","Data":"c1f259d16e3ebd05510beb68d04de1fece8ee8f62cbf9699ae0dde948ec69268"} Nov 22 10:01:01 crc kubenswrapper[4743]: I1122 10:01:01.625222 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hb4xf" event={"ID":"48db469e-30ab-4c16-9720-4c7d33df686f","Type":"ContainerStarted","Data":"def15501d65f0b809eedf7538f8ae576db312d39787be5b6fa5953dc4be1192e"} Nov 22 10:01:01 crc kubenswrapper[4743]: I1122 10:01:01.625826 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-hb4xf" Nov 22 10:01:01 crc kubenswrapper[4743]: I1122 10:01:01.642782 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396761-5jwng" event={"ID":"882fc21a-125e-4e4c-816d-d273f8bc6078","Type":"ContainerStarted","Data":"bb52617c7e6f348d20996d45137b4baaa1d1bd43b844e0ba0908ead0fa3a302a"} Nov 22 10:01:01 crc kubenswrapper[4743]: I1122 10:01:01.642834 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396761-5jwng" event={"ID":"882fc21a-125e-4e4c-816d-d273f8bc6078","Type":"ContainerStarted","Data":"8ac17eb72d94170554a3f51a8f7aa36b18542ce854202cd468e867e0f39e66a2"} Nov 22 10:01:01 crc kubenswrapper[4743]: I1122 10:01:01.663957 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-hb4xf" podStartSLOduration=6.663938232 podStartE2EDuration="6.663938232s" podCreationTimestamp="2025-11-22 10:00:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:01:01.656660993 +0000 UTC m=+5935.363022065" watchObservedRunningTime="2025-11-22 10:01:01.663938232 +0000 UTC m=+5935.370299284" Nov 22 10:01:01 crc kubenswrapper[4743]: I1122 10:01:01.676273 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29396761-5jwng" podStartSLOduration=1.6762535760000001 podStartE2EDuration="1.676253576s" podCreationTimestamp="2025-11-22 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:01:01.669651746 +0000 UTC m=+5935.376012798" watchObservedRunningTime="2025-11-22 10:01:01.676253576 +0000 UTC m=+5935.382614628" Nov 22 10:01:02 crc kubenswrapper[4743]: I1122 10:01:02.671893 4743 generic.go:334] "Generic (PLEG): container finished" podID="dea004b0-e9a6-4823-8692-af0a4c143d7d" containerID="5d3c4c80415a89e1f4e51921533ab62157498ac8a659ec3e3f1028dfebe7cfa9" exitCode=0 Nov 22 10:01:02 crc kubenswrapper[4743]: I1122 10:01:02.672081 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-r45h9" event={"ID":"dea004b0-e9a6-4823-8692-af0a4c143d7d","Type":"ContainerDied","Data":"5d3c4c80415a89e1f4e51921533ab62157498ac8a659ec3e3f1028dfebe7cfa9"} Nov 22 10:01:03 crc kubenswrapper[4743]: I1122 10:01:03.682015 4743 generic.go:334] "Generic (PLEG): container finished" podID="882fc21a-125e-4e4c-816d-d273f8bc6078" containerID="bb52617c7e6f348d20996d45137b4baaa1d1bd43b844e0ba0908ead0fa3a302a" exitCode=0 Nov 22 10:01:03 crc kubenswrapper[4743]: I1122 10:01:03.682126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396761-5jwng" event={"ID":"882fc21a-125e-4e4c-816d-d273f8bc6078","Type":"ContainerDied","Data":"bb52617c7e6f348d20996d45137b4baaa1d1bd43b844e0ba0908ead0fa3a302a"} 
Nov 22 10:01:04 crc kubenswrapper[4743]: I1122 10:01:04.693621 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-r45h9" event={"ID":"dea004b0-e9a6-4823-8692-af0a4c143d7d","Type":"ContainerStarted","Data":"bf8074d9b6edd5ad39d5f2cdb96d6085bb976dd61c651e1c30a38b0ea87d3d37"}
Nov 22 10:01:04 crc kubenswrapper[4743]: I1122 10:01:04.717467 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-r45h9" podStartSLOduration=5.769579968 podStartE2EDuration="7.717443363s" podCreationTimestamp="2025-11-22 10:00:57 +0000 UTC" firstStartedPulling="2025-11-22 10:00:58.680492873 +0000 UTC m=+5932.386853935" lastFinishedPulling="2025-11-22 10:01:00.628356278 +0000 UTC m=+5934.334717330" observedRunningTime="2025-11-22 10:01:04.71140889 +0000 UTC m=+5938.417769962" watchObservedRunningTime="2025-11-22 10:01:04.717443363 +0000 UTC m=+5938.423804415"
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.124201 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396761-5jwng"
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.221185 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-config-data\") pod \"882fc21a-125e-4e4c-816d-d273f8bc6078\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") "
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.221288 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-fernet-keys\") pod \"882fc21a-125e-4e4c-816d-d273f8bc6078\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") "
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.221361 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-combined-ca-bundle\") pod \"882fc21a-125e-4e4c-816d-d273f8bc6078\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") "
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.221410 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6cm9\" (UniqueName: \"kubernetes.io/projected/882fc21a-125e-4e4c-816d-d273f8bc6078-kube-api-access-w6cm9\") pod \"882fc21a-125e-4e4c-816d-d273f8bc6078\" (UID: \"882fc21a-125e-4e4c-816d-d273f8bc6078\") "
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.242350 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "882fc21a-125e-4e4c-816d-d273f8bc6078" (UID: "882fc21a-125e-4e4c-816d-d273f8bc6078"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.244902 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/882fc21a-125e-4e4c-816d-d273f8bc6078-kube-api-access-w6cm9" (OuterVolumeSpecName: "kube-api-access-w6cm9") pod "882fc21a-125e-4e4c-816d-d273f8bc6078" (UID: "882fc21a-125e-4e4c-816d-d273f8bc6078"). InnerVolumeSpecName "kube-api-access-w6cm9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.282209 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "882fc21a-125e-4e4c-816d-d273f8bc6078" (UID: "882fc21a-125e-4e4c-816d-d273f8bc6078"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.294858 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-config-data" (OuterVolumeSpecName: "config-data") pod "882fc21a-125e-4e4c-816d-d273f8bc6078" (UID: "882fc21a-125e-4e4c-816d-d273f8bc6078"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.325037 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.325074 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-fernet-keys\") on node \"crc\" DevicePath \"\""
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.325083 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882fc21a-125e-4e4c-816d-d273f8bc6078-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.325093 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6cm9\" (UniqueName: \"kubernetes.io/projected/882fc21a-125e-4e4c-816d-d273f8bc6078-kube-api-access-w6cm9\") on node \"crc\" DevicePath \"\""
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.702552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5b2dc" event={"ID":"9d8f4208-5b31-406d-a7cd-813b92c49e16","Type":"ContainerStarted","Data":"f177cbb75e48ef95f50f96f0503b5059d49afa11bbd925b6b9315736faf831f8"}
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.705215 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396761-5jwng"
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.707661 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396761-5jwng" event={"ID":"882fc21a-125e-4e4c-816d-d273f8bc6078","Type":"ContainerDied","Data":"8ac17eb72d94170554a3f51a8f7aa36b18542ce854202cd468e867e0f39e66a2"}
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.707732 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ac17eb72d94170554a3f51a8f7aa36b18542ce854202cd468e867e0f39e66a2"
Nov 22 10:01:05 crc kubenswrapper[4743]: I1122 10:01:05.707761 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-r45h9"
Nov 22 10:01:07 crc kubenswrapper[4743]: I1122 10:01:07.736332 4743 generic.go:334] "Generic (PLEG): container finished" podID="9d8f4208-5b31-406d-a7cd-813b92c49e16" containerID="f177cbb75e48ef95f50f96f0503b5059d49afa11bbd925b6b9315736faf831f8" exitCode=0
Nov 22 10:01:07 crc kubenswrapper[4743]: I1122 10:01:07.736651 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5b2dc" event={"ID":"9d8f4208-5b31-406d-a7cd-813b92c49e16","Type":"ContainerDied","Data":"f177cbb75e48ef95f50f96f0503b5059d49afa11bbd925b6b9315736faf831f8"}
Nov 22 10:01:08 crc kubenswrapper[4743]: I1122 10:01:08.748178 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-5b2dc" event={"ID":"9d8f4208-5b31-406d-a7cd-813b92c49e16","Type":"ContainerStarted","Data":"de24f051b1c62c73b0dcd2a0b237aa134e845b9b9d28750d7635a89e26e7fb5e"}
Nov 22 10:01:08 crc kubenswrapper[4743]: I1122 10:01:08.749408 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-5b2dc"
Nov 22 10:01:11 crc kubenswrapper[4743]: I1122 10:01:11.270879 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-hb4xf"
Nov 22 10:01:11 crc kubenswrapper[4743]: I1122 10:01:11.296336 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-5b2dc" podStartSLOduration=8.464622172 podStartE2EDuration="12.296320082s" podCreationTimestamp="2025-11-22 10:00:59 +0000 UTC" firstStartedPulling="2025-11-22 10:01:00.925826945 +0000 UTC m=+5934.632187997" lastFinishedPulling="2025-11-22 10:01:04.757524864 +0000 UTC m=+5938.463885907" observedRunningTime="2025-11-22 10:01:08.78989653 +0000 UTC m=+5942.496257582" watchObservedRunningTime="2025-11-22 10:01:11.296320082 +0000 UTC m=+5945.002681134"
Nov 22 10:01:12 crc kubenswrapper[4743]: I1122 10:01:12.151556 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68"
Nov 22 10:01:12 crc kubenswrapper[4743]: E1122 10:01:12.152109 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:01:13 crc kubenswrapper[4743]: I1122 10:01:13.090733 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-r45h9"
Nov 22 10:01:14 crc kubenswrapper[4743]: I1122 10:01:14.736162 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-5b2dc"
Nov 22 10:01:26 crc kubenswrapper[4743]: I1122 10:01:26.152000 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68"
Nov 22 10:01:26 crc kubenswrapper[4743]: E1122 10:01:26.152872 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:01:39 crc kubenswrapper[4743]: I1122 10:01:39.152683 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68"
Nov 22 10:01:39 crc kubenswrapper[4743]: E1122 10:01:39.153544 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:01:41 crc kubenswrapper[4743]: I1122 10:01:41.043775 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-738f-account-create-42z4p"]
Nov 22 10:01:41 crc kubenswrapper[4743]: I1122 10:01:41.063978 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-h9md7"]
Nov 22 10:01:41 crc kubenswrapper[4743]: I1122 10:01:41.081282 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-738f-account-create-42z4p"]
Nov 22 10:01:41 crc kubenswrapper[4743]: I1122 10:01:41.091230 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-h9md7"]
Nov 22 10:01:41 crc kubenswrapper[4743]: I1122 10:01:41.161752 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee26b83-925c-4a05-9064-dda33c5dc513" path="/var/lib/kubelet/pods/bee26b83-925c-4a05-9064-dda33c5dc513/volumes"
Nov 22 10:01:41 crc kubenswrapper[4743]: I1122 10:01:41.162321 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db038e67-3aec-4bc7-be68-5f8e3cea3a83" path="/var/lib/kubelet/pods/db038e67-3aec-4bc7-be68-5f8e3cea3a83/volumes"
Nov 22 10:01:46 crc kubenswrapper[4743]: I1122 10:01:46.030412 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ssk2b"]
Nov 22 10:01:46 crc kubenswrapper[4743]: I1122 10:01:46.045725 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ssk2b"]
Nov 22 10:01:47 crc kubenswrapper[4743]: I1122 10:01:47.167135 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452d5af5-a474-4459-b454-a1600d09fba8" path="/var/lib/kubelet/pods/452d5af5-a474-4459-b454-a1600d09fba8/volumes"
Nov 22 10:01:53 crc kubenswrapper[4743]: I1122 10:01:53.152883 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68"
Nov 22 10:01:53 crc kubenswrapper[4743]: E1122 10:01:53.153738 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.650690 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5847b769d5-2dvqm"] Nov 22 10:02:02 crc kubenswrapper[4743]: E1122 10:02:02.651658 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="882fc21a-125e-4e4c-816d-d273f8bc6078" containerName="keystone-cron" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.651673 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="882fc21a-125e-4e4c-816d-d273f8bc6078" containerName="keystone-cron" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.651856 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="882fc21a-125e-4e4c-816d-d273f8bc6078" containerName="keystone-cron" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.653008 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.658195 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.658413 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.658570 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.658827 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-fsxx8" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.673889 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5847b769d5-2dvqm"] Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.717337 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.717592 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b9de402d-3bf9-4e99-bac2-6f241134b16a" containerName="glance-log" containerID="cri-o://d3f44e6f73d7c2e9f49e5a06efeed4d63eb89e965c90dd505edb518de23da647" gracePeriod=30 Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.718023 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b9de402d-3bf9-4e99-bac2-6f241134b16a" containerName="glance-httpd" containerID="cri-o://88a9c587c0e34f007e0973a7375f9960c50ccffc10eaad7452e7a87c2ff01fc4" gracePeriod=30 Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.762410 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-config-data\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.762715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-horizon-secret-key\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.762824 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-scripts\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.762906 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvpmg\" (UniqueName: \"kubernetes.io/projected/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-kube-api-access-pvpmg\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.762896 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6568dc69d5-6d7ql"] Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.763076 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-logs\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.764765 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.774243 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6568dc69d5-6d7ql"] Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.822299 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.823649 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="31ccda11-3d5b-4530-9cbe-e3a994610b08" containerName="glance-log" containerID="cri-o://4be7b35b414a045785d62915dd2fd1f1cf18324df0d99b8222170e6f5ba65c09" gracePeriod=30 Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.823716 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="31ccda11-3d5b-4530-9cbe-e3a994610b08" containerName="glance-httpd" containerID="cri-o://b61db55f10eacbf9d0ad87b0c1c93c80f3ed1bb015e8b4a8cae8f464dfe032dc" gracePeriod=30 Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.864728 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-horizon-secret-key\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.864794 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-logs\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" 
Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.864846 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdllg\" (UniqueName: \"kubernetes.io/projected/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-kube-api-access-kdllg\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.864895 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-config-data\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.864922 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-horizon-secret-key\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.864952 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-config-data\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.864977 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-scripts\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.865002 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvpmg\" (UniqueName: \"kubernetes.io/projected/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-kube-api-access-pvpmg\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.865027 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-scripts\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.865058 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-logs\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.865536 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-logs\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.866197 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-scripts\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.866669 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-config-data\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.870735 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-horizon-secret-key\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.881280 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvpmg\" (UniqueName: \"kubernetes.io/projected/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-kube-api-access-pvpmg\") pod \"horizon-5847b769d5-2dvqm\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.967458 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-horizon-secret-key\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.967619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-logs\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.967733 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdllg\" (UniqueName: \"kubernetes.io/projected/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-kube-api-access-kdllg\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.967869 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-config-data\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.967913 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-scripts\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.968002 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-logs\") pod \"horizon-6568dc69d5-6d7ql\" (UID: 
\"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.968887 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-scripts\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.969355 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-config-data\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.972946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-horizon-secret-key\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.981893 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:02 crc kubenswrapper[4743]: I1122 10:02:02.982870 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdllg\" (UniqueName: \"kubernetes.io/projected/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-kube-api-access-kdllg\") pod \"horizon-6568dc69d5-6d7ql\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.087519 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.113634 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5847b769d5-2dvqm"] Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.173764 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-864787bfcc-j7q6g"] Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.175847 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-864787bfcc-j7q6g"] Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.175884 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.273221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b445ab4a-9430-4c45-bd96-006a3c22598e-horizon-secret-key\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.273790 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-scripts\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.273998 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzpjh\" (UniqueName: \"kubernetes.io/projected/b445ab4a-9430-4c45-bd96-006a3c22598e-kube-api-access-lzpjh\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.274044 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b445ab4a-9430-4c45-bd96-006a3c22598e-logs\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.274147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-config-data\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.315566 4743 generic.go:334] "Generic (PLEG): container finished" podID="b9de402d-3bf9-4e99-bac2-6f241134b16a" containerID="d3f44e6f73d7c2e9f49e5a06efeed4d63eb89e965c90dd505edb518de23da647" exitCode=143 Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.315675 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b9de402d-3bf9-4e99-bac2-6f241134b16a","Type":"ContainerDied","Data":"d3f44e6f73d7c2e9f49e5a06efeed4d63eb89e965c90dd505edb518de23da647"} Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.325399 4743 generic.go:334] "Generic (PLEG): container finished" podID="31ccda11-3d5b-4530-9cbe-e3a994610b08" containerID="4be7b35b414a045785d62915dd2fd1f1cf18324df0d99b8222170e6f5ba65c09" exitCode=143 Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.325453 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31ccda11-3d5b-4530-9cbe-e3a994610b08","Type":"ContainerDied","Data":"4be7b35b414a045785d62915dd2fd1f1cf18324df0d99b8222170e6f5ba65c09"} Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.377127 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b445ab4a-9430-4c45-bd96-006a3c22598e-horizon-secret-key\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " 
pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.377174 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-scripts\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.377202 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzpjh\" (UniqueName: \"kubernetes.io/projected/b445ab4a-9430-4c45-bd96-006a3c22598e-kube-api-access-lzpjh\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.377258 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b445ab4a-9430-4c45-bd96-006a3c22598e-logs\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.377317 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-config-data\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.377942 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-scripts\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.377974 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b445ab4a-9430-4c45-bd96-006a3c22598e-logs\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.378542 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-config-data\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.383343 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b445ab4a-9430-4c45-bd96-006a3c22598e-horizon-secret-key\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.399080 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzpjh\" (UniqueName: \"kubernetes.io/projected/b445ab4a-9430-4c45-bd96-006a3c22598e-kube-api-access-lzpjh\") pod \"horizon-864787bfcc-j7q6g\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.498298 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.682718 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5847b769d5-2dvqm"] Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.691270 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6568dc69d5-6d7ql"] Nov 22 10:02:03 crc kubenswrapper[4743]: W1122 10:02:03.694097 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5d765c2_95ac_4af8_ad8c_9c7ed79c62fb.slice/crio-7438b03d377708330a99044c78ea084b66ccbffa4fe4ed8fdf584e082171a398 WatchSource:0}: Error finding container 7438b03d377708330a99044c78ea084b66ccbffa4fe4ed8fdf584e082171a398: Status 404 returned error can't find the container with id 7438b03d377708330a99044c78ea084b66ccbffa4fe4ed8fdf584e082171a398 Nov 22 10:02:03 crc kubenswrapper[4743]: I1122 10:02:03.960562 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-864787bfcc-j7q6g"] Nov 22 10:02:03 crc kubenswrapper[4743]: W1122 10:02:03.978000 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb445ab4a_9430_4c45_bd96_006a3c22598e.slice/crio-f17f88dafa1db6a8c70b36780a13f99f5e8e3267948dd6ecbe18e54004345013 WatchSource:0}: Error finding container f17f88dafa1db6a8c70b36780a13f99f5e8e3267948dd6ecbe18e54004345013: Status 404 returned error can't find the container with id f17f88dafa1db6a8c70b36780a13f99f5e8e3267948dd6ecbe18e54004345013 Nov 22 10:02:04 crc kubenswrapper[4743]: I1122 10:02:04.337330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5847b769d5-2dvqm" event={"ID":"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb","Type":"ContainerStarted","Data":"7438b03d377708330a99044c78ea084b66ccbffa4fe4ed8fdf584e082171a398"} Nov 22 10:02:04 crc kubenswrapper[4743]: I1122 10:02:04.338339 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-864787bfcc-j7q6g" event={"ID":"b445ab4a-9430-4c45-bd96-006a3c22598e","Type":"ContainerStarted","Data":"f17f88dafa1db6a8c70b36780a13f99f5e8e3267948dd6ecbe18e54004345013"} Nov 22 10:02:04 crc kubenswrapper[4743]: I1122 10:02:04.339362 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6568dc69d5-6d7ql" event={"ID":"cd9642c9-2a43-40ac-bda7-37b8c0d8b668","Type":"ContainerStarted","Data":"e770aee2c5eee4d19cd4266303eae10002990a927bc62e407350e12d7d9171ea"} Nov 22 10:02:06 crc kubenswrapper[4743]: I1122 10:02:06.372479 4743 generic.go:334] "Generic (PLEG): container finished" podID="31ccda11-3d5b-4530-9cbe-e3a994610b08" containerID="b61db55f10eacbf9d0ad87b0c1c93c80f3ed1bb015e8b4a8cae8f464dfe032dc" exitCode=0 Nov 22 10:02:06 crc kubenswrapper[4743]: I1122 10:02:06.372568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31ccda11-3d5b-4530-9cbe-e3a994610b08","Type":"ContainerDied","Data":"b61db55f10eacbf9d0ad87b0c1c93c80f3ed1bb015e8b4a8cae8f464dfe032dc"} Nov 22 10:02:06 crc kubenswrapper[4743]: I1122 10:02:06.375430 4743 generic.go:334] "Generic (PLEG): container finished" podID="b9de402d-3bf9-4e99-bac2-6f241134b16a" containerID="88a9c587c0e34f007e0973a7375f9960c50ccffc10eaad7452e7a87c2ff01fc4" exitCode=0 Nov 22 10:02:06 crc kubenswrapper[4743]: I1122 10:02:06.375471 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"b9de402d-3bf9-4e99-bac2-6f241134b16a","Type":"ContainerDied","Data":"88a9c587c0e34f007e0973a7375f9960c50ccffc10eaad7452e7a87c2ff01fc4"} Nov 22 10:02:08 crc kubenswrapper[4743]: I1122 10:02:08.151886 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:02:08 crc kubenswrapper[4743]: E1122 10:02:08.152830 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.342338 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.353681 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.464479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b9de402d-3bf9-4e99-bac2-6f241134b16a","Type":"ContainerDied","Data":"572ed50d92610d1a1422c0acf6dbaabc56378b2049bc6017fa0f41cbb9800c54"} Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.464558 4743 scope.go:117] "RemoveContainer" containerID="88a9c587c0e34f007e0973a7375f9960c50ccffc10eaad7452e7a87c2ff01fc4" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.464568 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.467095 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"31ccda11-3d5b-4530-9cbe-e3a994610b08","Type":"ContainerDied","Data":"f9cadb364ad3b22c1b5d69f557279b6a5fbe3913b087285993a29b946de41aac"} Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.467237 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.491498 4743 scope.go:117] "RemoveContainer" containerID="d3f44e6f73d7c2e9f49e5a06efeed4d63eb89e965c90dd505edb518de23da647" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.498636 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-scripts\") pod \"b9de402d-3bf9-4e99-bac2-6f241134b16a\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.498693 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-ceph\") pod \"31ccda11-3d5b-4530-9cbe-e3a994610b08\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.498730 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-httpd-run\") pod \"31ccda11-3d5b-4530-9cbe-e3a994610b08\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.498792 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-httpd-run\") pod \"b9de402d-3bf9-4e99-bac2-6f241134b16a\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.498854 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-combined-ca-bundle\") pod \"31ccda11-3d5b-4530-9cbe-e3a994610b08\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.498888 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-ceph\") pod \"b9de402d-3bf9-4e99-bac2-6f241134b16a\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.498920 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-scripts\") pod \"31ccda11-3d5b-4530-9cbe-e3a994610b08\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.498969 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-config-data\") pod \"b9de402d-3bf9-4e99-bac2-6f241134b16a\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.498999 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-config-data\") pod \"31ccda11-3d5b-4530-9cbe-e3a994610b08\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.499064 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n44dx\" (UniqueName: 
\"kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-kube-api-access-n44dx\") pod \"31ccda11-3d5b-4530-9cbe-e3a994610b08\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.499095 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-logs\") pod \"b9de402d-3bf9-4e99-bac2-6f241134b16a\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.499144 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp7l4\" (UniqueName: \"kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-kube-api-access-jp7l4\") pod \"b9de402d-3bf9-4e99-bac2-6f241134b16a\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.499202 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-logs\") pod \"31ccda11-3d5b-4530-9cbe-e3a994610b08\" (UID: \"31ccda11-3d5b-4530-9cbe-e3a994610b08\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.499238 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-combined-ca-bundle\") pod \"b9de402d-3bf9-4e99-bac2-6f241134b16a\" (UID: \"b9de402d-3bf9-4e99-bac2-6f241134b16a\") " Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.500761 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-logs" (OuterVolumeSpecName: "logs") pod "31ccda11-3d5b-4530-9cbe-e3a994610b08" (UID: "31ccda11-3d5b-4530-9cbe-e3a994610b08"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.501094 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "31ccda11-3d5b-4530-9cbe-e3a994610b08" (UID: "31ccda11-3d5b-4530-9cbe-e3a994610b08"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.501592 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b9de402d-3bf9-4e99-bac2-6f241134b16a" (UID: "b9de402d-3bf9-4e99-bac2-6f241134b16a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.501685 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-logs" (OuterVolumeSpecName: "logs") pod "b9de402d-3bf9-4e99-bac2-6f241134b16a" (UID: "b9de402d-3bf9-4e99-bac2-6f241134b16a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.506831 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-ceph" (OuterVolumeSpecName: "ceph") pod "b9de402d-3bf9-4e99-bac2-6f241134b16a" (UID: "b9de402d-3bf9-4e99-bac2-6f241134b16a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.512744 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-scripts" (OuterVolumeSpecName: "scripts") pod "31ccda11-3d5b-4530-9cbe-e3a994610b08" (UID: "31ccda11-3d5b-4530-9cbe-e3a994610b08"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.514284 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-scripts" (OuterVolumeSpecName: "scripts") pod "b9de402d-3bf9-4e99-bac2-6f241134b16a" (UID: "b9de402d-3bf9-4e99-bac2-6f241134b16a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.515154 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-ceph" (OuterVolumeSpecName: "ceph") pod "31ccda11-3d5b-4530-9cbe-e3a994610b08" (UID: "31ccda11-3d5b-4530-9cbe-e3a994610b08"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.517912 4743 scope.go:117] "RemoveContainer" containerID="b61db55f10eacbf9d0ad87b0c1c93c80f3ed1bb015e8b4a8cae8f464dfe032dc" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.521272 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-kube-api-access-n44dx" (OuterVolumeSpecName: "kube-api-access-n44dx") pod "31ccda11-3d5b-4530-9cbe-e3a994610b08" (UID: "31ccda11-3d5b-4530-9cbe-e3a994610b08"). InnerVolumeSpecName "kube-api-access-n44dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.521400 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-kube-api-access-jp7l4" (OuterVolumeSpecName: "kube-api-access-jp7l4") pod "b9de402d-3bf9-4e99-bac2-6f241134b16a" (UID: "b9de402d-3bf9-4e99-bac2-6f241134b16a"). InnerVolumeSpecName "kube-api-access-jp7l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.541155 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9de402d-3bf9-4e99-bac2-6f241134b16a" (UID: "b9de402d-3bf9-4e99-bac2-6f241134b16a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.548894 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31ccda11-3d5b-4530-9cbe-e3a994610b08" (UID: "31ccda11-3d5b-4530-9cbe-e3a994610b08"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.582716 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-config-data" (OuterVolumeSpecName: "config-data") pod "b9de402d-3bf9-4e99-bac2-6f241134b16a" (UID: "b9de402d-3bf9-4e99-bac2-6f241134b16a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.586304 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-config-data" (OuterVolumeSpecName: "config-data") pod "31ccda11-3d5b-4530-9cbe-e3a994610b08" (UID: "31ccda11-3d5b-4530-9cbe-e3a994610b08"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601599 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp7l4\" (UniqueName: \"kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-kube-api-access-jp7l4\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601639 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-logs\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601656 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601667 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601681 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601691 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/31ccda11-3d5b-4530-9cbe-e3a994610b08-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601703 4743 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-httpd-run\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601713 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601724 4743 reconciler_common.go:293] "Volume 
detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b9de402d-3bf9-4e99-bac2-6f241134b16a-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601735 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601745 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9de402d-3bf9-4e99-bac2-6f241134b16a-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601755 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31ccda11-3d5b-4530-9cbe-e3a994610b08-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601768 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n44dx\" (UniqueName: \"kubernetes.io/projected/31ccda11-3d5b-4530-9cbe-e3a994610b08-kube-api-access-n44dx\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.601779 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9de402d-3bf9-4e99-bac2-6f241134b16a-logs\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.720498 4743 scope.go:117] "RemoveContainer" containerID="4be7b35b414a045785d62915dd2fd1f1cf18324df0d99b8222170e6f5ba65c09" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.841343 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.868652 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.898277 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.906674 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.929644 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 10:02:13 crc kubenswrapper[4743]: E1122 10:02:13.930092 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ccda11-3d5b-4530-9cbe-e3a994610b08" containerName="glance-httpd" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.930109 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ccda11-3d5b-4530-9cbe-e3a994610b08" containerName="glance-httpd" Nov 22 10:02:13 crc kubenswrapper[4743]: E1122 10:02:13.930139 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31ccda11-3d5b-4530-9cbe-e3a994610b08" containerName="glance-log" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.930146 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="31ccda11-3d5b-4530-9cbe-e3a994610b08" containerName="glance-log" Nov 22 10:02:13 crc kubenswrapper[4743]: E1122 10:02:13.930162 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9de402d-3bf9-4e99-bac2-6f241134b16a" containerName="glance-log" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.930168 4743 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b9de402d-3bf9-4e99-bac2-6f241134b16a" containerName="glance-log" Nov 22 10:02:13 crc kubenswrapper[4743]: E1122 10:02:13.930181 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9de402d-3bf9-4e99-bac2-6f241134b16a" containerName="glance-httpd" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.930186 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9de402d-3bf9-4e99-bac2-6f241134b16a" containerName="glance-httpd" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.930382 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9de402d-3bf9-4e99-bac2-6f241134b16a" containerName="glance-httpd" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.930397 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9de402d-3bf9-4e99-bac2-6f241134b16a" containerName="glance-log" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.930422 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="31ccda11-3d5b-4530-9cbe-e3a994610b08" containerName="glance-log" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.930438 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="31ccda11-3d5b-4530-9cbe-e3a994610b08" containerName="glance-httpd" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.931675 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.936750 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.937034 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.937282 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vq74t" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.949066 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.951137 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.955975 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Nov 22 10:02:13 crc kubenswrapper[4743]: I1122 10:02:13.973660 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.012640 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.115671 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b576b318-2d3e-40b4-bdb2-3582ab998152-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.115735 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b576b318-2d3e-40b4-bdb2-3582ab998152-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.115760 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-logs\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.115786 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzqn9\" (UniqueName: \"kubernetes.io/projected/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-kube-api-access-dzqn9\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.115847 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-ceph\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.115868 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6qhm\" (UniqueName: \"kubernetes.io/projected/b576b318-2d3e-40b4-bdb2-3582ab998152-kube-api-access-m6qhm\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.115900 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b576b318-2d3e-40b4-bdb2-3582ab998152-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.115934 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b576b318-2d3e-40b4-bdb2-3582ab998152-logs\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.115958 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.115985 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.116003 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.116017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.116033 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b576b318-2d3e-40b4-bdb2-3582ab998152-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.116051 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b576b318-2d3e-40b4-bdb2-3582ab998152-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.217546 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-ceph\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.217619 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6qhm\" (UniqueName: \"kubernetes.io/projected/b576b318-2d3e-40b4-bdb2-3582ab998152-kube-api-access-m6qhm\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.217678 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b576b318-2d3e-40b4-bdb2-3582ab998152-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.217719 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b576b318-2d3e-40b4-bdb2-3582ab998152-logs\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.217760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.217777 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.217795 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.217808 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.217841 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b576b318-2d3e-40b4-bdb2-3582ab998152-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.218334 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b576b318-2d3e-40b4-bdb2-3582ab998152-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.218335 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.218421 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/b576b318-2d3e-40b4-bdb2-3582ab998152-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.218550 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b576b318-2d3e-40b4-bdb2-3582ab998152-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.218687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-logs\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.219752 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzqn9\" (UniqueName: \"kubernetes.io/projected/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-kube-api-access-dzqn9\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.219907 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-logs\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.222052 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-scripts\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.224901 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-config-data\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.225158 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b576b318-2d3e-40b4-bdb2-3582ab998152-logs\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.227224 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b576b318-2d3e-40b4-bdb2-3582ab998152-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.228002 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-ceph\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " 
pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.228524 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/b576b318-2d3e-40b4-bdb2-3582ab998152-ceph\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.229365 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b576b318-2d3e-40b4-bdb2-3582ab998152-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.229973 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b576b318-2d3e-40b4-bdb2-3582ab998152-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.231153 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b576b318-2d3e-40b4-bdb2-3582ab998152-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.235650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.243314 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzqn9\" (UniqueName: \"kubernetes.io/projected/7c5e8908-5e95-40ef-bb4d-940cc5c38e49-kube-api-access-dzqn9\") pod \"glance-default-external-api-0\" (UID: \"7c5e8908-5e95-40ef-bb4d-940cc5c38e49\") " pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.244086 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6qhm\" (UniqueName: \"kubernetes.io/projected/b576b318-2d3e-40b4-bdb2-3582ab998152-kube-api-access-m6qhm\") pod \"glance-default-internal-api-0\" (UID: \"b576b318-2d3e-40b4-bdb2-3582ab998152\") " pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.269568 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.318149 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:14 crc kubenswrapper[4743]: I1122 10:02:14.905524 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.003932 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.042982 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-vdqlr"] Nov 22 10:02:15 crc kubenswrapper[4743]: W1122 10:02:15.051474 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c5e8908_5e95_40ef_bb4d_940cc5c38e49.slice/crio-7ef51d6121d15fd3d5f041d7d930c2ca55bcb0fc52d2d472a4d634be636a557e WatchSource:0}: Error finding container 7ef51d6121d15fd3d5f041d7d930c2ca55bcb0fc52d2d472a4d634be636a557e: Status 404 returned error can't find the container with id 7ef51d6121d15fd3d5f041d7d930c2ca55bcb0fc52d2d472a4d634be636a557e Nov 22 10:02:15 crc kubenswrapper[4743]: W1122 10:02:15.052662 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb576b318_2d3e_40b4_bdb2_3582ab998152.slice/crio-52dec7f00edcf1793c021478ff855e9ed5786ad3eb299f4d985ea233c5c9cd11 WatchSource:0}: Error finding container 52dec7f00edcf1793c021478ff855e9ed5786ad3eb299f4d985ea233c5c9cd11: Status 404 returned error can't find the container with id 52dec7f00edcf1793c021478ff855e9ed5786ad3eb299f4d985ea233c5c9cd11 Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.052927 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-e01a-account-create-rs4p5"] Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.063715 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-e01a-account-create-rs4p5"] Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.074001 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-vdqlr"] Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.164016 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31ccda11-3d5b-4530-9cbe-e3a994610b08" path="/var/lib/kubelet/pods/31ccda11-3d5b-4530-9cbe-e3a994610b08/volumes" Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.164869 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c8fcd80-31e1-4905-91f1-21e3cee7cbf3" path="/var/lib/kubelet/pods/4c8fcd80-31e1-4905-91f1-21e3cee7cbf3/volumes" Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.165726 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9de402d-3bf9-4e99-bac2-6f241134b16a" path="/var/lib/kubelet/pods/b9de402d-3bf9-4e99-bac2-6f241134b16a/volumes" Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.168827 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5907bd9-c6b9-4676-8884-fcbc49d5986a" path="/var/lib/kubelet/pods/c5907bd9-c6b9-4676-8884-fcbc49d5986a/volumes" Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.517840 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c5e8908-5e95-40ef-bb4d-940cc5c38e49","Type":"ContainerStarted","Data":"7ef51d6121d15fd3d5f041d7d930c2ca55bcb0fc52d2d472a4d634be636a557e"} Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.519184 4743 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b576b318-2d3e-40b4-bdb2-3582ab998152","Type":"ContainerStarted","Data":"52dec7f00edcf1793c021478ff855e9ed5786ad3eb299f4d985ea233c5c9cd11"} Nov 22 10:02:15 crc kubenswrapper[4743]: I1122 10:02:15.520398 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5847b769d5-2dvqm" event={"ID":"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb","Type":"ContainerStarted","Data":"833ab1d927a41158271bae91f0141202046adc2d2dee2cbe9d69ed2d066f93df"} Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.531954 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c5e8908-5e95-40ef-bb4d-940cc5c38e49","Type":"ContainerStarted","Data":"f99470ea99123130a62669847c9c374f612dcf96e6997575ce42dc3f8fec1f76"} Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.535120 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b576b318-2d3e-40b4-bdb2-3582ab998152","Type":"ContainerStarted","Data":"9f375b2b4f6ba8682ec014733d96e0956e55e6a202ef10f80497578105a79b0a"} Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.536925 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5847b769d5-2dvqm" event={"ID":"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb","Type":"ContainerStarted","Data":"5bf47d163bae53778bfd5f0875fff7d8506965099ab5baffabd97e9192770589"} Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.537102 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5847b769d5-2dvqm" podUID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" containerName="horizon-log" containerID="cri-o://833ab1d927a41158271bae91f0141202046adc2d2dee2cbe9d69ed2d066f93df" gracePeriod=30 Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.538069 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5847b769d5-2dvqm" podUID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" containerName="horizon" containerID="cri-o://5bf47d163bae53778bfd5f0875fff7d8506965099ab5baffabd97e9192770589" gracePeriod=30 Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.544351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-864787bfcc-j7q6g" event={"ID":"b445ab4a-9430-4c45-bd96-006a3c22598e","Type":"ContainerStarted","Data":"e7fac22848c24cbb3a1ba66cdf5b985d198f13c3e69dab4a759754935b1ff1b8"} Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.544420 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-864787bfcc-j7q6g" event={"ID":"b445ab4a-9430-4c45-bd96-006a3c22598e","Type":"ContainerStarted","Data":"cb36c7c80b2e712f0bd470bfc184dfcc805f9602785042793b521f21829f8c35"} Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.548607 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6568dc69d5-6d7ql" event={"ID":"cd9642c9-2a43-40ac-bda7-37b8c0d8b668","Type":"ContainerStarted","Data":"137cbcd7c0e6ca96ae73e48375638b6be5af1bede962be55854591705f47896f"} Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.548647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6568dc69d5-6d7ql" event={"ID":"cd9642c9-2a43-40ac-bda7-37b8c0d8b668","Type":"ContainerStarted","Data":"0a10a9c5f4731b886f789cb5457256af6b58c411284080064e52a9fc3ae66f5e"} Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.558032 4743 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/horizon-5847b769d5-2dvqm" podStartSLOduration=4.054841053 podStartE2EDuration="14.558013182s" podCreationTimestamp="2025-11-22 10:02:02 +0000 UTC" firstStartedPulling="2025-11-22 10:02:03.697484762 +0000 UTC m=+5997.403845814" lastFinishedPulling="2025-11-22 10:02:14.200656891 +0000 UTC m=+6007.907017943" observedRunningTime="2025-11-22 10:02:16.553904494 +0000 UTC m=+6010.260265566" watchObservedRunningTime="2025-11-22 10:02:16.558013182 +0000 UTC m=+6010.264374234" Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.582138 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6568dc69d5-6d7ql" podStartSLOduration=2.751017741 podStartE2EDuration="14.582118614s" podCreationTimestamp="2025-11-22 10:02:02 +0000 UTC" firstStartedPulling="2025-11-22 10:02:03.695149625 +0000 UTC m=+5997.401510677" lastFinishedPulling="2025-11-22 10:02:15.526250498 +0000 UTC m=+6009.232611550" observedRunningTime="2025-11-22 10:02:16.575867595 +0000 UTC m=+6010.282228667" watchObservedRunningTime="2025-11-22 10:02:16.582118614 +0000 UTC m=+6010.288479666" Nov 22 10:02:16 crc kubenswrapper[4743]: I1122 10:02:16.603525 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-864787bfcc-j7q6g" podStartSLOduration=1.941933987 podStartE2EDuration="13.603478278s" podCreationTimestamp="2025-11-22 10:02:03 +0000 UTC" firstStartedPulling="2025-11-22 10:02:03.980764721 +0000 UTC m=+5997.687125763" lastFinishedPulling="2025-11-22 10:02:15.642309002 +0000 UTC m=+6009.348670054" observedRunningTime="2025-11-22 10:02:16.598163685 +0000 UTC m=+6010.304524757" watchObservedRunningTime="2025-11-22 10:02:16.603478278 +0000 UTC m=+6010.309839330" Nov 22 10:02:17 crc kubenswrapper[4743]: I1122 10:02:17.562845 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b576b318-2d3e-40b4-bdb2-3582ab998152","Type":"ContainerStarted","Data":"bb6f214a309686c3dab52bbf898bc00d8d74876c4539c3a5115d874491e7c4a6"} Nov 22 10:02:17 crc kubenswrapper[4743]: I1122 10:02:17.568950 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"7c5e8908-5e95-40ef-bb4d-940cc5c38e49","Type":"ContainerStarted","Data":"4174a4d816b9f79e6f89724b76753f76c984229c00112eb5c154bc67aa7d1779"} Nov 22 10:02:17 crc kubenswrapper[4743]: I1122 10:02:17.588073 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.588054335 podStartE2EDuration="4.588054335s" podCreationTimestamp="2025-11-22 10:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:02:17.584929306 +0000 UTC m=+6011.291290358" watchObservedRunningTime="2025-11-22 10:02:17.588054335 +0000 UTC m=+6011.294415387" Nov 22 10:02:17 crc kubenswrapper[4743]: I1122 10:02:17.615338 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.615313239 podStartE2EDuration="4.615313239s" podCreationTimestamp="2025-11-22 10:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:02:17.606150125 +0000 UTC m=+6011.312511187" watchObservedRunningTime="2025-11-22 10:02:17.615313239 +0000 UTC m=+6011.321674291" Nov 22 10:02:19 crc 
kubenswrapper[4743]: I1122 10:02:19.160269 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:02:19 crc kubenswrapper[4743]: E1122 10:02:19.160876 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:02:22 crc kubenswrapper[4743]: I1122 10:02:22.982664 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:23 crc kubenswrapper[4743]: I1122 10:02:23.088327 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:23 crc kubenswrapper[4743]: I1122 10:02:23.088380 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:23 crc kubenswrapper[4743]: I1122 10:02:23.499097 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:23 crc kubenswrapper[4743]: I1122 10:02:23.499414 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.270719 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.270802 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.302303 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.314236 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.326930 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.329521 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.366008 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.380309 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.642021 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.642069 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 10:02:24 crc kubenswrapper[4743]: I1122 10:02:24.642080 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:24 
crc kubenswrapper[4743]: I1122 10:02:24.642089 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Nov 22 10:02:27 crc kubenswrapper[4743]: I1122 10:02:27.079271 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 10:02:27 crc kubenswrapper[4743]: I1122 10:02:27.079882 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 10:02:27 crc kubenswrapper[4743]: I1122 10:02:27.081658 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Nov 22 10:02:27 crc kubenswrapper[4743]: I1122 10:02:27.140844 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:27 crc kubenswrapper[4743]: I1122 10:02:27.140943 4743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Nov 22 10:02:27 crc kubenswrapper[4743]: I1122 10:02:27.198394 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Nov 22 10:02:31 crc kubenswrapper[4743]: I1122 10:02:31.039238 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-fmrsm"] Nov 22 10:02:31 crc kubenswrapper[4743]: I1122 10:02:31.051349 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-fmrsm"] Nov 22 10:02:31 crc kubenswrapper[4743]: I1122 10:02:31.162429 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeda2833-ef57-4a51-a3fa-e0b82f667688" path="/var/lib/kubelet/pods/aeda2833-ef57-4a51-a3fa-e0b82f667688/volumes" Nov 22 10:02:33 crc kubenswrapper[4743]: I1122 10:02:33.090825 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6568dc69d5-6d7ql" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Nov 22 10:02:33 crc kubenswrapper[4743]: I1122 10:02:33.152466 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:02:33 crc kubenswrapper[4743]: E1122 10:02:33.152737 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:02:33 crc kubenswrapper[4743]: I1122 10:02:33.499892 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-864787bfcc-j7q6g" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Nov 22 10:02:33 crc kubenswrapper[4743]: I1122 10:02:33.648870 4743 scope.go:117] "RemoveContainer" containerID="c99457232b7d3d01c35d1d2f4096f94b5647e1890b186c840eeda8276f9f7619" Nov 22 10:02:33 crc kubenswrapper[4743]: I1122 10:02:33.733989 4743 scope.go:117] "RemoveContainer" containerID="37dae494fde993647e438f51403a24847e5950cbbadfb5dbfed3ed6faa4c0925" Nov 
22 10:02:33 crc kubenswrapper[4743]: I1122 10:02:33.782213 4743 scope.go:117] "RemoveContainer" containerID="1957840df6055a399eb2bf8a4dc0e2351b7963713df5df6abeb1cbb28ca04eea" Nov 22 10:02:33 crc kubenswrapper[4743]: I1122 10:02:33.910035 4743 scope.go:117] "RemoveContainer" containerID="15f06d505d36c1a5d99f67be71b358f32acd46652b4c6977d23e7a83508314df" Nov 22 10:02:33 crc kubenswrapper[4743]: I1122 10:02:33.961949 4743 scope.go:117] "RemoveContainer" containerID="a89fdc4a1c1c277300b8d7e936b419bff40daf5844ce29720b519778468df321" Nov 22 10:02:34 crc kubenswrapper[4743]: I1122 10:02:34.031295 4743 scope.go:117] "RemoveContainer" containerID="454cd8c9877a28e42f0be77333c2ef8b29bc591614ef43bcbdb41dcf78388076" Nov 22 10:02:45 crc kubenswrapper[4743]: I1122 10:02:45.044301 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:45 crc kubenswrapper[4743]: I1122 10:02:45.777391 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:46 crc kubenswrapper[4743]: I1122 10:02:46.152854 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:02:46 crc kubenswrapper[4743]: E1122 10:02:46.153205 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:02:46 crc kubenswrapper[4743]: I1122 10:02:46.863200 4743 generic.go:334] "Generic (PLEG): container finished" podID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" containerID="5bf47d163bae53778bfd5f0875fff7d8506965099ab5baffabd97e9192770589" exitCode=137 Nov 22 10:02:46 crc kubenswrapper[4743]: I1122 10:02:46.863693 4743 generic.go:334] "Generic (PLEG): container finished" podID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" containerID="833ab1d927a41158271bae91f0141202046adc2d2dee2cbe9d69ed2d066f93df" exitCode=137 Nov 22 10:02:46 crc kubenswrapper[4743]: I1122 10:02:46.863724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5847b769d5-2dvqm" event={"ID":"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb","Type":"ContainerDied","Data":"5bf47d163bae53778bfd5f0875fff7d8506965099ab5baffabd97e9192770589"} Nov 22 10:02:46 crc kubenswrapper[4743]: I1122 10:02:46.863754 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5847b769d5-2dvqm" event={"ID":"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb","Type":"ContainerDied","Data":"833ab1d927a41158271bae91f0141202046adc2d2dee2cbe9d69ed2d066f93df"} Nov 22 10:02:46 crc kubenswrapper[4743]: I1122 10:02:46.885269 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.148680 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.223603 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-config-data\") pod \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.223716 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-horizon-secret-key\") pod \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.223776 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvpmg\" (UniqueName: \"kubernetes.io/projected/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-kube-api-access-pvpmg\") pod \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.223817 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-scripts\") pod \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.223934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-logs\") pod \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\" (UID: \"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb\") " Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.224542 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-logs" (OuterVolumeSpecName: "logs") pod "c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" (UID: "c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.225664 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-logs\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.230265 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" (UID: "c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.231837 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-kube-api-access-pvpmg" (OuterVolumeSpecName: "kube-api-access-pvpmg") pod "c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" (UID: "c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb"). InnerVolumeSpecName "kube-api-access-pvpmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.256552 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-config-data" (OuterVolumeSpecName: "config-data") pod "c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" (UID: "c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.274256 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-scripts" (OuterVolumeSpecName: "scripts") pod "c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" (UID: "c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.327361 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvpmg\" (UniqueName: \"kubernetes.io/projected/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-kube-api-access-pvpmg\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.327395 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.327405 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.327421 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.575592 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.647796 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6568dc69d5-6d7ql"] Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.874764 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5847b769d5-2dvqm" event={"ID":"c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb","Type":"ContainerDied","Data":"7438b03d377708330a99044c78ea084b66ccbffa4fe4ed8fdf584e082171a398"} Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.874808 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5847b769d5-2dvqm" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.874853 4743 scope.go:117] "RemoveContainer" containerID="5bf47d163bae53778bfd5f0875fff7d8506965099ab5baffabd97e9192770589" Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.874910 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6568dc69d5-6d7ql" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon-log" containerID="cri-o://0a10a9c5f4731b886f789cb5457256af6b58c411284080064e52a9fc3ae66f5e" gracePeriod=30 Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.875076 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6568dc69d5-6d7ql" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon" containerID="cri-o://137cbcd7c0e6ca96ae73e48375638b6be5af1bede962be55854591705f47896f" gracePeriod=30 Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.917210 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5847b769d5-2dvqm"] Nov 22 10:02:47 crc kubenswrapper[4743]: I1122 10:02:47.928183 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5847b769d5-2dvqm"] Nov 22 10:02:48 crc kubenswrapper[4743]: I1122 10:02:48.041480 4743 scope.go:117] "RemoveContainer" containerID="833ab1d927a41158271bae91f0141202046adc2d2dee2cbe9d69ed2d066f93df" Nov 22 10:02:49 crc kubenswrapper[4743]: I1122 10:02:49.167459 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" path="/var/lib/kubelet/pods/c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb/volumes" Nov 22 10:02:51 crc kubenswrapper[4743]: I1122 10:02:51.911107 4743 generic.go:334] "Generic (PLEG): container finished" podID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerID="137cbcd7c0e6ca96ae73e48375638b6be5af1bede962be55854591705f47896f" exitCode=0 Nov 22 10:02:51 crc kubenswrapper[4743]: I1122 10:02:51.911190 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6568dc69d5-6d7ql" event={"ID":"cd9642c9-2a43-40ac-bda7-37b8c0d8b668","Type":"ContainerDied","Data":"137cbcd7c0e6ca96ae73e48375638b6be5af1bede962be55854591705f47896f"} Nov 22 10:02:53 crc kubenswrapper[4743]: I1122 10:02:53.088624 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6568dc69d5-6d7ql" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Nov 22 10:02:59 crc kubenswrapper[4743]: I1122 10:02:59.151464 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:02:59 crc kubenswrapper[4743]: E1122 10:02:59.152557 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:03:03 crc kubenswrapper[4743]: I1122 10:03:03.088741 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6568dc69d5-6d7ql" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" 
containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Nov 22 10:03:13 crc kubenswrapper[4743]: I1122 10:03:13.056232 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-rvqd9"] Nov 22 10:03:13 crc kubenswrapper[4743]: I1122 10:03:13.069901 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bc66-account-create-7ph85"] Nov 22 10:03:13 crc kubenswrapper[4743]: I1122 10:03:13.080719 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-rvqd9"] Nov 22 10:03:13 crc kubenswrapper[4743]: I1122 10:03:13.088357 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6568dc69d5-6d7ql" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.111:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.111:8080: connect: connection refused" Nov 22 10:03:13 crc kubenswrapper[4743]: I1122 10:03:13.088477 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:03:13 crc kubenswrapper[4743]: I1122 10:03:13.090746 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bc66-account-create-7ph85"] Nov 22 10:03:13 crc kubenswrapper[4743]: I1122 10:03:13.157913 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:03:13 crc kubenswrapper[4743]: E1122 10:03:13.158118 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:03:13 crc kubenswrapper[4743]: I1122 10:03:13.165392 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67515e96-6010-4d7b-9d07-82a661b67ef0" path="/var/lib/kubelet/pods/67515e96-6010-4d7b-9d07-82a661b67ef0/volumes" Nov 22 10:03:13 crc kubenswrapper[4743]: I1122 10:03:13.166021 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3684de8-7edc-43f3-92f1-2012be187d12" path="/var/lib/kubelet/pods/f3684de8-7edc-43f3-92f1-2012be187d12/volumes" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.146650 4743 generic.go:334] "Generic (PLEG): container finished" podID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerID="0a10a9c5f4731b886f789cb5457256af6b58c411284080064e52a9fc3ae66f5e" exitCode=137 Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.146879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6568dc69d5-6d7ql" event={"ID":"cd9642c9-2a43-40ac-bda7-37b8c0d8b668","Type":"ContainerDied","Data":"0a10a9c5f4731b886f789cb5457256af6b58c411284080064e52a9fc3ae66f5e"} Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.397890 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.512830 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-logs\") pod \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.512919 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-config-data\") pod \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.513035 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdllg\" (UniqueName: \"kubernetes.io/projected/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-kube-api-access-kdllg\") pod \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.513154 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-scripts\") pod \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.513313 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-horizon-secret-key\") pod \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\" (UID: \"cd9642c9-2a43-40ac-bda7-37b8c0d8b668\") " Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.513445 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-logs" (OuterVolumeSpecName: "logs") pod "cd9642c9-2a43-40ac-bda7-37b8c0d8b668" (UID: "cd9642c9-2a43-40ac-bda7-37b8c0d8b668"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.514120 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-logs\") on node \"crc\" DevicePath \"\"" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.519131 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-kube-api-access-kdllg" (OuterVolumeSpecName: "kube-api-access-kdllg") pod "cd9642c9-2a43-40ac-bda7-37b8c0d8b668" (UID: "cd9642c9-2a43-40ac-bda7-37b8c0d8b668"). InnerVolumeSpecName "kube-api-access-kdllg". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.536736 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cd9642c9-2a43-40ac-bda7-37b8c0d8b668" (UID: "cd9642c9-2a43-40ac-bda7-37b8c0d8b668"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.539126 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-config-data" (OuterVolumeSpecName: "config-data") pod "cd9642c9-2a43-40ac-bda7-37b8c0d8b668" (UID: "cd9642c9-2a43-40ac-bda7-37b8c0d8b668"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.541809 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-scripts" (OuterVolumeSpecName: "scripts") pod "cd9642c9-2a43-40ac-bda7-37b8c0d8b668" (UID: "cd9642c9-2a43-40ac-bda7-37b8c0d8b668"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.616479 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.616548 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdllg\" (UniqueName: \"kubernetes.io/projected/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-kube-api-access-kdllg\") on node \"crc\" DevicePath \"\"" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.616563 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:03:18 crc kubenswrapper[4743]: I1122 10:03:18.616595 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9642c9-2a43-40ac-bda7-37b8c0d8b668-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:03:19 crc kubenswrapper[4743]: I1122 10:03:19.159766 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6568dc69d5-6d7ql" Nov 22 10:03:19 crc kubenswrapper[4743]: I1122 10:03:19.166403 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6568dc69d5-6d7ql" event={"ID":"cd9642c9-2a43-40ac-bda7-37b8c0d8b668","Type":"ContainerDied","Data":"e770aee2c5eee4d19cd4266303eae10002990a927bc62e407350e12d7d9171ea"} Nov 22 10:03:19 crc kubenswrapper[4743]: I1122 10:03:19.166480 4743 scope.go:117] "RemoveContainer" containerID="137cbcd7c0e6ca96ae73e48375638b6be5af1bede962be55854591705f47896f" Nov 22 10:03:19 crc kubenswrapper[4743]: I1122 10:03:19.206292 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6568dc69d5-6d7ql"] Nov 22 10:03:19 crc kubenswrapper[4743]: I1122 10:03:19.221058 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6568dc69d5-6d7ql"] Nov 22 10:03:19 crc kubenswrapper[4743]: I1122 10:03:19.337751 4743 scope.go:117] "RemoveContainer" containerID="0a10a9c5f4731b886f789cb5457256af6b58c411284080064e52a9fc3ae66f5e" Nov 22 10:03:20 crc kubenswrapper[4743]: I1122 10:03:20.045702 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-wq9jx"] Nov 22 10:03:20 crc kubenswrapper[4743]: I1122 10:03:20.057498 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-wq9jx"] Nov 22 10:03:21 crc kubenswrapper[4743]: I1122 10:03:21.162294 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" path="/var/lib/kubelet/pods/cd9642c9-2a43-40ac-bda7-37b8c0d8b668/volumes" Nov 22 10:03:21 crc kubenswrapper[4743]: I1122 10:03:21.163407 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0ded17-e7cd-474f-9aab-50bd38924ae1" path="/var/lib/kubelet/pods/ed0ded17-e7cd-474f-9aab-50bd38924ae1/volumes" Nov 22 10:03:24 crc kubenswrapper[4743]: I1122 10:03:24.153011 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:03:24 crc kubenswrapper[4743]: E1122 10:03:24.154431 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:03:34 crc kubenswrapper[4743]: I1122 10:03:34.566696 4743 scope.go:117] "RemoveContainer" containerID="49704b2f3378d514fd719677ec7c09b2d7dd9f4608cffbd94fd354a3a0cd8af3" Nov 22 10:03:34 crc kubenswrapper[4743]: I1122 10:03:34.591689 4743 scope.go:117] "RemoveContainer" containerID="f46bd808cf70e5094d48f92e49fe3b73ce6824c13c3b0cd3f32bd9e99b8719a9" Nov 22 10:03:34 crc kubenswrapper[4743]: I1122 10:03:34.639719 4743 scope.go:117] "RemoveContainer" containerID="82d24f02b577255c5bde1a13bc50c12544aba7acf65f94aae2726367c1b40b96" Nov 22 10:03:34 crc kubenswrapper[4743]: I1122 10:03:34.690880 4743 scope.go:117] "RemoveContainer" containerID="eeb73d4da37af9026faa316ea2672ae27ab2c956c883577768937e69e5add235" Nov 22 10:03:34 crc kubenswrapper[4743]: I1122 10:03:34.721662 4743 scope.go:117] "RemoveContainer" containerID="dfc85cadbbd916b7c1aadb3efb8ad370f614798cd8fe55e684ddf0642b67538b" Nov 22 10:03:36 crc kubenswrapper[4743]: I1122 10:03:36.152119 4743 scope.go:117] "RemoveContainer" 
containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:03:37 crc kubenswrapper[4743]: I1122 10:03:37.322414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"6b404b343b65af21aff6649872514335dd4038e7612309804bf70aeba3bcb920"} Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.517557 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6nx7j"] Nov 22 10:03:38 crc kubenswrapper[4743]: E1122 10:03:38.524975 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" containerName="horizon-log" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.525008 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" containerName="horizon-log" Nov 22 10:03:38 crc kubenswrapper[4743]: E1122 10:03:38.525026 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" containerName="horizon" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.525032 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" containerName="horizon" Nov 22 10:03:38 crc kubenswrapper[4743]: E1122 10:03:38.525050 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon-log" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.525056 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon-log" Nov 22 10:03:38 crc kubenswrapper[4743]: E1122 10:03:38.525066 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.525072 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.525284 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" containerName="horizon-log" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.525299 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d765c2-95ac-4af8-ad8c-9c7ed79c62fb" containerName="horizon" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.525320 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.525329 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd9642c9-2a43-40ac-bda7-37b8c0d8b668" containerName="horizon-log" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.527021 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.529746 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6nx7j"] Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.605011 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-catalog-content\") pod \"certified-operators-6nx7j\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.605348 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njbzs\" (UniqueName: \"kubernetes.io/projected/f693dc5e-bd28-4753-96e5-6f162a201c32-kube-api-access-njbzs\") pod \"certified-operators-6nx7j\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.605471 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-utilities\") pod \"certified-operators-6nx7j\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.707030 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njbzs\" (UniqueName: \"kubernetes.io/projected/f693dc5e-bd28-4753-96e5-6f162a201c32-kube-api-access-njbzs\") pod \"certified-operators-6nx7j\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.707135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-utilities\") pod \"certified-operators-6nx7j\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.707210 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-catalog-content\") pod \"certified-operators-6nx7j\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.708021 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-utilities\") pod \"certified-operators-6nx7j\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.708992 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-catalog-content\") pod \"certified-operators-6nx7j\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:38 crc kubenswrapper[4743]: I1122 10:03:38.730662 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-njbzs\" (UniqueName: \"kubernetes.io/projected/f693dc5e-bd28-4753-96e5-6f162a201c32-kube-api-access-njbzs\") pod \"certified-operators-6nx7j\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:39 crc kubenswrapper[4743]: I1122 10:03:39.024906 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:39 crc kubenswrapper[4743]: I1122 10:03:39.575558 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6nx7j"] Nov 22 10:03:39 crc kubenswrapper[4743]: W1122 10:03:39.575793 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf693dc5e_bd28_4753_96e5_6f162a201c32.slice/crio-a3ea864cad1975d81c856a1c7e2376bcd4545e8af34b27648ebe3374452a15ac WatchSource:0}: Error finding container a3ea864cad1975d81c856a1c7e2376bcd4545e8af34b27648ebe3374452a15ac: Status 404 returned error can't find the container with id a3ea864cad1975d81c856a1c7e2376bcd4545e8af34b27648ebe3374452a15ac Nov 22 10:03:40 crc kubenswrapper[4743]: I1122 10:03:40.355399 4743 generic.go:334] "Generic (PLEG): container finished" podID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerID="4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625" exitCode=0 Nov 22 10:03:40 crc kubenswrapper[4743]: I1122 10:03:40.355499 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nx7j" event={"ID":"f693dc5e-bd28-4753-96e5-6f162a201c32","Type":"ContainerDied","Data":"4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625"} Nov 22 10:03:40 crc kubenswrapper[4743]: I1122 10:03:40.355752 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nx7j" event={"ID":"f693dc5e-bd28-4753-96e5-6f162a201c32","Type":"ContainerStarted","Data":"a3ea864cad1975d81c856a1c7e2376bcd4545e8af34b27648ebe3374452a15ac"} Nov 22 10:03:40 crc kubenswrapper[4743]: I1122 10:03:40.358141 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 10:03:42 crc kubenswrapper[4743]: I1122 10:03:42.374461 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nx7j" event={"ID":"f693dc5e-bd28-4753-96e5-6f162a201c32","Type":"ContainerStarted","Data":"7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e"} Nov 22 10:03:43 crc kubenswrapper[4743]: I1122 10:03:43.390506 4743 generic.go:334] "Generic (PLEG): container finished" podID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerID="7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e" exitCode=0 Nov 22 10:03:43 crc kubenswrapper[4743]: I1122 10:03:43.390560 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nx7j" event={"ID":"f693dc5e-bd28-4753-96e5-6f162a201c32","Type":"ContainerDied","Data":"7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e"} Nov 22 10:03:46 crc kubenswrapper[4743]: I1122 10:03:46.432124 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nx7j" event={"ID":"f693dc5e-bd28-4753-96e5-6f162a201c32","Type":"ContainerStarted","Data":"a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec"} Nov 22 10:03:46 crc kubenswrapper[4743]: I1122 
10:03:46.456190 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6nx7j" podStartSLOduration=3.091896634 podStartE2EDuration="8.456159397s" podCreationTimestamp="2025-11-22 10:03:38 +0000 UTC" firstStartedPulling="2025-11-22 10:03:40.357928707 +0000 UTC m=+6094.064289759" lastFinishedPulling="2025-11-22 10:03:45.72219146 +0000 UTC m=+6099.428552522" observedRunningTime="2025-11-22 10:03:46.452124611 +0000 UTC m=+6100.158485663" watchObservedRunningTime="2025-11-22 10:03:46.456159397 +0000 UTC m=+6100.162520469" Nov 22 10:03:49 crc kubenswrapper[4743]: I1122 10:03:49.025801 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:49 crc kubenswrapper[4743]: I1122 10:03:49.026310 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:49 crc kubenswrapper[4743]: I1122 10:03:49.078909 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:50 crc kubenswrapper[4743]: I1122 10:03:50.504911 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:50 crc kubenswrapper[4743]: I1122 10:03:50.548937 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6nx7j"] Nov 22 10:03:52 crc kubenswrapper[4743]: I1122 10:03:52.049203 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m8fmm"] Nov 22 10:03:52 crc kubenswrapper[4743]: I1122 10:03:52.057469 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m8fmm"] Nov 22 10:03:52 crc kubenswrapper[4743]: I1122 10:03:52.065209 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-42bb-account-create-xrbt7"] Nov 22 10:03:52 crc kubenswrapper[4743]: I1122 10:03:52.072526 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-42bb-account-create-xrbt7"] Nov 22 10:03:52 crc kubenswrapper[4743]: I1122 10:03:52.481312 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6nx7j" podUID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerName="registry-server" containerID="cri-o://a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec" gracePeriod=2 Nov 22 10:03:52 crc kubenswrapper[4743]: I1122 10:03:52.928535 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.094754 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-catalog-content\") pod \"f693dc5e-bd28-4753-96e5-6f162a201c32\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.095890 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-utilities\") pod \"f693dc5e-bd28-4753-96e5-6f162a201c32\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.096011 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njbzs\" (UniqueName: \"kubernetes.io/projected/f693dc5e-bd28-4753-96e5-6f162a201c32-kube-api-access-njbzs\") pod \"f693dc5e-bd28-4753-96e5-6f162a201c32\" (UID: \"f693dc5e-bd28-4753-96e5-6f162a201c32\") " Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.097090 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-utilities" (OuterVolumeSpecName: "utilities") pod "f693dc5e-bd28-4753-96e5-6f162a201c32" (UID: "f693dc5e-bd28-4753-96e5-6f162a201c32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.101382 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f693dc5e-bd28-4753-96e5-6f162a201c32-kube-api-access-njbzs" (OuterVolumeSpecName: "kube-api-access-njbzs") pod "f693dc5e-bd28-4753-96e5-6f162a201c32" (UID: "f693dc5e-bd28-4753-96e5-6f162a201c32"). InnerVolumeSpecName "kube-api-access-njbzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.135739 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f693dc5e-bd28-4753-96e5-6f162a201c32" (UID: "f693dc5e-bd28-4753-96e5-6f162a201c32"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.163590 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31f7b903-226c-402f-884f-4bf2ae3b7f74" path="/var/lib/kubelet/pods/31f7b903-226c-402f-884f-4bf2ae3b7f74/volumes" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.164150 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39ac93d2-dedd-4109-a0ba-928660962d81" path="/var/lib/kubelet/pods/39ac93d2-dedd-4109-a0ba-928660962d81/volumes" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.197835 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njbzs\" (UniqueName: \"kubernetes.io/projected/f693dc5e-bd28-4753-96e5-6f162a201c32-kube-api-access-njbzs\") on node \"crc\" DevicePath \"\"" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.197862 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.197877 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f693dc5e-bd28-4753-96e5-6f162a201c32-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.495523 4743 generic.go:334] "Generic (PLEG): container finished" podID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerID="a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec" exitCode=0 Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.495560 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nx7j" event={"ID":"f693dc5e-bd28-4753-96e5-6f162a201c32","Type":"ContainerDied","Data":"a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec"} Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.495867 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6nx7j" event={"ID":"f693dc5e-bd28-4753-96e5-6f162a201c32","Type":"ContainerDied","Data":"a3ea864cad1975d81c856a1c7e2376bcd4545e8af34b27648ebe3374452a15ac"} Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.495909 4743 scope.go:117] "RemoveContainer" containerID="a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.495657 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6nx7j" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.525552 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6nx7j"] Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.527313 4743 scope.go:117] "RemoveContainer" containerID="7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.534723 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6nx7j"] Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.550691 4743 scope.go:117] "RemoveContainer" containerID="4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.597963 4743 scope.go:117] "RemoveContainer" containerID="a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec" Nov 22 10:03:53 crc kubenswrapper[4743]: E1122 10:03:53.605798 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec\": container with ID starting with a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec not found: ID does not exist" containerID="a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.605857 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec"} err="failed to get container status \"a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec\": rpc error: code = NotFound desc = could not find container \"a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec\": container with ID starting with a785419accf6100ff1352ba168cf425f9792e90b2add587dfb5c795264ec5cec not found: ID does not exist" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.605885 4743 scope.go:117] "RemoveContainer" containerID="7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e" Nov 22 10:03:53 crc kubenswrapper[4743]: E1122 10:03:53.607394 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e\": container with ID starting with 7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e not found: ID does not exist" containerID="7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.607478 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e"} err="failed to get container status \"7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e\": rpc error: code = NotFound desc = could not find container \"7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e\": container with ID starting with 7e8322fb3e3c6cbeff7a1e9143e0c0667b46a8e9a61861fb77640e5d7c3aab8e not found: ID does not exist" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.607508 4743 scope.go:117] "RemoveContainer" containerID="4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625" Nov 22 10:03:53 crc kubenswrapper[4743]: E1122 10:03:53.608014 4743 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625\": container with ID starting with 4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625 not found: ID does not exist" containerID="4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625" Nov 22 10:03:53 crc kubenswrapper[4743]: I1122 10:03:53.608059 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625"} err="failed to get container status \"4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625\": rpc error: code = NotFound desc = could not find container \"4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625\": container with ID starting with 4a5e9b74233e0a24da7bce80f2ed5a0319bf40903ab1d6e7efc13f6dfe592625 not found: ID does not exist" Nov 22 10:03:55 crc kubenswrapper[4743]: I1122 10:03:55.163906 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f693dc5e-bd28-4753-96e5-6f162a201c32" path="/var/lib/kubelet/pods/f693dc5e-bd28-4753-96e5-6f162a201c32/volumes" Nov 22 10:03:55 crc kubenswrapper[4743]: I1122 10:03:55.933165 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76456674f-grnzr"] Nov 22 10:03:55 crc kubenswrapper[4743]: E1122 10:03:55.933733 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerName="extract-utilities" Nov 22 10:03:55 crc kubenswrapper[4743]: I1122 10:03:55.933756 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerName="extract-utilities" Nov 22 10:03:55 crc kubenswrapper[4743]: E1122 10:03:55.933783 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerName="registry-server" Nov 22 10:03:55 crc kubenswrapper[4743]: I1122 10:03:55.933791 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerName="registry-server" Nov 22 10:03:55 crc kubenswrapper[4743]: E1122 10:03:55.933822 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerName="extract-content" Nov 22 10:03:55 crc kubenswrapper[4743]: I1122 10:03:55.933831 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerName="extract-content" Nov 22 10:03:55 crc kubenswrapper[4743]: I1122 10:03:55.934098 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f693dc5e-bd28-4753-96e5-6f162a201c32" containerName="registry-server" Nov 22 10:03:55 crc kubenswrapper[4743]: I1122 10:03:55.935468 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:55 crc kubenswrapper[4743]: I1122 10:03:55.941959 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76456674f-grnzr"] Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.051865 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b30c1d40-0697-4337-ba40-9090dc6988a5-config-data\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.051958 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b30c1d40-0697-4337-ba40-9090dc6988a5-logs\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.052022 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b30c1d40-0697-4337-ba40-9090dc6988a5-scripts\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.052117 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpfkv\" (UniqueName: \"kubernetes.io/projected/b30c1d40-0697-4337-ba40-9090dc6988a5-kube-api-access-mpfkv\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.052149 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b30c1d40-0697-4337-ba40-9090dc6988a5-horizon-secret-key\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.154213 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b30c1d40-0697-4337-ba40-9090dc6988a5-config-data\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.154299 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b30c1d40-0697-4337-ba40-9090dc6988a5-logs\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.154348 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b30c1d40-0697-4337-ba40-9090dc6988a5-scripts\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.154380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpfkv\" (UniqueName: 
\"kubernetes.io/projected/b30c1d40-0697-4337-ba40-9090dc6988a5-kube-api-access-mpfkv\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.154402 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b30c1d40-0697-4337-ba40-9090dc6988a5-horizon-secret-key\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.155076 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b30c1d40-0697-4337-ba40-9090dc6988a5-logs\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.155562 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b30c1d40-0697-4337-ba40-9090dc6988a5-scripts\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.156108 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b30c1d40-0697-4337-ba40-9090dc6988a5-config-data\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.159854 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b30c1d40-0697-4337-ba40-9090dc6988a5-horizon-secret-key\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.170112 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpfkv\" (UniqueName: \"kubernetes.io/projected/b30c1d40-0697-4337-ba40-9090dc6988a5-kube-api-access-mpfkv\") pod \"horizon-76456674f-grnzr\" (UID: \"b30c1d40-0697-4337-ba40-9090dc6988a5\") " pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.254427 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76456674f-grnzr" Nov 22 10:03:56 crc kubenswrapper[4743]: I1122 10:03:56.731441 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76456674f-grnzr"] Nov 22 10:03:56 crc kubenswrapper[4743]: W1122 10:03:56.732557 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb30c1d40_0697_4337_ba40_9090dc6988a5.slice/crio-1268e947b35ab6a533483196462e573c2b87a302f153adcbc645874497936e03 WatchSource:0}: Error finding container 1268e947b35ab6a533483196462e573c2b87a302f153adcbc645874497936e03: Status 404 returned error can't find the container with id 1268e947b35ab6a533483196462e573c2b87a302f153adcbc645874497936e03 Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.018490 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-mzjsv"] Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.020165 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-mzjsv" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.044876 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-mzjsv"] Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.107357 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-fa27-account-create-bcq27"] Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.109282 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-fa27-account-create-bcq27" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.111490 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.122860 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-fa27-account-create-bcq27"] Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.177916 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b746268-80b6-4be3-b32c-19b1fe639bef-operator-scripts\") pod \"heat-db-create-mzjsv\" (UID: \"8b746268-80b6-4be3-b32c-19b1fe639bef\") " pod="openstack/heat-db-create-mzjsv" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.178042 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s5zl\" (UniqueName: \"kubernetes.io/projected/8b746268-80b6-4be3-b32c-19b1fe639bef-kube-api-access-9s5zl\") pod \"heat-db-create-mzjsv\" (UID: \"8b746268-80b6-4be3-b32c-19b1fe639bef\") " pod="openstack/heat-db-create-mzjsv" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.279967 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-operator-scripts\") pod \"heat-fa27-account-create-bcq27\" (UID: \"ef7f86b6-ae6d-4428-ba15-014b04aa2a93\") " pod="openstack/heat-fa27-account-create-bcq27" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.280082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5nx7\" (UniqueName: \"kubernetes.io/projected/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-kube-api-access-l5nx7\") pod \"heat-fa27-account-create-bcq27\" (UID: \"ef7f86b6-ae6d-4428-ba15-014b04aa2a93\") " 
pod="openstack/heat-fa27-account-create-bcq27" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.280135 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b746268-80b6-4be3-b32c-19b1fe639bef-operator-scripts\") pod \"heat-db-create-mzjsv\" (UID: \"8b746268-80b6-4be3-b32c-19b1fe639bef\") " pod="openstack/heat-db-create-mzjsv" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.280855 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b746268-80b6-4be3-b32c-19b1fe639bef-operator-scripts\") pod \"heat-db-create-mzjsv\" (UID: \"8b746268-80b6-4be3-b32c-19b1fe639bef\") " pod="openstack/heat-db-create-mzjsv" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.281138 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s5zl\" (UniqueName: \"kubernetes.io/projected/8b746268-80b6-4be3-b32c-19b1fe639bef-kube-api-access-9s5zl\") pod \"heat-db-create-mzjsv\" (UID: \"8b746268-80b6-4be3-b32c-19b1fe639bef\") " pod="openstack/heat-db-create-mzjsv" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.300378 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s5zl\" (UniqueName: \"kubernetes.io/projected/8b746268-80b6-4be3-b32c-19b1fe639bef-kube-api-access-9s5zl\") pod \"heat-db-create-mzjsv\" (UID: \"8b746268-80b6-4be3-b32c-19b1fe639bef\") " pod="openstack/heat-db-create-mzjsv" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.368151 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-mzjsv" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.385622 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-operator-scripts\") pod \"heat-fa27-account-create-bcq27\" (UID: \"ef7f86b6-ae6d-4428-ba15-014b04aa2a93\") " pod="openstack/heat-fa27-account-create-bcq27" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.385756 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5nx7\" (UniqueName: \"kubernetes.io/projected/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-kube-api-access-l5nx7\") pod \"heat-fa27-account-create-bcq27\" (UID: \"ef7f86b6-ae6d-4428-ba15-014b04aa2a93\") " pod="openstack/heat-fa27-account-create-bcq27" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.386334 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-operator-scripts\") pod \"heat-fa27-account-create-bcq27\" (UID: \"ef7f86b6-ae6d-4428-ba15-014b04aa2a93\") " pod="openstack/heat-fa27-account-create-bcq27" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.405976 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5nx7\" (UniqueName: \"kubernetes.io/projected/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-kube-api-access-l5nx7\") pod \"heat-fa27-account-create-bcq27\" (UID: \"ef7f86b6-ae6d-4428-ba15-014b04aa2a93\") " pod="openstack/heat-fa27-account-create-bcq27" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.431982 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-fa27-account-create-bcq27" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.545237 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76456674f-grnzr" event={"ID":"b30c1d40-0697-4337-ba40-9090dc6988a5","Type":"ContainerStarted","Data":"e65e2aa36c0681bf4695d3f4cf75b0474d581a1e8f56237df0b4fbe547da1216"} Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.545680 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76456674f-grnzr" event={"ID":"b30c1d40-0697-4337-ba40-9090dc6988a5","Type":"ContainerStarted","Data":"0611a5bad2fe0b83ecbdf515aaafeaf7731d14cfa5c759ff1b193721cd5f1b64"} Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.545698 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76456674f-grnzr" event={"ID":"b30c1d40-0697-4337-ba40-9090dc6988a5","Type":"ContainerStarted","Data":"1268e947b35ab6a533483196462e573c2b87a302f153adcbc645874497936e03"} Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.575172 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76456674f-grnzr" podStartSLOduration=2.5751486999999997 podStartE2EDuration="2.5751487s" podCreationTimestamp="2025-11-22 10:03:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:03:57.567449249 +0000 UTC m=+6111.273810301" watchObservedRunningTime="2025-11-22 10:03:57.5751487 +0000 UTC m=+6111.281509752" Nov 22 10:03:57 crc kubenswrapper[4743]: I1122 10:03:57.898515 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-mzjsv"] Nov 22 10:03:57 crc kubenswrapper[4743]: W1122 10:03:57.898630 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b746268_80b6_4be3_b32c_19b1fe639bef.slice/crio-0732a55f11c422a47c3063af24e0f177f334c3c27ed534f77da0cd1b9dfe54e6 WatchSource:0}: Error finding container 0732a55f11c422a47c3063af24e0f177f334c3c27ed534f77da0cd1b9dfe54e6: Status 404 returned error can't find the container with id 0732a55f11c422a47c3063af24e0f177f334c3c27ed534f77da0cd1b9dfe54e6 Nov 22 10:03:58 crc kubenswrapper[4743]: I1122 10:03:58.005752 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-fa27-account-create-bcq27"] Nov 22 10:03:58 crc kubenswrapper[4743]: W1122 10:03:58.023056 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef7f86b6_ae6d_4428_ba15_014b04aa2a93.slice/crio-3d726c5df8667f18f1d53b31a54a7b6d0acb37ba9a3e417081bf95f573a3e6d9 WatchSource:0}: Error finding container 3d726c5df8667f18f1d53b31a54a7b6d0acb37ba9a3e417081bf95f573a3e6d9: Status 404 returned error can't find the container with id 3d726c5df8667f18f1d53b31a54a7b6d0acb37ba9a3e417081bf95f573a3e6d9 Nov 22 10:03:58 crc kubenswrapper[4743]: I1122 10:03:58.555549 4743 generic.go:334] "Generic (PLEG): container finished" podID="8b746268-80b6-4be3-b32c-19b1fe639bef" containerID="425989f6b45af1ef2395a839296d880b505e064b07fb63094a0678a3946817b9" exitCode=0 Nov 22 10:03:58 crc kubenswrapper[4743]: I1122 10:03:58.555616 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mzjsv" event={"ID":"8b746268-80b6-4be3-b32c-19b1fe639bef","Type":"ContainerDied","Data":"425989f6b45af1ef2395a839296d880b505e064b07fb63094a0678a3946817b9"} Nov 22 10:03:58 crc 
kubenswrapper[4743]: I1122 10:03:58.556114 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mzjsv" event={"ID":"8b746268-80b6-4be3-b32c-19b1fe639bef","Type":"ContainerStarted","Data":"0732a55f11c422a47c3063af24e0f177f334c3c27ed534f77da0cd1b9dfe54e6"} Nov 22 10:03:58 crc kubenswrapper[4743]: I1122 10:03:58.564210 4743 generic.go:334] "Generic (PLEG): container finished" podID="ef7f86b6-ae6d-4428-ba15-014b04aa2a93" containerID="6cb93f4b69bcf9bfe8c564273cc51e30b612a5c4dfd204dc8193235cf7973539" exitCode=0 Nov 22 10:03:58 crc kubenswrapper[4743]: I1122 10:03:58.564299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-fa27-account-create-bcq27" event={"ID":"ef7f86b6-ae6d-4428-ba15-014b04aa2a93","Type":"ContainerDied","Data":"6cb93f4b69bcf9bfe8c564273cc51e30b612a5c4dfd204dc8193235cf7973539"} Nov 22 10:03:58 crc kubenswrapper[4743]: I1122 10:03:58.564342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-fa27-account-create-bcq27" event={"ID":"ef7f86b6-ae6d-4428-ba15-014b04aa2a93","Type":"ContainerStarted","Data":"3d726c5df8667f18f1d53b31a54a7b6d0acb37ba9a3e417081bf95f573a3e6d9"} Nov 22 10:03:59 crc kubenswrapper[4743]: I1122 10:03:59.063735 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-mf77p"] Nov 22 10:03:59 crc kubenswrapper[4743]: I1122 10:03:59.072488 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-mf77p"] Nov 22 10:03:59 crc kubenswrapper[4743]: I1122 10:03:59.165015 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="825a445b-19a5-433a-b0b5-c87cb078d274" path="/var/lib/kubelet/pods/825a445b-19a5-433a-b0b5-c87cb078d274/volumes" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.029751 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-fa27-account-create-bcq27" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.041147 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-mzjsv" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.152370 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5nx7\" (UniqueName: \"kubernetes.io/projected/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-kube-api-access-l5nx7\") pod \"ef7f86b6-ae6d-4428-ba15-014b04aa2a93\" (UID: \"ef7f86b6-ae6d-4428-ba15-014b04aa2a93\") " Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.152601 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-operator-scripts\") pod \"ef7f86b6-ae6d-4428-ba15-014b04aa2a93\" (UID: \"ef7f86b6-ae6d-4428-ba15-014b04aa2a93\") " Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.152728 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s5zl\" (UniqueName: \"kubernetes.io/projected/8b746268-80b6-4be3-b32c-19b1fe639bef-kube-api-access-9s5zl\") pod \"8b746268-80b6-4be3-b32c-19b1fe639bef\" (UID: \"8b746268-80b6-4be3-b32c-19b1fe639bef\") " Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.152853 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b746268-80b6-4be3-b32c-19b1fe639bef-operator-scripts\") pod \"8b746268-80b6-4be3-b32c-19b1fe639bef\" (UID: \"8b746268-80b6-4be3-b32c-19b1fe639bef\") " Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.153390 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef7f86b6-ae6d-4428-ba15-014b04aa2a93" (UID: "ef7f86b6-ae6d-4428-ba15-014b04aa2a93"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.153468 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b746268-80b6-4be3-b32c-19b1fe639bef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b746268-80b6-4be3-b32c-19b1fe639bef" (UID: "8b746268-80b6-4be3-b32c-19b1fe639bef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.158513 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-kube-api-access-l5nx7" (OuterVolumeSpecName: "kube-api-access-l5nx7") pod "ef7f86b6-ae6d-4428-ba15-014b04aa2a93" (UID: "ef7f86b6-ae6d-4428-ba15-014b04aa2a93"). InnerVolumeSpecName "kube-api-access-l5nx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.163604 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b746268-80b6-4be3-b32c-19b1fe639bef-kube-api-access-9s5zl" (OuterVolumeSpecName: "kube-api-access-9s5zl") pod "8b746268-80b6-4be3-b32c-19b1fe639bef" (UID: "8b746268-80b6-4be3-b32c-19b1fe639bef"). InnerVolumeSpecName "kube-api-access-9s5zl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.254996 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b746268-80b6-4be3-b32c-19b1fe639bef-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.255070 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5nx7\" (UniqueName: \"kubernetes.io/projected/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-kube-api-access-l5nx7\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.255085 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef7f86b6-ae6d-4428-ba15-014b04aa2a93-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.255119 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s5zl\" (UniqueName: \"kubernetes.io/projected/8b746268-80b6-4be3-b32c-19b1fe639bef-kube-api-access-9s5zl\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.581014 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-fa27-account-create-bcq27" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.581012 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-fa27-account-create-bcq27" event={"ID":"ef7f86b6-ae6d-4428-ba15-014b04aa2a93","Type":"ContainerDied","Data":"3d726c5df8667f18f1d53b31a54a7b6d0acb37ba9a3e417081bf95f573a3e6d9"} Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.581155 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d726c5df8667f18f1d53b31a54a7b6d0acb37ba9a3e417081bf95f573a3e6d9" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.582558 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-mzjsv" event={"ID":"8b746268-80b6-4be3-b32c-19b1fe639bef","Type":"ContainerDied","Data":"0732a55f11c422a47c3063af24e0f177f334c3c27ed534f77da0cd1b9dfe54e6"} Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.582645 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0732a55f11c422a47c3063af24e0f177f334c3c27ed534f77da0cd1b9dfe54e6" Nov 22 10:04:00 crc kubenswrapper[4743]: I1122 10:04:00.582705 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-mzjsv" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.214488 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-dddhg"] Nov 22 10:04:02 crc kubenswrapper[4743]: E1122 10:04:02.215366 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b746268-80b6-4be3-b32c-19b1fe639bef" containerName="mariadb-database-create" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.215379 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b746268-80b6-4be3-b32c-19b1fe639bef" containerName="mariadb-database-create" Nov 22 10:04:02 crc kubenswrapper[4743]: E1122 10:04:02.215392 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef7f86b6-ae6d-4428-ba15-014b04aa2a93" containerName="mariadb-account-create" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.215407 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef7f86b6-ae6d-4428-ba15-014b04aa2a93" containerName="mariadb-account-create" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.215703 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b746268-80b6-4be3-b32c-19b1fe639bef" containerName="mariadb-database-create" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.215719 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef7f86b6-ae6d-4428-ba15-014b04aa2a93" containerName="mariadb-account-create" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.216392 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.218367 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-xxd26" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.224812 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.226238 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-dddhg"] Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.396835 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-combined-ca-bundle\") pod \"heat-db-sync-dddhg\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.396960 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-config-data\") pod \"heat-db-sync-dddhg\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.396981 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6774\" (UniqueName: \"kubernetes.io/projected/b4529808-850b-47b2-adbd-99c4c2e9a0e2-kube-api-access-j6774\") pod \"heat-db-sync-dddhg\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.498464 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-config-data\") pod 
\"heat-db-sync-dddhg\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.498793 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6774\" (UniqueName: \"kubernetes.io/projected/b4529808-850b-47b2-adbd-99c4c2e9a0e2-kube-api-access-j6774\") pod \"heat-db-sync-dddhg\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.498933 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-combined-ca-bundle\") pod \"heat-db-sync-dddhg\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.503826 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-combined-ca-bundle\") pod \"heat-db-sync-dddhg\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.513123 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-config-data\") pod \"heat-db-sync-dddhg\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.525120 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6774\" (UniqueName: \"kubernetes.io/projected/b4529808-850b-47b2-adbd-99c4c2e9a0e2-kube-api-access-j6774\") pod \"heat-db-sync-dddhg\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:02 crc kubenswrapper[4743]: I1122 10:04:02.539758 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:03 crc kubenswrapper[4743]: I1122 10:04:03.019402 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-dddhg"] Nov 22 10:04:03 crc kubenswrapper[4743]: W1122 10:04:03.027774 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4529808_850b_47b2_adbd_99c4c2e9a0e2.slice/crio-433462fd490d2b3cfa4f32a00e660ad1cb0ad1de0d48192f2dc5774f1286d7ff WatchSource:0}: Error finding container 433462fd490d2b3cfa4f32a00e660ad1cb0ad1de0d48192f2dc5774f1286d7ff: Status 404 returned error can't find the container with id 433462fd490d2b3cfa4f32a00e660ad1cb0ad1de0d48192f2dc5774f1286d7ff Nov 22 10:04:03 crc kubenswrapper[4743]: I1122 10:04:03.631938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dddhg" event={"ID":"b4529808-850b-47b2-adbd-99c4c2e9a0e2","Type":"ContainerStarted","Data":"433462fd490d2b3cfa4f32a00e660ad1cb0ad1de0d48192f2dc5774f1286d7ff"} Nov 22 10:04:06 crc kubenswrapper[4743]: I1122 10:04:06.255171 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76456674f-grnzr" Nov 22 10:04:06 crc kubenswrapper[4743]: I1122 10:04:06.255789 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-76456674f-grnzr" Nov 22 10:04:09 crc kubenswrapper[4743]: I1122 10:04:09.689625 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dddhg" event={"ID":"b4529808-850b-47b2-adbd-99c4c2e9a0e2","Type":"ContainerStarted","Data":"9bfea331c50322b595e5152c42d23fe3b7ebaab2a552122deec8ce333449e07b"} Nov 22 10:04:09 crc kubenswrapper[4743]: I1122 10:04:09.714185 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-dddhg" podStartSLOduration=1.3791769280000001 podStartE2EDuration="7.71416748s" podCreationTimestamp="2025-11-22 10:04:02 +0000 UTC" firstStartedPulling="2025-11-22 10:04:03.029969824 +0000 UTC m=+6116.736330876" lastFinishedPulling="2025-11-22 10:04:09.364960376 +0000 UTC m=+6123.071321428" observedRunningTime="2025-11-22 10:04:09.70337878 +0000 UTC m=+6123.409739832" watchObservedRunningTime="2025-11-22 10:04:09.71416748 +0000 UTC m=+6123.420528532" Nov 22 10:04:11 crc kubenswrapper[4743]: I1122 10:04:11.709548 4743 generic.go:334] "Generic (PLEG): container finished" podID="b4529808-850b-47b2-adbd-99c4c2e9a0e2" containerID="9bfea331c50322b595e5152c42d23fe3b7ebaab2a552122deec8ce333449e07b" exitCode=0 Nov 22 10:04:11 crc kubenswrapper[4743]: I1122 10:04:11.709617 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dddhg" event={"ID":"b4529808-850b-47b2-adbd-99c4c2e9a0e2","Type":"ContainerDied","Data":"9bfea331c50322b595e5152c42d23fe3b7ebaab2a552122deec8ce333449e07b"} Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.080657 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.200716 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6774\" (UniqueName: \"kubernetes.io/projected/b4529808-850b-47b2-adbd-99c4c2e9a0e2-kube-api-access-j6774\") pod \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.201137 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-config-data\") pod \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.201196 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-combined-ca-bundle\") pod \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\" (UID: \"b4529808-850b-47b2-adbd-99c4c2e9a0e2\") " Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.206458 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4529808-850b-47b2-adbd-99c4c2e9a0e2-kube-api-access-j6774" (OuterVolumeSpecName: "kube-api-access-j6774") pod "b4529808-850b-47b2-adbd-99c4c2e9a0e2" (UID: "b4529808-850b-47b2-adbd-99c4c2e9a0e2"). InnerVolumeSpecName "kube-api-access-j6774". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.260792 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4529808-850b-47b2-adbd-99c4c2e9a0e2" (UID: "b4529808-850b-47b2-adbd-99c4c2e9a0e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.287921 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-config-data" (OuterVolumeSpecName: "config-data") pod "b4529808-850b-47b2-adbd-99c4c2e9a0e2" (UID: "b4529808-850b-47b2-adbd-99c4c2e9a0e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.304172 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6774\" (UniqueName: \"kubernetes.io/projected/b4529808-850b-47b2-adbd-99c4c2e9a0e2-kube-api-access-j6774\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.304217 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.304233 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4529808-850b-47b2-adbd-99c4c2e9a0e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.728727 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-dddhg" event={"ID":"b4529808-850b-47b2-adbd-99c4c2e9a0e2","Type":"ContainerDied","Data":"433462fd490d2b3cfa4f32a00e660ad1cb0ad1de0d48192f2dc5774f1286d7ff"} Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.729114 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="433462fd490d2b3cfa4f32a00e660ad1cb0ad1de0d48192f2dc5774f1286d7ff" Nov 22 10:04:13 crc kubenswrapper[4743]: I1122 10:04:13.728810 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-dddhg" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.706351 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-58667d47cd-44kmd"] Nov 22 10:04:14 crc kubenswrapper[4743]: E1122 10:04:14.707097 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4529808-850b-47b2-adbd-99c4c2e9a0e2" containerName="heat-db-sync" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.707113 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4529808-850b-47b2-adbd-99c4c2e9a0e2" containerName="heat-db-sync" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.707300 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4529808-850b-47b2-adbd-99c4c2e9a0e2" containerName="heat-db-sync" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.707981 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.712944 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-58667d47cd-44kmd"] Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.723494 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.723927 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-xxd26" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.724129 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.845849 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad7f997-13cd-4561-8de6-17685d0d649a-config-data\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.846593 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ad7f997-13cd-4561-8de6-17685d0d649a-config-data-custom\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.846708 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad7f997-13cd-4561-8de6-17685d0d649a-combined-ca-bundle\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.846779 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr8f5\" (UniqueName: \"kubernetes.io/projected/3ad7f997-13cd-4561-8de6-17685d0d649a-kube-api-access-fr8f5\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.915756 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-d4ff5bc67-qmrg8"] Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.917164 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.921500 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.936645 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-cdb5f48d-n9r67"] Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.939172 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.941823 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.948461 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr8f5\" (UniqueName: \"kubernetes.io/projected/3ad7f997-13cd-4561-8de6-17685d0d649a-kube-api-access-fr8f5\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.948569 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad7f997-13cd-4561-8de6-17685d0d649a-config-data\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.948630 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ad7f997-13cd-4561-8de6-17685d0d649a-config-data-custom\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.948696 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad7f997-13cd-4561-8de6-17685d0d649a-combined-ca-bundle\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.956299 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-d4ff5bc67-qmrg8"] Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.962862 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad7f997-13cd-4561-8de6-17685d0d649a-combined-ca-bundle\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.969407 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-cdb5f48d-n9r67"] Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.973008 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad7f997-13cd-4561-8de6-17685d0d649a-config-data\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.977323 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr8f5\" (UniqueName: \"kubernetes.io/projected/3ad7f997-13cd-4561-8de6-17685d0d649a-kube-api-access-fr8f5\") pod \"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:14 crc kubenswrapper[4743]: I1122 10:04:14.984608 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ad7f997-13cd-4561-8de6-17685d0d649a-config-data-custom\") pod 
\"heat-engine-58667d47cd-44kmd\" (UID: \"3ad7f997-13cd-4561-8de6-17685d0d649a\") " pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.050489 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb34585-5e6a-440e-bfbf-694c35c35cd4-combined-ca-bundle\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.050545 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07af23e-fa72-4754-b88e-59aa7423bd8e-combined-ca-bundle\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.051203 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07af23e-fa72-4754-b88e-59aa7423bd8e-config-data\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.051312 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb34585-5e6a-440e-bfbf-694c35c35cd4-config-data-custom\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.051345 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l799l\" (UniqueName: \"kubernetes.io/projected/5eb34585-5e6a-440e-bfbf-694c35c35cd4-kube-api-access-l799l\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.051524 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb34585-5e6a-440e-bfbf-694c35c35cd4-config-data\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.051578 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f07af23e-fa72-4754-b88e-59aa7423bd8e-config-data-custom\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.051627 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrr9w\" (UniqueName: \"kubernetes.io/projected/f07af23e-fa72-4754-b88e-59aa7423bd8e-kube-api-access-nrr9w\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.091565 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.153096 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrr9w\" (UniqueName: \"kubernetes.io/projected/f07af23e-fa72-4754-b88e-59aa7423bd8e-kube-api-access-nrr9w\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.153164 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb34585-5e6a-440e-bfbf-694c35c35cd4-combined-ca-bundle\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.153192 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07af23e-fa72-4754-b88e-59aa7423bd8e-combined-ca-bundle\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.153231 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07af23e-fa72-4754-b88e-59aa7423bd8e-config-data\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.153299 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb34585-5e6a-440e-bfbf-694c35c35cd4-config-data-custom\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.153317 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l799l\" (UniqueName: \"kubernetes.io/projected/5eb34585-5e6a-440e-bfbf-694c35c35cd4-kube-api-access-l799l\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.153384 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb34585-5e6a-440e-bfbf-694c35c35cd4-config-data\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.153452 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f07af23e-fa72-4754-b88e-59aa7423bd8e-config-data-custom\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.158793 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f07af23e-fa72-4754-b88e-59aa7423bd8e-combined-ca-bundle\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 
10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.158906 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f07af23e-fa72-4754-b88e-59aa7423bd8e-config-data-custom\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.161537 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb34585-5e6a-440e-bfbf-694c35c35cd4-combined-ca-bundle\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.165609 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb34585-5e6a-440e-bfbf-694c35c35cd4-config-data\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.172726 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb34585-5e6a-440e-bfbf-694c35c35cd4-config-data-custom\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.177529 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l799l\" (UniqueName: \"kubernetes.io/projected/5eb34585-5e6a-440e-bfbf-694c35c35cd4-kube-api-access-l799l\") pod \"heat-cfnapi-d4ff5bc67-qmrg8\" (UID: \"5eb34585-5e6a-440e-bfbf-694c35c35cd4\") " pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.178288 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f07af23e-fa72-4754-b88e-59aa7423bd8e-config-data\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.182931 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrr9w\" (UniqueName: \"kubernetes.io/projected/f07af23e-fa72-4754-b88e-59aa7423bd8e-kube-api-access-nrr9w\") pod \"heat-api-cdb5f48d-n9r67\" (UID: \"f07af23e-fa72-4754-b88e-59aa7423bd8e\") " pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.257322 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.364318 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.614263 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-58667d47cd-44kmd"] Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.811806 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-58667d47cd-44kmd" event={"ID":"3ad7f997-13cd-4561-8de6-17685d0d649a","Type":"ContainerStarted","Data":"340bf7de95c4873d9ed25560a212929f49022fa71bea6ccd1da71708f86c7bb5"} Nov 22 10:04:15 crc kubenswrapper[4743]: I1122 10:04:15.937979 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-d4ff5bc67-qmrg8"] Nov 22 10:04:15 crc kubenswrapper[4743]: W1122 10:04:15.938096 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eb34585_5e6a_440e_bfbf_694c35c35cd4.slice/crio-82e56c4196ea88c0b3232ffaee7e325a2baf4d558c4cbb1b7c6fa7bcbef232ff WatchSource:0}: Error finding container 82e56c4196ea88c0b3232ffaee7e325a2baf4d558c4cbb1b7c6fa7bcbef232ff: Status 404 returned error can't find the container with id 82e56c4196ea88c0b3232ffaee7e325a2baf4d558c4cbb1b7c6fa7bcbef232ff Nov 22 10:04:16 crc kubenswrapper[4743]: I1122 10:04:16.009484 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-cdb5f48d-n9r67"] Nov 22 10:04:16 crc kubenswrapper[4743]: I1122 10:04:16.256851 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-76456674f-grnzr" podUID="b30c1d40-0697-4337-ba40-9090dc6988a5" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.116:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.116:8080: connect: connection refused" Nov 22 10:04:16 crc kubenswrapper[4743]: I1122 10:04:16.830423 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-cdb5f48d-n9r67" event={"ID":"f07af23e-fa72-4754-b88e-59aa7423bd8e","Type":"ContainerStarted","Data":"06acd9eac9b2f96f856d8d41701f3b12ac32a61e637c08a850b59aa646d7a71d"} Nov 22 10:04:16 crc kubenswrapper[4743]: I1122 10:04:16.833694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-58667d47cd-44kmd" event={"ID":"3ad7f997-13cd-4561-8de6-17685d0d649a","Type":"ContainerStarted","Data":"1b9052b8ca8bdb6ff4f9b7e71d2cfb2f4553930ed84355db9f823775f525ac95"} Nov 22 10:04:16 crc kubenswrapper[4743]: I1122 10:04:16.833790 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:16 crc kubenswrapper[4743]: I1122 10:04:16.835019 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" event={"ID":"5eb34585-5e6a-440e-bfbf-694c35c35cd4","Type":"ContainerStarted","Data":"82e56c4196ea88c0b3232ffaee7e325a2baf4d558c4cbb1b7c6fa7bcbef232ff"} Nov 22 10:04:16 crc kubenswrapper[4743]: I1122 10:04:16.867844 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-58667d47cd-44kmd" podStartSLOduration=2.867828223 podStartE2EDuration="2.867828223s" podCreationTimestamp="2025-11-22 10:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:04:16.852524303 +0000 UTC m=+6130.558885355" watchObservedRunningTime="2025-11-22 10:04:16.867828223 +0000 UTC m=+6130.574189265" Nov 22 10:04:18 crc kubenswrapper[4743]: I1122 10:04:18.856146 4743 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-cdb5f48d-n9r67" event={"ID":"f07af23e-fa72-4754-b88e-59aa7423bd8e","Type":"ContainerStarted","Data":"f8815432becd6f411e6fd681f8e0259d564c395dbc6d9879f2915f4dcbb42b82"} Nov 22 10:04:18 crc kubenswrapper[4743]: I1122 10:04:18.856744 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:18 crc kubenswrapper[4743]: I1122 10:04:18.861824 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" event={"ID":"5eb34585-5e6a-440e-bfbf-694c35c35cd4","Type":"ContainerStarted","Data":"64c4942e46350cfa84acda7cea7cc1edca3bfc7215908179e99ba0b13d190cd0"} Nov 22 10:04:18 crc kubenswrapper[4743]: I1122 10:04:18.861972 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:18 crc kubenswrapper[4743]: I1122 10:04:18.880332 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-cdb5f48d-n9r67" podStartSLOduration=2.973494778 podStartE2EDuration="4.880312744s" podCreationTimestamp="2025-11-22 10:04:14 +0000 UTC" firstStartedPulling="2025-11-22 10:04:16.046390412 +0000 UTC m=+6129.752751464" lastFinishedPulling="2025-11-22 10:04:17.953208378 +0000 UTC m=+6131.659569430" observedRunningTime="2025-11-22 10:04:18.871349806 +0000 UTC m=+6132.577710858" watchObservedRunningTime="2025-11-22 10:04:18.880312744 +0000 UTC m=+6132.586673796" Nov 22 10:04:18 crc kubenswrapper[4743]: I1122 10:04:18.893741 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" podStartSLOduration=2.870651503 podStartE2EDuration="4.893720819s" podCreationTimestamp="2025-11-22 10:04:14 +0000 UTC" firstStartedPulling="2025-11-22 10:04:15.943092664 +0000 UTC m=+6129.649453716" lastFinishedPulling="2025-11-22 10:04:17.96616198 +0000 UTC m=+6131.672523032" observedRunningTime="2025-11-22 10:04:18.8888861 +0000 UTC m=+6132.595247152" watchObservedRunningTime="2025-11-22 10:04:18.893720819 +0000 UTC m=+6132.600081871" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.001179 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pjw9t"] Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.004722 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.018407 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjw9t"] Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.113116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-utilities\") pod \"redhat-operators-pjw9t\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.113305 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dd9j\" (UniqueName: \"kubernetes.io/projected/9898591b-5eff-4220-91b8-f4108169fed0-kube-api-access-4dd9j\") pod \"redhat-operators-pjw9t\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.113380 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-catalog-content\") pod \"redhat-operators-pjw9t\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.215179 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-utilities\") pod \"redhat-operators-pjw9t\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.215314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dd9j\" (UniqueName: \"kubernetes.io/projected/9898591b-5eff-4220-91b8-f4108169fed0-kube-api-access-4dd9j\") pod \"redhat-operators-pjw9t\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.215354 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-catalog-content\") pod \"redhat-operators-pjw9t\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.216069 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-utilities\") pod \"redhat-operators-pjw9t\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.216115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-catalog-content\") pod \"redhat-operators-pjw9t\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.239707 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4dd9j\" (UniqueName: \"kubernetes.io/projected/9898591b-5eff-4220-91b8-f4108169fed0-kube-api-access-4dd9j\") pod \"redhat-operators-pjw9t\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.324738 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:22 crc kubenswrapper[4743]: W1122 10:04:22.876605 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9898591b_5eff_4220_91b8_f4108169fed0.slice/crio-85713d5801ca27515688c278c3422daea7e5f817dd5baadb452bdd6756e3f15a WatchSource:0}: Error finding container 85713d5801ca27515688c278c3422daea7e5f817dd5baadb452bdd6756e3f15a: Status 404 returned error can't find the container with id 85713d5801ca27515688c278c3422daea7e5f817dd5baadb452bdd6756e3f15a Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.887380 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjw9t"] Nov 22 10:04:22 crc kubenswrapper[4743]: I1122 10:04:22.897569 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw9t" event={"ID":"9898591b-5eff-4220-91b8-f4108169fed0","Type":"ContainerStarted","Data":"85713d5801ca27515688c278c3422daea7e5f817dd5baadb452bdd6756e3f15a"} Nov 22 10:04:23 crc kubenswrapper[4743]: I1122 10:04:23.907285 4743 generic.go:334] "Generic (PLEG): container finished" podID="9898591b-5eff-4220-91b8-f4108169fed0" containerID="85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39" exitCode=0 Nov 22 10:04:23 crc kubenswrapper[4743]: I1122 10:04:23.907339 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw9t" event={"ID":"9898591b-5eff-4220-91b8-f4108169fed0","Type":"ContainerDied","Data":"85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39"} Nov 22 10:04:24 crc kubenswrapper[4743]: I1122 10:04:24.919056 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw9t" event={"ID":"9898591b-5eff-4220-91b8-f4108169fed0","Type":"ContainerStarted","Data":"c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45"} Nov 22 10:04:26 crc kubenswrapper[4743]: I1122 10:04:26.839953 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-d4ff5bc67-qmrg8" Nov 22 10:04:26 crc kubenswrapper[4743]: I1122 10:04:26.856991 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-cdb5f48d-n9r67" Nov 22 10:04:26 crc kubenswrapper[4743]: I1122 10:04:26.950790 4743 generic.go:334] "Generic (PLEG): container finished" podID="9898591b-5eff-4220-91b8-f4108169fed0" containerID="c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45" exitCode=0 Nov 22 10:04:26 crc kubenswrapper[4743]: I1122 10:04:26.950839 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw9t" event={"ID":"9898591b-5eff-4220-91b8-f4108169fed0","Type":"ContainerDied","Data":"c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45"} Nov 22 10:04:28 crc kubenswrapper[4743]: I1122 10:04:28.972106 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw9t" 
event={"ID":"9898591b-5eff-4220-91b8-f4108169fed0","Type":"ContainerStarted","Data":"47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60"} Nov 22 10:04:28 crc kubenswrapper[4743]: I1122 10:04:28.999413 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pjw9t" podStartSLOduration=3.967065524 podStartE2EDuration="7.999391048s" podCreationTimestamp="2025-11-22 10:04:21 +0000 UTC" firstStartedPulling="2025-11-22 10:04:23.910181569 +0000 UTC m=+6137.616542621" lastFinishedPulling="2025-11-22 10:04:27.942507093 +0000 UTC m=+6141.648868145" observedRunningTime="2025-11-22 10:04:28.989231567 +0000 UTC m=+6142.695592629" watchObservedRunningTime="2025-11-22 10:04:28.999391048 +0000 UTC m=+6142.705752100" Nov 22 10:04:29 crc kubenswrapper[4743]: I1122 10:04:29.364107 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-76456674f-grnzr" Nov 22 10:04:31 crc kubenswrapper[4743]: I1122 10:04:31.059237 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-76456674f-grnzr" Nov 22 10:04:31 crc kubenswrapper[4743]: I1122 10:04:31.123959 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-864787bfcc-j7q6g"] Nov 22 10:04:31 crc kubenswrapper[4743]: I1122 10:04:31.124493 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-864787bfcc-j7q6g" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon-log" containerID="cri-o://cb36c7c80b2e712f0bd470bfc184dfcc805f9602785042793b521f21829f8c35" gracePeriod=30 Nov 22 10:04:31 crc kubenswrapper[4743]: I1122 10:04:31.124955 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-864787bfcc-j7q6g" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon" containerID="cri-o://e7fac22848c24cbb3a1ba66cdf5b985d198f13c3e69dab4a759754935b1ff1b8" gracePeriod=30 Nov 22 10:04:32 crc kubenswrapper[4743]: I1122 10:04:32.325670 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:32 crc kubenswrapper[4743]: I1122 10:04:32.326731 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:33 crc kubenswrapper[4743]: I1122 10:04:33.376543 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pjw9t" podUID="9898591b-5eff-4220-91b8-f4108169fed0" containerName="registry-server" probeResult="failure" output=< Nov 22 10:04:33 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 10:04:33 crc kubenswrapper[4743]: > Nov 22 10:04:34 crc kubenswrapper[4743]: I1122 10:04:34.263615 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-864787bfcc-j7q6g" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:55498->10.217.1.112:8080: read: connection reset by peer" Nov 22 10:04:34 crc kubenswrapper[4743]: I1122 10:04:34.860315 4743 scope.go:117] "RemoveContainer" containerID="16c17b1cc2fe7a0d8ae6d9df0aad11bfc587664e8ee04a95e2b8d9a6b30816d4" Nov 22 10:04:34 crc kubenswrapper[4743]: I1122 10:04:34.886686 4743 scope.go:117] "RemoveContainer" 
containerID="9c80c72a0cf5d2809f1b57d692a40d1025cebd7e615dbbdaaa50c9df90f881cf" Nov 22 10:04:34 crc kubenswrapper[4743]: I1122 10:04:34.926391 4743 scope.go:117] "RemoveContainer" containerID="f96543829f351b01dcd4c35027a21e98cf0094d6537039b3477ae3be99769d47" Nov 22 10:04:35 crc kubenswrapper[4743]: I1122 10:04:35.026021 4743 generic.go:334] "Generic (PLEG): container finished" podID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerID="e7fac22848c24cbb3a1ba66cdf5b985d198f13c3e69dab4a759754935b1ff1b8" exitCode=0 Nov 22 10:04:35 crc kubenswrapper[4743]: I1122 10:04:35.026062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-864787bfcc-j7q6g" event={"ID":"b445ab4a-9430-4c45-bd96-006a3c22598e","Type":"ContainerDied","Data":"e7fac22848c24cbb3a1ba66cdf5b985d198f13c3e69dab4a759754935b1ff1b8"} Nov 22 10:04:35 crc kubenswrapper[4743]: I1122 10:04:35.123099 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-58667d47cd-44kmd" Nov 22 10:04:42 crc kubenswrapper[4743]: I1122 10:04:42.373014 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:42 crc kubenswrapper[4743]: I1122 10:04:42.422106 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:43 crc kubenswrapper[4743]: I1122 10:04:43.499881 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-864787bfcc-j7q6g" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Nov 22 10:04:44 crc kubenswrapper[4743]: I1122 10:04:44.441712 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pjw9t"] Nov 22 10:04:44 crc kubenswrapper[4743]: I1122 10:04:44.448117 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pjw9t" podUID="9898591b-5eff-4220-91b8-f4108169fed0" containerName="registry-server" containerID="cri-o://47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60" gracePeriod=2 Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.110386 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.203443 4743 generic.go:334] "Generic (PLEG): container finished" podID="9898591b-5eff-4220-91b8-f4108169fed0" containerID="47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60" exitCode=0 Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.203488 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw9t" event={"ID":"9898591b-5eff-4220-91b8-f4108169fed0","Type":"ContainerDied","Data":"47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60"} Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.203512 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pjw9t" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.203530 4743 scope.go:117] "RemoveContainer" containerID="47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.203517 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjw9t" event={"ID":"9898591b-5eff-4220-91b8-f4108169fed0","Type":"ContainerDied","Data":"85713d5801ca27515688c278c3422daea7e5f817dd5baadb452bdd6756e3f15a"} Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.223537 4743 scope.go:117] "RemoveContainer" containerID="c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.226850 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dd9j\" (UniqueName: \"kubernetes.io/projected/9898591b-5eff-4220-91b8-f4108169fed0-kube-api-access-4dd9j\") pod \"9898591b-5eff-4220-91b8-f4108169fed0\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.226970 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-utilities\") pod \"9898591b-5eff-4220-91b8-f4108169fed0\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.227089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-catalog-content\") pod \"9898591b-5eff-4220-91b8-f4108169fed0\" (UID: \"9898591b-5eff-4220-91b8-f4108169fed0\") " Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.231125 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-utilities" (OuterVolumeSpecName: "utilities") pod "9898591b-5eff-4220-91b8-f4108169fed0" (UID: "9898591b-5eff-4220-91b8-f4108169fed0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.236711 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9898591b-5eff-4220-91b8-f4108169fed0-kube-api-access-4dd9j" (OuterVolumeSpecName: "kube-api-access-4dd9j") pod "9898591b-5eff-4220-91b8-f4108169fed0" (UID: "9898591b-5eff-4220-91b8-f4108169fed0"). InnerVolumeSpecName "kube-api-access-4dd9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.246912 4743 scope.go:117] "RemoveContainer" containerID="85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.328889 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9898591b-5eff-4220-91b8-f4108169fed0" (UID: "9898591b-5eff-4220-91b8-f4108169fed0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.329197 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.329234 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9898591b-5eff-4220-91b8-f4108169fed0-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.329245 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dd9j\" (UniqueName: \"kubernetes.io/projected/9898591b-5eff-4220-91b8-f4108169fed0-kube-api-access-4dd9j\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.366322 4743 scope.go:117] "RemoveContainer" containerID="47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60" Nov 22 10:04:45 crc kubenswrapper[4743]: E1122 10:04:45.367893 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60\": container with ID starting with 47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60 not found: ID does not exist" containerID="47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.367944 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60"} err="failed to get container status \"47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60\": rpc error: code = NotFound desc = could not find container \"47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60\": container with ID starting with 47ae2a26517e20e89f07e878bf5e9682b822e6d3999e2d6190eab2290d1aee60 not found: ID does not exist" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.367978 4743 scope.go:117] "RemoveContainer" containerID="c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45" Nov 22 10:04:45 crc kubenswrapper[4743]: E1122 10:04:45.368542 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45\": container with ID starting with c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45 not found: ID does not exist" containerID="c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.368565 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45"} err="failed to get container status \"c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45\": rpc error: code = NotFound desc = could not find container \"c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45\": container with ID starting with c5f159d0351110d65d7e714305dd7204225f908a61bb1553e77243f399858e45 not found: ID does not exist" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.368581 4743 scope.go:117] "RemoveContainer" containerID="85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39" Nov 22 10:04:45 crc 
kubenswrapper[4743]: E1122 10:04:45.369085 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39\": container with ID starting with 85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39 not found: ID does not exist" containerID="85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.369139 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39"} err="failed to get container status \"85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39\": rpc error: code = NotFound desc = could not find container \"85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39\": container with ID starting with 85a8736541c48d70861ddf0d9e75a12daeff31a3ac6d3486a6ca6bdefcd61d39 not found: ID does not exist" Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.536890 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pjw9t"] Nov 22 10:04:45 crc kubenswrapper[4743]: I1122 10:04:45.545241 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pjw9t"] Nov 22 10:04:47 crc kubenswrapper[4743]: I1122 10:04:47.164969 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9898591b-5eff-4220-91b8-f4108169fed0" path="/var/lib/kubelet/pods/9898591b-5eff-4220-91b8-f4108169fed0/volumes" Nov 22 10:04:49 crc kubenswrapper[4743]: I1122 10:04:49.965778 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8"] Nov 22 10:04:49 crc kubenswrapper[4743]: E1122 10:04:49.966484 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9898591b-5eff-4220-91b8-f4108169fed0" containerName="extract-utilities" Nov 22 10:04:49 crc kubenswrapper[4743]: I1122 10:04:49.966497 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9898591b-5eff-4220-91b8-f4108169fed0" containerName="extract-utilities" Nov 22 10:04:49 crc kubenswrapper[4743]: E1122 10:04:49.966511 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9898591b-5eff-4220-91b8-f4108169fed0" containerName="extract-content" Nov 22 10:04:49 crc kubenswrapper[4743]: I1122 10:04:49.966516 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9898591b-5eff-4220-91b8-f4108169fed0" containerName="extract-content" Nov 22 10:04:49 crc kubenswrapper[4743]: E1122 10:04:49.966553 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9898591b-5eff-4220-91b8-f4108169fed0" containerName="registry-server" Nov 22 10:04:49 crc kubenswrapper[4743]: I1122 10:04:49.966559 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9898591b-5eff-4220-91b8-f4108169fed0" containerName="registry-server" Nov 22 10:04:49 crc kubenswrapper[4743]: I1122 10:04:49.966763 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9898591b-5eff-4220-91b8-f4108169fed0" containerName="registry-server" Nov 22 10:04:49 crc kubenswrapper[4743]: I1122 10:04:49.968212 4743 util.go:30] "No sandbox for pod can be found. 
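The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triples above are benign: the containers were already removed, so CRI-O answers the status lookup with gRPC NotFound and the kubelet logs the error and moves on. The general idempotent-cleanup pattern, sketched with a hypothetical remove function (the function name and IDs here are illustrative, not kubelet API):

    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // cleanup treats NotFound as success: if the container is already gone,
    // the desired end state ("container removed") already holds.
    func cleanup(remove func(id string) error, id string) error {
    	if err := remove(id); err != nil {
    		if status.Code(err) == codes.NotFound {
    			return nil // already gone; nothing to do
    		}
    		return fmt.Errorf("remove %s: %w", id, err)
    	}
    	return nil
    }

    func main() {
    	gone := func(id string) error {
    		return status.Error(codes.NotFound, "could not find container "+id)
    	}
    	fmt.Println(cleanup(gone, "47ae2a26517e")) // prints <nil>
    }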
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:49 crc kubenswrapper[4743]: I1122 10:04:49.971981 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Nov 22 10:04:49 crc kubenswrapper[4743]: I1122 10:04:49.977014 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8"] Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.023105 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.023207 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.023235 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6f5d\" (UniqueName: \"kubernetes.io/projected/a0444a53-315b-4e96-852f-5a5db7824935-kube-api-access-z6f5d\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.124955 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.125009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6f5d\" (UniqueName: \"kubernetes.io/projected/a0444a53-315b-4e96-852f-5a5db7824935-kube-api-access-z6f5d\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.125139 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.125537 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.125581 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.145403 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6f5d\" (UniqueName: \"kubernetes.io/projected/a0444a53-315b-4e96-852f-5a5db7824935-kube-api-access-z6f5d\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.298015 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:50 crc kubenswrapper[4743]: I1122 10:04:50.739890 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8"] Nov 22 10:04:51 crc kubenswrapper[4743]: I1122 10:04:51.270154 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0444a53-315b-4e96-852f-5a5db7824935" containerID="c6206c28860d9544183f42771cc2c94f90ae58f5851407bd9442c2f29c08c634" exitCode=0 Nov 22 10:04:51 crc kubenswrapper[4743]: I1122 10:04:51.270232 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" event={"ID":"a0444a53-315b-4e96-852f-5a5db7824935","Type":"ContainerDied","Data":"c6206c28860d9544183f42771cc2c94f90ae58f5851407bd9442c2f29c08c634"} Nov 22 10:04:51 crc kubenswrapper[4743]: I1122 10:04:51.270261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" event={"ID":"a0444a53-315b-4e96-852f-5a5db7824935","Type":"ContainerStarted","Data":"0fc017b564058f799a1b1314c14f3d6b145702089e0a53cdd40e2fa2707b77d1"} Nov 22 10:04:53 crc kubenswrapper[4743]: I1122 10:04:53.287830 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0444a53-315b-4e96-852f-5a5db7824935" containerID="2356258725be7e9a95a3f719676ceaae35ff4a1de7ad7ab7d6a73eb3f1bc32c7" exitCode=0 Nov 22 10:04:53 crc kubenswrapper[4743]: I1122 10:04:53.287921 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" event={"ID":"a0444a53-315b-4e96-852f-5a5db7824935","Type":"ContainerDied","Data":"2356258725be7e9a95a3f719676ceaae35ff4a1de7ad7ab7d6a73eb3f1bc32c7"} Nov 22 10:04:53 crc kubenswrapper[4743]: I1122 10:04:53.499983 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-864787bfcc-j7q6g" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon" probeResult="failure" output="Get 
\"http://10.217.1.112:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.112:8080: connect: connection refused" Nov 22 10:04:53 crc kubenswrapper[4743]: I1122 10:04:53.500088 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:04:54 crc kubenswrapper[4743]: I1122 10:04:54.301477 4743 generic.go:334] "Generic (PLEG): container finished" podID="a0444a53-315b-4e96-852f-5a5db7824935" containerID="977d0585156348c0702febe02b0efb1b42cddc9b9bb71fd0653c6103caddec4d" exitCode=0 Nov 22 10:04:54 crc kubenswrapper[4743]: I1122 10:04:54.301552 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" event={"ID":"a0444a53-315b-4e96-852f-5a5db7824935","Type":"ContainerDied","Data":"977d0585156348c0702febe02b0efb1b42cddc9b9bb71fd0653c6103caddec4d"} Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.045389 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-vcdq2"] Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.053969 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-vcdq2"] Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.163049 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1325d49a-c56f-4183-a0aa-6f558767ccaa" path="/var/lib/kubelet/pods/1325d49a-c56f-4183-a0aa-6f558767ccaa/volumes" Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.768359 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.861984 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-bundle\") pod \"a0444a53-315b-4e96-852f-5a5db7824935\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.862150 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6f5d\" (UniqueName: \"kubernetes.io/projected/a0444a53-315b-4e96-852f-5a5db7824935-kube-api-access-z6f5d\") pod \"a0444a53-315b-4e96-852f-5a5db7824935\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.862238 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-util\") pod \"a0444a53-315b-4e96-852f-5a5db7824935\" (UID: \"a0444a53-315b-4e96-852f-5a5db7824935\") " Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.864394 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-bundle" (OuterVolumeSpecName: "bundle") pod "a0444a53-315b-4e96-852f-5a5db7824935" (UID: "a0444a53-315b-4e96-852f-5a5db7824935"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.868457 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0444a53-315b-4e96-852f-5a5db7824935-kube-api-access-z6f5d" (OuterVolumeSpecName: "kube-api-access-z6f5d") pod "a0444a53-315b-4e96-852f-5a5db7824935" (UID: "a0444a53-315b-4e96-852f-5a5db7824935"). 
InnerVolumeSpecName "kube-api-access-z6f5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.876207 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-util" (OuterVolumeSpecName: "util") pod "a0444a53-315b-4e96-852f-5a5db7824935" (UID: "a0444a53-315b-4e96-852f-5a5db7824935"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.964415 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6f5d\" (UniqueName: \"kubernetes.io/projected/a0444a53-315b-4e96-852f-5a5db7824935-kube-api-access-z6f5d\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.964684 4743 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-util\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:55 crc kubenswrapper[4743]: I1122 10:04:55.964753 4743 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0444a53-315b-4e96-852f-5a5db7824935-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.028064 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-405d-account-create-842jc"] Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.040747 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wbsdw"] Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.051309 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-405d-account-create-842jc"] Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.059300 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-c0ec-account-create-hfj6d"] Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.067329 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8c6d-account-create-585ck"] Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.076380 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wbsdw"] Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.084784 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-m99g9"] Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.092160 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8c6d-account-create-585ck"] Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.101720 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-c0ec-account-create-hfj6d"] Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.106971 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-m99g9"] Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.322429 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" event={"ID":"a0444a53-315b-4e96-852f-5a5db7824935","Type":"ContainerDied","Data":"0fc017b564058f799a1b1314c14f3d6b145702089e0a53cdd40e2fa2707b77d1"} Nov 22 10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.322820 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fc017b564058f799a1b1314c14f3d6b145702089e0a53cdd40e2fa2707b77d1" Nov 22 
10:04:56 crc kubenswrapper[4743]: I1122 10:04:56.322502 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8" Nov 22 10:04:57 crc kubenswrapper[4743]: I1122 10:04:57.167022 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03224792-7994-47a0-bd4a-68c2e394b3c1" path="/var/lib/kubelet/pods/03224792-7994-47a0-bd4a-68c2e394b3c1/volumes" Nov 22 10:04:57 crc kubenswrapper[4743]: I1122 10:04:57.168980 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="073cacc3-d575-4696-a875-9181e9d250d9" path="/var/lib/kubelet/pods/073cacc3-d575-4696-a875-9181e9d250d9/volumes" Nov 22 10:04:57 crc kubenswrapper[4743]: I1122 10:04:57.170212 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b713ccde-5f99-47ae-8c30-78677acde194" path="/var/lib/kubelet/pods/b713ccde-5f99-47ae-8c30-78677acde194/volumes" Nov 22 10:04:57 crc kubenswrapper[4743]: I1122 10:04:57.173259 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d16badcc-aa28-40f5-ac9e-71a23ee6e209" path="/var/lib/kubelet/pods/d16badcc-aa28-40f5-ac9e-71a23ee6e209/volumes" Nov 22 10:04:57 crc kubenswrapper[4743]: I1122 10:04:57.175483 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbb6248e-e10b-43ee-aa8c-d9e1bba1219b" path="/var/lib/kubelet/pods/dbb6248e-e10b-43ee-aa8c-d9e1bba1219b/volumes" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.379837 4743 generic.go:334] "Generic (PLEG): container finished" podID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerID="cb36c7c80b2e712f0bd470bfc184dfcc805f9602785042793b521f21829f8c35" exitCode=137 Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.380405 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-864787bfcc-j7q6g" event={"ID":"b445ab4a-9430-4c45-bd96-006a3c22598e","Type":"ContainerDied","Data":"cb36c7c80b2e712f0bd470bfc184dfcc805f9602785042793b521f21829f8c35"} Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.663405 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.790298 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-scripts\") pod \"b445ab4a-9430-4c45-bd96-006a3c22598e\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.790376 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b445ab4a-9430-4c45-bd96-006a3c22598e-horizon-secret-key\") pod \"b445ab4a-9430-4c45-bd96-006a3c22598e\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.790443 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzpjh\" (UniqueName: \"kubernetes.io/projected/b445ab4a-9430-4c45-bd96-006a3c22598e-kube-api-access-lzpjh\") pod \"b445ab4a-9430-4c45-bd96-006a3c22598e\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.790567 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-config-data\") pod \"b445ab4a-9430-4c45-bd96-006a3c22598e\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.790662 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b445ab4a-9430-4c45-bd96-006a3c22598e-logs\") pod \"b445ab4a-9430-4c45-bd96-006a3c22598e\" (UID: \"b445ab4a-9430-4c45-bd96-006a3c22598e\") " Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.793828 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b445ab4a-9430-4c45-bd96-006a3c22598e-logs" (OuterVolumeSpecName: "logs") pod "b445ab4a-9430-4c45-bd96-006a3c22598e" (UID: "b445ab4a-9430-4c45-bd96-006a3c22598e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.798890 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b445ab4a-9430-4c45-bd96-006a3c22598e-kube-api-access-lzpjh" (OuterVolumeSpecName: "kube-api-access-lzpjh") pod "b445ab4a-9430-4c45-bd96-006a3c22598e" (UID: "b445ab4a-9430-4c45-bd96-006a3c22598e"). InnerVolumeSpecName "kube-api-access-lzpjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.826742 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b445ab4a-9430-4c45-bd96-006a3c22598e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b445ab4a-9430-4c45-bd96-006a3c22598e" (UID: "b445ab4a-9430-4c45-bd96-006a3c22598e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.847963 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-config-data" (OuterVolumeSpecName: "config-data") pod "b445ab4a-9430-4c45-bd96-006a3c22598e" (UID: "b445ab4a-9430-4c45-bd96-006a3c22598e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.848816 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-scripts" (OuterVolumeSpecName: "scripts") pod "b445ab4a-9430-4c45-bd96-006a3c22598e" (UID: "b445ab4a-9430-4c45-bd96-006a3c22598e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.903830 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.903868 4743 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b445ab4a-9430-4c45-bd96-006a3c22598e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.903881 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzpjh\" (UniqueName: \"kubernetes.io/projected/b445ab4a-9430-4c45-bd96-006a3c22598e-kube-api-access-lzpjh\") on node \"crc\" DevicePath \"\"" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.903889 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b445ab4a-9430-4c45-bd96-006a3c22598e-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:05:01 crc kubenswrapper[4743]: I1122 10:05:01.903897 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b445ab4a-9430-4c45-bd96-006a3c22598e-logs\") on node \"crc\" DevicePath \"\"" Nov 22 10:05:02 crc kubenswrapper[4743]: I1122 10:05:02.390213 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-864787bfcc-j7q6g" event={"ID":"b445ab4a-9430-4c45-bd96-006a3c22598e","Type":"ContainerDied","Data":"f17f88dafa1db6a8c70b36780a13f99f5e8e3267948dd6ecbe18e54004345013"} Nov 22 10:05:02 crc kubenswrapper[4743]: I1122 10:05:02.390274 4743 scope.go:117] "RemoveContainer" containerID="e7fac22848c24cbb3a1ba66cdf5b985d198f13c3e69dab4a759754935b1ff1b8" Nov 22 10:05:02 crc kubenswrapper[4743]: I1122 10:05:02.390320 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-864787bfcc-j7q6g" Nov 22 10:05:02 crc kubenswrapper[4743]: I1122 10:05:02.421783 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-864787bfcc-j7q6g"] Nov 22 10:05:02 crc kubenswrapper[4743]: I1122 10:05:02.430344 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-864787bfcc-j7q6g"] Nov 22 10:05:02 crc kubenswrapper[4743]: I1122 10:05:02.568719 4743 scope.go:117] "RemoveContainer" containerID="cb36c7c80b2e712f0bd470bfc184dfcc805f9602785042793b521f21829f8c35" Nov 22 10:05:03 crc kubenswrapper[4743]: I1122 10:05:03.167889 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" path="/var/lib/kubelet/pods/b445ab4a-9430-4c45-bd96-006a3c22598e/volumes" Nov 22 10:05:06 crc kubenswrapper[4743]: I1122 10:05:06.073632 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbrlx"] Nov 22 10:05:06 crc kubenswrapper[4743]: I1122 10:05:06.089754 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wbrlx"] Nov 22 10:05:07 crc kubenswrapper[4743]: I1122 10:05:07.163861 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c9cec77-56f5-4914-9309-19696765d0db" path="/var/lib/kubelet/pods/5c9cec77-56f5-4914-9309-19696765d0db/volumes" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.885396 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs"] Nov 22 10:05:08 crc kubenswrapper[4743]: E1122 10:05:08.886570 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0444a53-315b-4e96-852f-5a5db7824935" containerName="extract" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.886595 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0444a53-315b-4e96-852f-5a5db7824935" containerName="extract" Nov 22 10:05:08 crc kubenswrapper[4743]: E1122 10:05:08.886624 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.886630 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon" Nov 22 10:05:08 crc kubenswrapper[4743]: E1122 10:05:08.886664 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0444a53-315b-4e96-852f-5a5db7824935" containerName="util" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.886671 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0444a53-315b-4e96-852f-5a5db7824935" containerName="util" Nov 22 10:05:08 crc kubenswrapper[4743]: E1122 10:05:08.886698 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0444a53-315b-4e96-852f-5a5db7824935" containerName="pull" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.886705 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0444a53-315b-4e96-852f-5a5db7824935" containerName="pull" Nov 22 10:05:08 crc kubenswrapper[4743]: E1122 10:05:08.886716 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon-log" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.886722 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon-log" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.887038 
4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0444a53-315b-4e96-852f-5a5db7824935" containerName="extract" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.887077 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon-log" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.887094 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b445ab4a-9430-4c45-bd96-006a3c22598e" containerName="horizon" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.887998 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.893317 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.894155 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.895964 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-pnxbp" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.902448 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs"] Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.921240 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht"] Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.922675 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.926847 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-w26zr" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.927160 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.928509 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf"] Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.929901 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.974125 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht"] Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.976308 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf"] Nov 22 10:05:08 crc kubenswrapper[4743]: I1122 10:05:08.977049 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcmms\" (UniqueName: \"kubernetes.io/projected/518037c9-5978-4eae-bca7-bf63f9dc5b50-kube-api-access-tcmms\") pod \"obo-prometheus-operator-668cf9dfbb-7htfs\" (UID: \"518037c9-5978-4eae-bca7-bf63f9dc5b50\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.055209 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-mm5nj"] Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.056556 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.059035 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-mj52q" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.059092 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.073840 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-mm5nj"] Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.079238 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcmms\" (UniqueName: \"kubernetes.io/projected/518037c9-5978-4eae-bca7-bf63f9dc5b50-kube-api-access-tcmms\") pod \"obo-prometheus-operator-668cf9dfbb-7htfs\" (UID: \"518037c9-5978-4eae-bca7-bf63f9dc5b50\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.079319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf\" (UID: \"9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.079343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf\" (UID: \"9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.079399 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht\" (UID: \"4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.079434 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht\" (UID: \"4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.113182 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcmms\" (UniqueName: \"kubernetes.io/projected/518037c9-5978-4eae-bca7-bf63f9dc5b50-kube-api-access-tcmms\") pod \"obo-prometheus-operator-668cf9dfbb-7htfs\" (UID: \"518037c9-5978-4eae-bca7-bf63f9dc5b50\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.181058 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf\" (UID: \"9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.181372 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf\" (UID: \"9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.181437 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht\" (UID: \"4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.181476 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht\" (UID: \"4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.181503 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/86470a8e-5cc0-4f60-9cf7-f9675599a769-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-mm5nj\" (UID: \"86470a8e-5cc0-4f60-9cf7-f9675599a769\") " pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.181604 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7f9b\" (UniqueName: \"kubernetes.io/projected/86470a8e-5cc0-4f60-9cf7-f9675599a769-kube-api-access-p7f9b\") pod \"observability-operator-d8bb48f5d-mm5nj\" (UID: \"86470a8e-5cc0-4f60-9cf7-f9675599a769\") " pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.184504 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht\" (UID: \"4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.184725 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf\" (UID: \"9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.184822 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht\" (UID: \"4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.184868 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf\" (UID: \"9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.222794 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.246314 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-gdwmk"] Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.247683 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-gdwmk" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.252088 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6l82d" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.277637 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-gdwmk"] Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.278104 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.284389 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/86470a8e-5cc0-4f60-9cf7-f9675599a769-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-mm5nj\" (UID: \"86470a8e-5cc0-4f60-9cf7-f9675599a769\") " pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.284552 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7f9b\" (UniqueName: \"kubernetes.io/projected/86470a8e-5cc0-4f60-9cf7-f9675599a769-kube-api-access-p7f9b\") pod \"observability-operator-d8bb48f5d-mm5nj\" (UID: \"86470a8e-5cc0-4f60-9cf7-f9675599a769\") " pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.292154 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.292252 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/86470a8e-5cc0-4f60-9cf7-f9675599a769-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-mm5nj\" (UID: \"86470a8e-5cc0-4f60-9cf7-f9675599a769\") " pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.326900 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7f9b\" (UniqueName: \"kubernetes.io/projected/86470a8e-5cc0-4f60-9cf7-f9675599a769-kube-api-access-p7f9b\") pod \"observability-operator-d8bb48f5d-mm5nj\" (UID: \"86470a8e-5cc0-4f60-9cf7-f9675599a769\") " pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.386651 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a0a8704-e5ab-4bed-bdd6-bb291e7823e5-openshift-service-ca\") pod \"perses-operator-5446b9c989-gdwmk\" (UID: \"1a0a8704-e5ab-4bed-bdd6-bb291e7823e5\") " pod="openshift-operators/perses-operator-5446b9c989-gdwmk" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.386755 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2hpv\" (UniqueName: \"kubernetes.io/projected/1a0a8704-e5ab-4bed-bdd6-bb291e7823e5-kube-api-access-m2hpv\") pod \"perses-operator-5446b9c989-gdwmk\" (UID: \"1a0a8704-e5ab-4bed-bdd6-bb291e7823e5\") " pod="openshift-operators/perses-operator-5446b9c989-gdwmk" Nov 22 10:05:09 crc kubenswrapper[4743]: I1122 10:05:09.397094 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" Nov 22 10:05:10 crc kubenswrapper[4743]: I1122 10:05:09.488155 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a0a8704-e5ab-4bed-bdd6-bb291e7823e5-openshift-service-ca\") pod \"perses-operator-5446b9c989-gdwmk\" (UID: \"1a0a8704-e5ab-4bed-bdd6-bb291e7823e5\") " pod="openshift-operators/perses-operator-5446b9c989-gdwmk" Nov 22 10:05:10 crc kubenswrapper[4743]: I1122 10:05:09.488506 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2hpv\" (UniqueName: \"kubernetes.io/projected/1a0a8704-e5ab-4bed-bdd6-bb291e7823e5-kube-api-access-m2hpv\") pod \"perses-operator-5446b9c989-gdwmk\" (UID: \"1a0a8704-e5ab-4bed-bdd6-bb291e7823e5\") " pod="openshift-operators/perses-operator-5446b9c989-gdwmk" Nov 22 10:05:10 crc kubenswrapper[4743]: I1122 10:05:09.489342 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a0a8704-e5ab-4bed-bdd6-bb291e7823e5-openshift-service-ca\") pod \"perses-operator-5446b9c989-gdwmk\" (UID: \"1a0a8704-e5ab-4bed-bdd6-bb291e7823e5\") " pod="openshift-operators/perses-operator-5446b9c989-gdwmk" Nov 22 10:05:10 crc kubenswrapper[4743]: I1122 10:05:09.507087 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2hpv\" (UniqueName: \"kubernetes.io/projected/1a0a8704-e5ab-4bed-bdd6-bb291e7823e5-kube-api-access-m2hpv\") pod \"perses-operator-5446b9c989-gdwmk\" (UID: \"1a0a8704-e5ab-4bed-bdd6-bb291e7823e5\") " pod="openshift-operators/perses-operator-5446b9c989-gdwmk" Nov 22 10:05:10 crc kubenswrapper[4743]: I1122 10:05:09.670627 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-gdwmk" Nov 22 10:05:10 crc kubenswrapper[4743]: I1122 10:05:10.668697 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-gdwmk"] Nov 22 10:05:10 crc kubenswrapper[4743]: I1122 10:05:10.680838 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht"] Nov 22 10:05:10 crc kubenswrapper[4743]: I1122 10:05:10.691916 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf"] Nov 22 10:05:10 crc kubenswrapper[4743]: I1122 10:05:10.699688 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-mm5nj"] Nov 22 10:05:10 crc kubenswrapper[4743]: I1122 10:05:10.707499 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs"] Nov 22 10:05:11 crc kubenswrapper[4743]: I1122 10:05:11.482249 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" event={"ID":"86470a8e-5cc0-4f60-9cf7-f9675599a769","Type":"ContainerStarted","Data":"cbd4960897047349b4ae5d33f6f8336c770d056731e7931d562b8eafb18d5909"} Nov 22 10:05:11 crc kubenswrapper[4743]: I1122 10:05:11.485044 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" event={"ID":"4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1","Type":"ContainerStarted","Data":"b2dfa7444f6efcbcf19d75396bc1801ca73873300e20bb244bd415fbcc3268bd"} Nov 22 10:05:11 crc kubenswrapper[4743]: I1122 10:05:11.486078 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" event={"ID":"9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf","Type":"ContainerStarted","Data":"f2682a4dc13d64ab867e1973562f4cddb2f45d2e79f3eafbb46e65cc2d9e5cad"} Nov 22 10:05:11 crc kubenswrapper[4743]: I1122 10:05:11.487137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs" event={"ID":"518037c9-5978-4eae-bca7-bf63f9dc5b50","Type":"ContainerStarted","Data":"f2137154669ddce932fe5cfdc5d9a909ce4271f5d48223972c2f635299428cbe"} Nov 22 10:05:11 crc kubenswrapper[4743]: I1122 10:05:11.488272 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-gdwmk" event={"ID":"1a0a8704-e5ab-4bed-bdd6-bb291e7823e5","Type":"ContainerStarted","Data":"3fd7997c45f2c328a1b1439945d300c6676dd103d74aff391aeaba77645d7d66"} Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.649930 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" event={"ID":"4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1","Type":"ContainerStarted","Data":"46b155493a08d1b330598d0c583a6eee03c7c322d8234761bf11a49efa9a02bb"} Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.653986 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" event={"ID":"9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf","Type":"ContainerStarted","Data":"52c8295fa3dd9884a68a48e96c89fa64ba563cf5ffd1e1706b1fb3a9551814f4"} Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.659075 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs" event={"ID":"518037c9-5978-4eae-bca7-bf63f9dc5b50","Type":"ContainerStarted","Data":"f85e26e1ed722a82317c0579748bdfdb81c93f9ecd76999d8ca6dbe1a0f64407"} Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.662710 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-gdwmk" event={"ID":"1a0a8704-e5ab-4bed-bdd6-bb291e7823e5","Type":"ContainerStarted","Data":"b181e3be861f569854e5587f23d5f767f6ed026ec1691a49963bb3691b5dc5bb"} Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.663327 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-gdwmk" Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.664632 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" event={"ID":"86470a8e-5cc0-4f60-9cf7-f9675599a769","Type":"ContainerStarted","Data":"1892db366d12cfe8e6243bcc0b59fd4c213850339940704cc7ea2350de96f64f"} Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.665528 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.674913 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.678055 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht" podStartSLOduration=3.592432049 podStartE2EDuration="13.678042031s" podCreationTimestamp="2025-11-22 10:05:08 +0000 UTC" firstStartedPulling="2025-11-22 10:05:10.738236407 +0000 UTC m=+6184.444597459" lastFinishedPulling="2025-11-22 10:05:20.823846379 +0000 UTC m=+6194.530207441" observedRunningTime="2025-11-22 10:05:21.669148936 +0000 UTC m=+6195.375509988" watchObservedRunningTime="2025-11-22 10:05:21.678042031 +0000 UTC m=+6195.384403083" Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.727676 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf" podStartSLOduration=3.683565578 podStartE2EDuration="13.727653847s" podCreationTimestamp="2025-11-22 10:05:08 +0000 UTC" firstStartedPulling="2025-11-22 10:05:10.737780364 +0000 UTC m=+6184.444141416" lastFinishedPulling="2025-11-22 10:05:20.781868613 +0000 UTC m=+6194.488229685" observedRunningTime="2025-11-22 10:05:21.715383064 +0000 UTC m=+6195.421744116" watchObservedRunningTime="2025-11-22 10:05:21.727653847 +0000 UTC m=+6195.434014899" Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.779793 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-gdwmk" podStartSLOduration=2.775426126 podStartE2EDuration="12.779748433s" podCreationTimestamp="2025-11-22 10:05:09 +0000 UTC" firstStartedPulling="2025-11-22 10:05:10.761801904 +0000 UTC m=+6184.468162956" lastFinishedPulling="2025-11-22 10:05:20.766124211 +0000 UTC m=+6194.472485263" observedRunningTime="2025-11-22 10:05:21.743570544 +0000 UTC m=+6195.449931596" watchObservedRunningTime="2025-11-22 10:05:21.779748433 +0000 UTC m=+6195.486109485" Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.781604 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-7htfs" podStartSLOduration=3.761022953 podStartE2EDuration="13.781593086s" podCreationTimestamp="2025-11-22 10:05:08 +0000 UTC" firstStartedPulling="2025-11-22 10:05:10.76234114 +0000 UTC m=+6184.468702192" lastFinishedPulling="2025-11-22 10:05:20.782911263 +0000 UTC m=+6194.489272325" observedRunningTime="2025-11-22 10:05:21.767354137 +0000 UTC m=+6195.473715199" watchObservedRunningTime="2025-11-22 10:05:21.781593086 +0000 UTC m=+6195.487954138" Nov 22 10:05:21 crc kubenswrapper[4743]: I1122 10:05:21.802916 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-mm5nj" podStartSLOduration=2.492769696 podStartE2EDuration="12.802882808s" podCreationTimestamp="2025-11-22 10:05:09 +0000 UTC" firstStartedPulling="2025-11-22 10:05:10.738588797 +0000 UTC m=+6184.444949859" lastFinishedPulling="2025-11-22 10:05:21.048701919 +0000 UTC m=+6194.755062971" observedRunningTime="2025-11-22 10:05:21.79944923 +0000 UTC m=+6195.505810282" watchObservedRunningTime="2025-11-22 10:05:21.802882808 +0000 UTC m=+6195.509243860" Nov 22 10:05:25 crc kubenswrapper[4743]: I1122 10:05:25.036889 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-7wnxw"] Nov 22 10:05:25 crc kubenswrapper[4743]: I1122 10:05:25.046504 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-7wnxw"] Nov 22 10:05:25 crc kubenswrapper[4743]: I1122 10:05:25.181552 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd1bcc9-26a2-493f-be9e-30dfb052cbc3" path="/var/lib/kubelet/pods/9fd1bcc9-26a2-493f-be9e-30dfb052cbc3/volumes" Nov 22 10:05:29 crc kubenswrapper[4743]: I1122 10:05:29.098227 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s7x4g"] Nov 22 10:05:29 crc kubenswrapper[4743]: I1122 10:05:29.133431 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-s7x4g"] Nov 22 10:05:29 crc kubenswrapper[4743]: I1122 10:05:29.167055 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ccec00-e345-40b9-a606-aa72c0d64b8b" path="/var/lib/kubelet/pods/19ccec00-e345-40b9-a606-aa72c0d64b8b/volumes" Nov 22 10:05:29 crc kubenswrapper[4743]: I1122 10:05:29.673906 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-gdwmk" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.148182 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.148916 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" containerName="openstackclient" containerID="cri-o://dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337" gracePeriod=2 Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.169332 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.208246 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Nov 22 10:05:32 crc kubenswrapper[4743]: E1122 10:05:32.208906 4743 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" containerName="openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.208989 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" containerName="openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.209264 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" containerName="openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.210004 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.224488 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" podUID="ab061051-d1a4-4a0d-bd76-00fdb28c7a13" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.228726 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.307227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ab061051-d1a4-4a0d-bd76-00fdb28c7a13-openstack-config\") pod \"openstackclient\" (UID: \"ab061051-d1a4-4a0d-bd76-00fdb28c7a13\") " pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.307356 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvnjp\" (UniqueName: \"kubernetes.io/projected/ab061051-d1a4-4a0d-bd76-00fdb28c7a13-kube-api-access-xvnjp\") pod \"openstackclient\" (UID: \"ab061051-d1a4-4a0d-bd76-00fdb28c7a13\") " pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.307526 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ab061051-d1a4-4a0d-bd76-00fdb28c7a13-openstack-config-secret\") pod \"openstackclient\" (UID: \"ab061051-d1a4-4a0d-bd76-00fdb28c7a13\") " pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.409802 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ab061051-d1a4-4a0d-bd76-00fdb28c7a13-openstack-config-secret\") pod \"openstackclient\" (UID: \"ab061051-d1a4-4a0d-bd76-00fdb28c7a13\") " pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.410119 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ab061051-d1a4-4a0d-bd76-00fdb28c7a13-openstack-config\") pod \"openstackclient\" (UID: \"ab061051-d1a4-4a0d-bd76-00fdb28c7a13\") " pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.410336 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvnjp\" (UniqueName: \"kubernetes.io/projected/ab061051-d1a4-4a0d-bd76-00fdb28c7a13-kube-api-access-xvnjp\") pod \"openstackclient\" (UID: \"ab061051-d1a4-4a0d-bd76-00fdb28c7a13\") " pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.411503 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/ab061051-d1a4-4a0d-bd76-00fdb28c7a13-openstack-config\") pod \"openstackclient\" (UID: \"ab061051-d1a4-4a0d-bd76-00fdb28c7a13\") " pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.433171 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ab061051-d1a4-4a0d-bd76-00fdb28c7a13-openstack-config-secret\") pod \"openstackclient\" (UID: \"ab061051-d1a4-4a0d-bd76-00fdb28c7a13\") " pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.451315 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvnjp\" (UniqueName: \"kubernetes.io/projected/ab061051-d1a4-4a0d-bd76-00fdb28c7a13-kube-api-access-xvnjp\") pod \"openstackclient\" (UID: \"ab061051-d1a4-4a0d-bd76-00fdb28c7a13\") " pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.508719 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.510286 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.521384 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.527751 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-n97b9" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.540030 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.614877 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkcmk\" (UniqueName: \"kubernetes.io/projected/04f1df88-8ca3-4284-b010-0c09b8acde5f-kube-api-access-kkcmk\") pod \"kube-state-metrics-0\" (UID: \"04f1df88-8ca3-4284-b010-0c09b8acde5f\") " pod="openstack/kube-state-metrics-0" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.717013 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkcmk\" (UniqueName: \"kubernetes.io/projected/04f1df88-8ca3-4284-b010-0c09b8acde5f-kube-api-access-kkcmk\") pod \"kube-state-metrics-0\" (UID: \"04f1df88-8ca3-4284-b010-0c09b8acde5f\") " pod="openstack/kube-state-metrics-0" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.771618 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkcmk\" (UniqueName: \"kubernetes.io/projected/04f1df88-8ca3-4284-b010-0c09b8acde5f-kube-api-access-kkcmk\") pod \"kube-state-metrics-0\" (UID: \"04f1df88-8ca3-4284-b010-0c09b8acde5f\") " pod="openstack/kube-state-metrics-0" Nov 22 10:05:32 crc kubenswrapper[4743]: I1122 10:05:32.888281 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.504507 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.519379 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.525298 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-mv4bf" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.525542 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.525708 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.525855 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.525978 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.555566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8fb2090e-f38d-4934-99d3-8756dc9552f2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.555674 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8fb2090e-f38d-4934-99d3-8756dc9552f2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.555712 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8fb2090e-f38d-4934-99d3-8756dc9552f2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.555726 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8fb2090e-f38d-4934-99d3-8756dc9552f2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.555778 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfbwg\" (UniqueName: \"kubernetes.io/projected/8fb2090e-f38d-4934-99d3-8756dc9552f2-kube-api-access-lfbwg\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.555877 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/8fb2090e-f38d-4934-99d3-8756dc9552f2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.555934 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8fb2090e-f38d-4934-99d3-8756dc9552f2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.563609 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.659437 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.660652 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/8fb2090e-f38d-4934-99d3-8756dc9552f2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.660724 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8fb2090e-f38d-4934-99d3-8756dc9552f2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.660960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8fb2090e-f38d-4934-99d3-8756dc9552f2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.661010 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8fb2090e-f38d-4934-99d3-8756dc9552f2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.661036 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8fb2090e-f38d-4934-99d3-8756dc9552f2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.661051 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8fb2090e-f38d-4934-99d3-8756dc9552f2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.661081 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfbwg\" (UniqueName: \"kubernetes.io/projected/8fb2090e-f38d-4934-99d3-8756dc9552f2-kube-api-access-lfbwg\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.661188 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/8fb2090e-f38d-4934-99d3-8756dc9552f2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.668318 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8fb2090e-f38d-4934-99d3-8756dc9552f2-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.685941 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8fb2090e-f38d-4934-99d3-8756dc9552f2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.694873 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8fb2090e-f38d-4934-99d3-8756dc9552f2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.695744 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8fb2090e-f38d-4934-99d3-8756dc9552f2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.696017 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfbwg\" (UniqueName: \"kubernetes.io/projected/8fb2090e-f38d-4934-99d3-8756dc9552f2-kube-api-access-lfbwg\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.705419 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8fb2090e-f38d-4934-99d3-8756dc9552f2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"8fb2090e-f38d-4934-99d3-8756dc9552f2\") " pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.802343 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ab061051-d1a4-4a0d-bd76-00fdb28c7a13","Type":"ContainerStarted","Data":"141294a1572ada7a692051759569e19e03be53af70bc46ef954888ca07171e6c"} Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.848026 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.865085 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.968618 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.972106 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.975813 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.975990 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.976175 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.976335 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-6zrb2" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.976893 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.977528 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Nov 22 10:05:33 crc kubenswrapper[4743]: I1122 10:05:33.997892 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.083877 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d45bcf1f-df0a-4470-acd9-62a70715936e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.084379 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d45bcf1f-df0a-4470-acd9-62a70715936e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.084435 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-22092a22-2bda-4390-bec6-1b7298236800\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22092a22-2bda-4390-bec6-1b7298236800\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.084461 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d45bcf1f-df0a-4470-acd9-62a70715936e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.084480 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d45bcf1f-df0a-4470-acd9-62a70715936e-config\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.084522 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d45bcf1f-df0a-4470-acd9-62a70715936e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.084560 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zggz\" (UniqueName: \"kubernetes.io/projected/d45bcf1f-df0a-4470-acd9-62a70715936e-kube-api-access-5zggz\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.084636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d45bcf1f-df0a-4470-acd9-62a70715936e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.186182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d45bcf1f-df0a-4470-acd9-62a70715936e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.186233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d45bcf1f-df0a-4470-acd9-62a70715936e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.186285 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-22092a22-2bda-4390-bec6-1b7298236800\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22092a22-2bda-4390-bec6-1b7298236800\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.186305 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d45bcf1f-df0a-4470-acd9-62a70715936e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.186323 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d45bcf1f-df0a-4470-acd9-62a70715936e-config\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.186356 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d45bcf1f-df0a-4470-acd9-62a70715936e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " 
pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.186383 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zggz\" (UniqueName: \"kubernetes.io/projected/d45bcf1f-df0a-4470-acd9-62a70715936e-kube-api-access-5zggz\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.186425 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d45bcf1f-df0a-4470-acd9-62a70715936e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.187985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d45bcf1f-df0a-4470-acd9-62a70715936e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.190383 4743 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.190420 4743 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-22092a22-2bda-4390-bec6-1b7298236800\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22092a22-2bda-4390-bec6-1b7298236800\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a94cd82a0ef84b427745ddf7c65a4a73d0f210fd07b69e232b3b0850834c4c9a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.194548 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d45bcf1f-df0a-4470-acd9-62a70715936e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.195027 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d45bcf1f-df0a-4470-acd9-62a70715936e-config\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.205501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d45bcf1f-df0a-4470-acd9-62a70715936e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.206177 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d45bcf1f-df0a-4470-acd9-62a70715936e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " 
pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.210274 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d45bcf1f-df0a-4470-acd9-62a70715936e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.223250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zggz\" (UniqueName: \"kubernetes.io/projected/d45bcf1f-df0a-4470-acd9-62a70715936e-kube-api-access-5zggz\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.295823 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-22092a22-2bda-4390-bec6-1b7298236800\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-22092a22-2bda-4390-bec6-1b7298236800\") pod \"prometheus-metric-storage-0\" (UID: \"d45bcf1f-df0a-4470-acd9-62a70715936e\") " pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.303205 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.517874 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Nov 22 10:05:34 crc kubenswrapper[4743]: W1122 10:05:34.546392 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fb2090e_f38d_4934_99d3_8756dc9552f2.slice/crio-efc996a023ff22daba4106b887d40adf9e3e158155ebb300672ffb7e9a508c47 WatchSource:0}: Error finding container efc996a023ff22daba4106b887d40adf9e3e158155ebb300672ffb7e9a508c47: Status 404 returned error can't find the container with id efc996a023ff22daba4106b887d40adf9e3e158155ebb300672ffb7e9a508c47 Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.646315 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.695325 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2vfv\" (UniqueName: \"kubernetes.io/projected/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-kube-api-access-h2vfv\") pod \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.695447 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config\") pod \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.695559 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config-secret\") pod \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\" (UID: \"740fdd50-f1ff-4415-b473-a5e4a86f2e5a\") " Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.710228 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-kube-api-access-h2vfv" (OuterVolumeSpecName: "kube-api-access-h2vfv") pod "740fdd50-f1ff-4415-b473-a5e4a86f2e5a" (UID: "740fdd50-f1ff-4415-b473-a5e4a86f2e5a"). InnerVolumeSpecName "kube-api-access-h2vfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.798389 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "740fdd50-f1ff-4415-b473-a5e4a86f2e5a" (UID: "740fdd50-f1ff-4415-b473-a5e4a86f2e5a"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.800092 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config\") on node \"crc\" DevicePath \"\"" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.800115 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2vfv\" (UniqueName: \"kubernetes.io/projected/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-kube-api-access-h2vfv\") on node \"crc\" DevicePath \"\"" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.825291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8fb2090e-f38d-4934-99d3-8756dc9552f2","Type":"ContainerStarted","Data":"efc996a023ff22daba4106b887d40adf9e3e158155ebb300672ffb7e9a508c47"} Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.828903 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "740fdd50-f1ff-4415-b473-a5e4a86f2e5a" (UID: "740fdd50-f1ff-4415-b473-a5e4a86f2e5a"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.830322 4743 generic.go:334] "Generic (PLEG): container finished" podID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" containerID="dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337" exitCode=137 Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.830388 4743 scope.go:117] "RemoveContainer" containerID="dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.830397 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.832780 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"04f1df88-8ca3-4284-b010-0c09b8acde5f","Type":"ContainerStarted","Data":"86baae1ad75a935aa48673598100e226a9d1c10a9628d418e35299af97f5b13f"} Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.832817 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"04f1df88-8ca3-4284-b010-0c09b8acde5f","Type":"ContainerStarted","Data":"ea9b9d47924a3f005abc07522ef12bd67d6614f3161c50e22f606dd10e23913b"} Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.832871 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.834422 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ab061051-d1a4-4a0d-bd76-00fdb28c7a13","Type":"ContainerStarted","Data":"c1f144e3235035b401dcc940a29698564313567fe7a8fc9683de958e1d472b53"} Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.885646 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.418051477 podStartE2EDuration="2.885623251s" podCreationTimestamp="2025-11-22 10:05:32 +0000 UTC" firstStartedPulling="2025-11-22 10:05:33.888917014 +0000 UTC m=+6207.595278066" lastFinishedPulling="2025-11-22 10:05:34.356488788 +0000 UTC m=+6208.062849840" observedRunningTime="2025-11-22 10:05:34.874266815 +0000 UTC m=+6208.580627867" watchObservedRunningTime="2025-11-22 10:05:34.885623251 +0000 UTC m=+6208.591984303" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.895502 4743 scope.go:117] "RemoveContainer" containerID="dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337" Nov 22 10:05:34 crc kubenswrapper[4743]: E1122 10:05:34.899251 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337\": container with ID starting with dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337 not found: ID does not exist" containerID="dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.899296 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337"} err="failed to get container status \"dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337\": rpc error: code = NotFound desc = could not find container \"dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337\": container with ID starting with 
dd8c5a776c3590a05044d06068e5b8e6ea72b0aa0bb45412d1e475bddcbb2337 not found: ID does not exist" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.900835 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.900813438 podStartE2EDuration="2.900813438s" podCreationTimestamp="2025-11-22 10:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:05:34.899550411 +0000 UTC m=+6208.605911453" watchObservedRunningTime="2025-11-22 10:05:34.900813438 +0000 UTC m=+6208.607174490" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.901945 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/740fdd50-f1ff-4415-b473-a5e4a86f2e5a-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.902833 4743 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" podUID="ab061051-d1a4-4a0d-bd76-00fdb28c7a13" Nov 22 10:05:34 crc kubenswrapper[4743]: I1122 10:05:34.966075 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.048648 4743 scope.go:117] "RemoveContainer" containerID="c0f8d107b3c1b021be89ff7af4d768f3502eb72eecc8a9d1f23650309fc5de0e" Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.077082 4743 scope.go:117] "RemoveContainer" containerID="72c0a61b882e4f36382e61128f4e9a686449c486cd9a4f07bcb9c3a44a1f62ab" Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.164186 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="740fdd50-f1ff-4415-b473-a5e4a86f2e5a" path="/var/lib/kubelet/pods/740fdd50-f1ff-4415-b473-a5e4a86f2e5a/volumes" Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.188118 4743 scope.go:117] "RemoveContainer" containerID="24dc3fe94b7d6a358c33cf32e7da0a9a6c3ae5d2e3a6f83875c0e7cf16598a6b" Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.244671 4743 scope.go:117] "RemoveContainer" containerID="8be01ab90b71902b48d522f7659269920a4bf9b3e4878b40405b422ea3d7929b" Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.301626 4743 scope.go:117] "RemoveContainer" containerID="f3ca7a23f6d5f7ad78bca3a8b38f4b6ef1385443b64340f3c33783956342f196" Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.331298 4743 scope.go:117] "RemoveContainer" containerID="ddeb9b9c8454225ffb28236314a67faa3d8a1cf5d1639436f6cc528e201a4f8e" Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.379950 4743 scope.go:117] "RemoveContainer" containerID="c232c18607f5202278acb57d1d051f46226a915c40cee6e8b85c62e948110e06" Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.403809 4743 scope.go:117] "RemoveContainer" containerID="5a2ee31798cb93bde59c4247b40c33bb401fc8c2048d9730c047ee8256c1df01" Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.431176 4743 scope.go:117] "RemoveContainer" containerID="78c54715e656a40bd0ec04c4c311853dd98402a2f860139b315dc3f0fce6f086" Nov 22 10:05:35 crc kubenswrapper[4743]: I1122 10:05:35.856030 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d45bcf1f-df0a-4470-acd9-62a70715936e","Type":"ContainerStarted","Data":"582c5164c5c2acd0207f31fb3996197b66994f16d5408fd23babf78fcaf02d3b"} 
Nov 22 10:05:41 crc kubenswrapper[4743]: I1122 10:05:41.266845 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d45bcf1f-df0a-4470-acd9-62a70715936e","Type":"ContainerStarted","Data":"4dc3ab900d9b87bce884c812612a9e40c103f47897ada74e3986115535a51f05"} Nov 22 10:05:41 crc kubenswrapper[4743]: I1122 10:05:41.269982 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8fb2090e-f38d-4934-99d3-8756dc9552f2","Type":"ContainerStarted","Data":"73394ba91a5291553621a31413dedd6522b1b2d28e3449c4b7196ad320a7d16e"} Nov 22 10:05:42 crc kubenswrapper[4743]: I1122 10:05:42.892777 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Nov 22 10:05:46 crc kubenswrapper[4743]: I1122 10:05:46.323060 4743 generic.go:334] "Generic (PLEG): container finished" podID="8fb2090e-f38d-4934-99d3-8756dc9552f2" containerID="73394ba91a5291553621a31413dedd6522b1b2d28e3449c4b7196ad320a7d16e" exitCode=0 Nov 22 10:05:46 crc kubenswrapper[4743]: I1122 10:05:46.323636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8fb2090e-f38d-4934-99d3-8756dc9552f2","Type":"ContainerDied","Data":"73394ba91a5291553621a31413dedd6522b1b2d28e3449c4b7196ad320a7d16e"} Nov 22 10:05:46 crc kubenswrapper[4743]: E1122 10:05:46.332936 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fb2090e_f38d_4934_99d3_8756dc9552f2.slice/crio-conmon-73394ba91a5291553621a31413dedd6522b1b2d28e3449c4b7196ad320a7d16e.scope\": RecentStats: unable to find data in memory cache]" Nov 22 10:05:47 crc kubenswrapper[4743]: I1122 10:05:47.334202 4743 generic.go:334] "Generic (PLEG): container finished" podID="d45bcf1f-df0a-4470-acd9-62a70715936e" containerID="4dc3ab900d9b87bce884c812612a9e40c103f47897ada74e3986115535a51f05" exitCode=0 Nov 22 10:05:47 crc kubenswrapper[4743]: I1122 10:05:47.334269 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d45bcf1f-df0a-4470-acd9-62a70715936e","Type":"ContainerDied","Data":"4dc3ab900d9b87bce884c812612a9e40c103f47897ada74e3986115535a51f05"} Nov 22 10:05:48 crc kubenswrapper[4743]: I1122 10:05:48.035946 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-6ps4b"] Nov 22 10:05:48 crc kubenswrapper[4743]: I1122 10:05:48.044398 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-6ps4b"] Nov 22 10:05:49 crc kubenswrapper[4743]: I1122 10:05:49.164156 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2b2d2d-594a-4b08-baf1-d021c012f86a" path="/var/lib/kubelet/pods/ba2b2d2d-594a-4b08-baf1-d021c012f86a/volumes" Nov 22 10:05:49 crc kubenswrapper[4743]: I1122 10:05:49.358413 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"8fb2090e-f38d-4934-99d3-8756dc9552f2","Type":"ContainerStarted","Data":"8a349d284a1b1843767fbef3f819bc6dd249b1248181745dbaeeaa2b3b456984"} Nov 22 10:05:53 crc kubenswrapper[4743]: I1122 10:05:53.395556 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"8fb2090e-f38d-4934-99d3-8756dc9552f2","Type":"ContainerStarted","Data":"db5ab2c1f5b9fe578ad1374c25c23f70bddd03733237ab1cd0e192dcffe46e3b"} Nov 22 10:05:53 crc kubenswrapper[4743]: I1122 10:05:53.396480 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:53 crc kubenswrapper[4743]: I1122 10:05:53.398692 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d45bcf1f-df0a-4470-acd9-62a70715936e","Type":"ContainerStarted","Data":"5b96f8f5ecddf58479694e0c278c93b4a278a89681a2c909d26b04af1a12337b"} Nov 22 10:05:53 crc kubenswrapper[4743]: I1122 10:05:53.399221 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Nov 22 10:05:53 crc kubenswrapper[4743]: I1122 10:05:53.434532 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.9315136840000005 podStartE2EDuration="20.434513664s" podCreationTimestamp="2025-11-22 10:05:33 +0000 UTC" firstStartedPulling="2025-11-22 10:05:34.555711642 +0000 UTC m=+6208.262072694" lastFinishedPulling="2025-11-22 10:05:49.058711622 +0000 UTC m=+6222.765072674" observedRunningTime="2025-11-22 10:05:53.422483559 +0000 UTC m=+6227.128844611" watchObservedRunningTime="2025-11-22 10:05:53.434513664 +0000 UTC m=+6227.140874716" Nov 22 10:05:57 crc kubenswrapper[4743]: I1122 10:05:57.434065 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d45bcf1f-df0a-4470-acd9-62a70715936e","Type":"ContainerStarted","Data":"24b327d47922df3ab746af76c5ec9e3b6527b3c0afc12c4cf981f295b63d6121"} Nov 22 10:06:01 crc kubenswrapper[4743]: I1122 10:06:01.240976 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:06:01 crc kubenswrapper[4743]: I1122 10:06:01.241341 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:06:01 crc kubenswrapper[4743]: I1122 10:06:01.484434 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d45bcf1f-df0a-4470-acd9-62a70715936e","Type":"ContainerStarted","Data":"f091a274416f401f8aeeb8ba4f29691d902ea72551ab6bdaa585e0e9d50992dd"} Nov 22 10:06:01 crc kubenswrapper[4743]: I1122 10:06:01.507807 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.026519721 podStartE2EDuration="29.50778966s" podCreationTimestamp="2025-11-22 10:05:32 +0000 UTC" firstStartedPulling="2025-11-22 10:05:34.978548111 +0000 UTC m=+6208.684909163" lastFinishedPulling="2025-11-22 10:06:00.45981805 +0000 UTC m=+6234.166179102" observedRunningTime="2025-11-22 10:06:01.503688792 +0000 UTC m=+6235.210049864" watchObservedRunningTime="2025-11-22 10:06:01.50778966 +0000 UTC m=+6235.214150712" Nov 22 10:06:04 crc kubenswrapper[4743]: I1122 10:06:04.304191 4743 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Nov 22 10:06:04 crc kubenswrapper[4743]: I1122 10:06:04.304724 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Nov 22 10:06:04 crc kubenswrapper[4743]: I1122 10:06:04.306668 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Nov 22 10:06:04 crc kubenswrapper[4743]: I1122 10:06:04.517346 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Nov 22 10:06:07 crc kubenswrapper[4743]: I1122 10:06:07.892066 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:06:07 crc kubenswrapper[4743]: I1122 10:06:07.895296 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:06:07 crc kubenswrapper[4743]: I1122 10:06:07.897539 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 10:06:07 crc kubenswrapper[4743]: I1122 10:06:07.897677 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 10:06:07 crc kubenswrapper[4743]: I1122 10:06:07.903232 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.001080 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.001146 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvqcd\" (UniqueName: \"kubernetes.io/projected/801c1bcb-23de-4bf4-b1b0-774a6188c187-kube-api-access-jvqcd\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.001180 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.001237 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-log-httpd\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.001360 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-scripts\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.001408 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-run-httpd\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.001514 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-config-data\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.103229 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-scripts\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.103314 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-run-httpd\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.103345 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-config-data\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.103426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.103456 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvqcd\" (UniqueName: \"kubernetes.io/projected/801c1bcb-23de-4bf4-b1b0-774a6188c187-kube-api-access-jvqcd\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.103478 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.103511 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-log-httpd\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.103903 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-run-httpd\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.103971 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-log-httpd\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.112418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-scripts\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.113270 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.113772 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-config-data\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.120329 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.126657 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvqcd\" (UniqueName: \"kubernetes.io/projected/801c1bcb-23de-4bf4-b1b0-774a6188c187-kube-api-access-jvqcd\") pod \"ceilometer-0\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.231824 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:06:08 crc kubenswrapper[4743]: I1122 10:06:08.788705 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:06:08 crc kubenswrapper[4743]: W1122 10:06:08.802181 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801c1bcb_23de_4bf4_b1b0_774a6188c187.slice/crio-7af5ee585882f63f5814c5bd08415ab90153e59737739a9c9ef3840a2e2de9bc WatchSource:0}: Error finding container 7af5ee585882f63f5814c5bd08415ab90153e59737739a9c9ef3840a2e2de9bc: Status 404 returned error can't find the container with id 7af5ee585882f63f5814c5bd08415ab90153e59737739a9c9ef3840a2e2de9bc Nov 22 10:06:09 crc kubenswrapper[4743]: I1122 10:06:09.587676 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801c1bcb-23de-4bf4-b1b0-774a6188c187","Type":"ContainerStarted","Data":"b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3"} Nov 22 10:06:09 crc kubenswrapper[4743]: I1122 10:06:09.588495 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801c1bcb-23de-4bf4-b1b0-774a6188c187","Type":"ContainerStarted","Data":"7af5ee585882f63f5814c5bd08415ab90153e59737739a9c9ef3840a2e2de9bc"} Nov 22 10:06:10 crc kubenswrapper[4743]: I1122 10:06:10.602988 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801c1bcb-23de-4bf4-b1b0-774a6188c187","Type":"ContainerStarted","Data":"7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19"} Nov 22 10:06:11 crc kubenswrapper[4743]: I1122 10:06:11.617015 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801c1bcb-23de-4bf4-b1b0-774a6188c187","Type":"ContainerStarted","Data":"5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e"} Nov 22 10:06:13 crc kubenswrapper[4743]: I1122 10:06:13.634893 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801c1bcb-23de-4bf4-b1b0-774a6188c187","Type":"ContainerStarted","Data":"4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b"} Nov 22 10:06:13 crc kubenswrapper[4743]: I1122 10:06:13.635479 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 10:06:13 crc kubenswrapper[4743]: I1122 10:06:13.658283 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.98082541 podStartE2EDuration="6.658261767s" podCreationTimestamp="2025-11-22 10:06:07 +0000 UTC" firstStartedPulling="2025-11-22 10:06:08.806270134 +0000 UTC m=+6242.512631186" lastFinishedPulling="2025-11-22 10:06:12.483706491 +0000 UTC m=+6246.190067543" observedRunningTime="2025-11-22 10:06:13.653035907 +0000 UTC m=+6247.359396989" watchObservedRunningTime="2025-11-22 10:06:13.658261767 +0000 UTC m=+6247.364622819" Nov 22 10:06:18 crc kubenswrapper[4743]: I1122 10:06:18.879798 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-dz28f"] Nov 22 10:06:18 crc kubenswrapper[4743]: I1122 10:06:18.881623 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-dz28f" Nov 22 10:06:18 crc kubenswrapper[4743]: I1122 10:06:18.917125 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-dz28f"] Nov 22 10:06:18 crc kubenswrapper[4743]: I1122 10:06:18.957869 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sjfv\" (UniqueName: \"kubernetes.io/projected/d1b02d6b-89a6-4119-96ff-0a80cb68e437-kube-api-access-2sjfv\") pod \"aodh-db-create-dz28f\" (UID: \"d1b02d6b-89a6-4119-96ff-0a80cb68e437\") " pod="openstack/aodh-db-create-dz28f" Nov 22 10:06:18 crc kubenswrapper[4743]: I1122 10:06:18.957982 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b02d6b-89a6-4119-96ff-0a80cb68e437-operator-scripts\") pod \"aodh-db-create-dz28f\" (UID: \"d1b02d6b-89a6-4119-96ff-0a80cb68e437\") " pod="openstack/aodh-db-create-dz28f" Nov 22 10:06:18 crc kubenswrapper[4743]: I1122 10:06:18.988022 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-78bc-account-create-tfdlp"] Nov 22 10:06:18 crc kubenswrapper[4743]: I1122 10:06:18.989641 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-78bc-account-create-tfdlp" Nov 22 10:06:18 crc kubenswrapper[4743]: I1122 10:06:18.992032 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Nov 22 10:06:18 crc kubenswrapper[4743]: I1122 10:06:18.999518 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-78bc-account-create-tfdlp"] Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.059654 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sjfv\" (UniqueName: \"kubernetes.io/projected/d1b02d6b-89a6-4119-96ff-0a80cb68e437-kube-api-access-2sjfv\") pod \"aodh-db-create-dz28f\" (UID: \"d1b02d6b-89a6-4119-96ff-0a80cb68e437\") " pod="openstack/aodh-db-create-dz28f" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.059726 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkgsx\" (UniqueName: \"kubernetes.io/projected/727ee333-0f62-4dc1-acc9-2f03f1b9269c-kube-api-access-wkgsx\") pod \"aodh-78bc-account-create-tfdlp\" (UID: \"727ee333-0f62-4dc1-acc9-2f03f1b9269c\") " pod="openstack/aodh-78bc-account-create-tfdlp" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.059768 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/727ee333-0f62-4dc1-acc9-2f03f1b9269c-operator-scripts\") pod \"aodh-78bc-account-create-tfdlp\" (UID: \"727ee333-0f62-4dc1-acc9-2f03f1b9269c\") " pod="openstack/aodh-78bc-account-create-tfdlp" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.059816 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b02d6b-89a6-4119-96ff-0a80cb68e437-operator-scripts\") pod \"aodh-db-create-dz28f\" (UID: \"d1b02d6b-89a6-4119-96ff-0a80cb68e437\") " pod="openstack/aodh-db-create-dz28f" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.060492 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b02d6b-89a6-4119-96ff-0a80cb68e437-operator-scripts\") 
pod \"aodh-db-create-dz28f\" (UID: \"d1b02d6b-89a6-4119-96ff-0a80cb68e437\") " pod="openstack/aodh-db-create-dz28f" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.080670 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sjfv\" (UniqueName: \"kubernetes.io/projected/d1b02d6b-89a6-4119-96ff-0a80cb68e437-kube-api-access-2sjfv\") pod \"aodh-db-create-dz28f\" (UID: \"d1b02d6b-89a6-4119-96ff-0a80cb68e437\") " pod="openstack/aodh-db-create-dz28f" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.161792 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkgsx\" (UniqueName: \"kubernetes.io/projected/727ee333-0f62-4dc1-acc9-2f03f1b9269c-kube-api-access-wkgsx\") pod \"aodh-78bc-account-create-tfdlp\" (UID: \"727ee333-0f62-4dc1-acc9-2f03f1b9269c\") " pod="openstack/aodh-78bc-account-create-tfdlp" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.162121 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/727ee333-0f62-4dc1-acc9-2f03f1b9269c-operator-scripts\") pod \"aodh-78bc-account-create-tfdlp\" (UID: \"727ee333-0f62-4dc1-acc9-2f03f1b9269c\") " pod="openstack/aodh-78bc-account-create-tfdlp" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.162895 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/727ee333-0f62-4dc1-acc9-2f03f1b9269c-operator-scripts\") pod \"aodh-78bc-account-create-tfdlp\" (UID: \"727ee333-0f62-4dc1-acc9-2f03f1b9269c\") " pod="openstack/aodh-78bc-account-create-tfdlp" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.178479 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkgsx\" (UniqueName: \"kubernetes.io/projected/727ee333-0f62-4dc1-acc9-2f03f1b9269c-kube-api-access-wkgsx\") pod \"aodh-78bc-account-create-tfdlp\" (UID: \"727ee333-0f62-4dc1-acc9-2f03f1b9269c\") " pod="openstack/aodh-78bc-account-create-tfdlp" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.202332 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-dz28f" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.309333 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-78bc-account-create-tfdlp" Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.697841 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-dz28f"] Nov 22 10:06:19 crc kubenswrapper[4743]: W1122 10:06:19.705618 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1b02d6b_89a6_4119_96ff_0a80cb68e437.slice/crio-619a47cf67f826fcff0740c8c1a7df92dca45759a1256443f795581398cf61a0 WatchSource:0}: Error finding container 619a47cf67f826fcff0740c8c1a7df92dca45759a1256443f795581398cf61a0: Status 404 returned error can't find the container with id 619a47cf67f826fcff0740c8c1a7df92dca45759a1256443f795581398cf61a0 Nov 22 10:06:19 crc kubenswrapper[4743]: I1122 10:06:19.843167 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-78bc-account-create-tfdlp"] Nov 22 10:06:19 crc kubenswrapper[4743]: W1122 10:06:19.851305 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod727ee333_0f62_4dc1_acc9_2f03f1b9269c.slice/crio-e2e43b5d998d63ce761f5ce9e456554c11b35be82aff63304750c7d4aed9d4ff WatchSource:0}: Error finding container e2e43b5d998d63ce761f5ce9e456554c11b35be82aff63304750c7d4aed9d4ff: Status 404 returned error can't find the container with id e2e43b5d998d63ce761f5ce9e456554c11b35be82aff63304750c7d4aed9d4ff Nov 22 10:06:20 crc kubenswrapper[4743]: I1122 10:06:20.721943 4743 generic.go:334] "Generic (PLEG): container finished" podID="d1b02d6b-89a6-4119-96ff-0a80cb68e437" containerID="6f247fd94cfb097d1a6bb0ab87800adfede67fd65b637dc2238e5e9759fec00c" exitCode=0 Nov 22 10:06:20 crc kubenswrapper[4743]: I1122 10:06:20.722117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-dz28f" event={"ID":"d1b02d6b-89a6-4119-96ff-0a80cb68e437","Type":"ContainerDied","Data":"6f247fd94cfb097d1a6bb0ab87800adfede67fd65b637dc2238e5e9759fec00c"} Nov 22 10:06:20 crc kubenswrapper[4743]: I1122 10:06:20.723186 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-dz28f" event={"ID":"d1b02d6b-89a6-4119-96ff-0a80cb68e437","Type":"ContainerStarted","Data":"619a47cf67f826fcff0740c8c1a7df92dca45759a1256443f795581398cf61a0"} Nov 22 10:06:20 crc kubenswrapper[4743]: I1122 10:06:20.725381 4743 generic.go:334] "Generic (PLEG): container finished" podID="727ee333-0f62-4dc1-acc9-2f03f1b9269c" containerID="305adb6454c27b78db6c12a014f0fd8a3af5113065c4932ea5b35797429acca6" exitCode=0 Nov 22 10:06:20 crc kubenswrapper[4743]: I1122 10:06:20.725419 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-78bc-account-create-tfdlp" event={"ID":"727ee333-0f62-4dc1-acc9-2f03f1b9269c","Type":"ContainerDied","Data":"305adb6454c27b78db6c12a014f0fd8a3af5113065c4932ea5b35797429acca6"} Nov 22 10:06:20 crc kubenswrapper[4743]: I1122 10:06:20.725550 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-78bc-account-create-tfdlp" event={"ID":"727ee333-0f62-4dc1-acc9-2f03f1b9269c","Type":"ContainerStarted","Data":"e2e43b5d998d63ce761f5ce9e456554c11b35be82aff63304750c7d4aed9d4ff"} Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.204530 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-78bc-account-create-tfdlp" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.211281 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-dz28f" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.342531 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sjfv\" (UniqueName: \"kubernetes.io/projected/d1b02d6b-89a6-4119-96ff-0a80cb68e437-kube-api-access-2sjfv\") pod \"d1b02d6b-89a6-4119-96ff-0a80cb68e437\" (UID: \"d1b02d6b-89a6-4119-96ff-0a80cb68e437\") " Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.342815 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/727ee333-0f62-4dc1-acc9-2f03f1b9269c-operator-scripts\") pod \"727ee333-0f62-4dc1-acc9-2f03f1b9269c\" (UID: \"727ee333-0f62-4dc1-acc9-2f03f1b9269c\") " Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.342959 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkgsx\" (UniqueName: \"kubernetes.io/projected/727ee333-0f62-4dc1-acc9-2f03f1b9269c-kube-api-access-wkgsx\") pod \"727ee333-0f62-4dc1-acc9-2f03f1b9269c\" (UID: \"727ee333-0f62-4dc1-acc9-2f03f1b9269c\") " Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.343079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b02d6b-89a6-4119-96ff-0a80cb68e437-operator-scripts\") pod \"d1b02d6b-89a6-4119-96ff-0a80cb68e437\" (UID: \"d1b02d6b-89a6-4119-96ff-0a80cb68e437\") " Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.343248 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/727ee333-0f62-4dc1-acc9-2f03f1b9269c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "727ee333-0f62-4dc1-acc9-2f03f1b9269c" (UID: "727ee333-0f62-4dc1-acc9-2f03f1b9269c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.343457 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1b02d6b-89a6-4119-96ff-0a80cb68e437-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d1b02d6b-89a6-4119-96ff-0a80cb68e437" (UID: "d1b02d6b-89a6-4119-96ff-0a80cb68e437"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.343755 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1b02d6b-89a6-4119-96ff-0a80cb68e437-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.343778 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/727ee333-0f62-4dc1-acc9-2f03f1b9269c-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.348269 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/727ee333-0f62-4dc1-acc9-2f03f1b9269c-kube-api-access-wkgsx" (OuterVolumeSpecName: "kube-api-access-wkgsx") pod "727ee333-0f62-4dc1-acc9-2f03f1b9269c" (UID: "727ee333-0f62-4dc1-acc9-2f03f1b9269c"). InnerVolumeSpecName "kube-api-access-wkgsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.348365 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b02d6b-89a6-4119-96ff-0a80cb68e437-kube-api-access-2sjfv" (OuterVolumeSpecName: "kube-api-access-2sjfv") pod "d1b02d6b-89a6-4119-96ff-0a80cb68e437" (UID: "d1b02d6b-89a6-4119-96ff-0a80cb68e437"). InnerVolumeSpecName "kube-api-access-2sjfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.449436 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sjfv\" (UniqueName: \"kubernetes.io/projected/d1b02d6b-89a6-4119-96ff-0a80cb68e437-kube-api-access-2sjfv\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.449468 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkgsx\" (UniqueName: \"kubernetes.io/projected/727ee333-0f62-4dc1-acc9-2f03f1b9269c-kube-api-access-wkgsx\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.745317 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-dz28f" event={"ID":"d1b02d6b-89a6-4119-96ff-0a80cb68e437","Type":"ContainerDied","Data":"619a47cf67f826fcff0740c8c1a7df92dca45759a1256443f795581398cf61a0"} Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.745366 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="619a47cf67f826fcff0740c8c1a7df92dca45759a1256443f795581398cf61a0" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.745338 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-dz28f" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.746669 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-78bc-account-create-tfdlp" event={"ID":"727ee333-0f62-4dc1-acc9-2f03f1b9269c","Type":"ContainerDied","Data":"e2e43b5d998d63ce761f5ce9e456554c11b35be82aff63304750c7d4aed9d4ff"} Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.746699 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2e43b5d998d63ce761f5ce9e456554c11b35be82aff63304750c7d4aed9d4ff" Nov 22 10:06:22 crc kubenswrapper[4743]: I1122 10:06:22.746767 4743 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.423845 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-zbhvb"] Nov 22 10:06:24 crc kubenswrapper[4743]: E1122 10:06:24.425057 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b02d6b-89a6-4119-96ff-0a80cb68e437" containerName="mariadb-database-create" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.425075 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b02d6b-89a6-4119-96ff-0a80cb68e437" containerName="mariadb-database-create" Nov 22 10:06:24 crc kubenswrapper[4743]: E1122 10:06:24.425102 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="727ee333-0f62-4dc1-acc9-2f03f1b9269c" containerName="mariadb-account-create" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.425108 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="727ee333-0f62-4dc1-acc9-2f03f1b9269c" containerName="mariadb-account-create" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.425296 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="727ee333-0f62-4dc1-acc9-2f03f1b9269c" containerName="mariadb-account-create" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.425312 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b02d6b-89a6-4119-96ff-0a80cb68e437" containerName="mariadb-database-create" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.426465 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.430101 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.430130 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.430333 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.430343 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5qgft" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.437285 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-zbhvb"] Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.594669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-scripts\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.595026 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-config-data\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.595218 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-combined-ca-bundle\") pod \"aodh-db-sync-zbhvb\" (UID: 
\"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.595390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt8d4\" (UniqueName: \"kubernetes.io/projected/3f172421-21a5-46a3-847f-04f7590e9615-kube-api-access-dt8d4\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.698000 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt8d4\" (UniqueName: \"kubernetes.io/projected/3f172421-21a5-46a3-847f-04f7590e9615-kube-api-access-dt8d4\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.698128 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-scripts\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.698168 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-config-data\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.698222 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-combined-ca-bundle\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.705101 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-combined-ca-bundle\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.726930 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-scripts\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.727683 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-config-data\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.737629 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt8d4\" (UniqueName: \"kubernetes.io/projected/3f172421-21a5-46a3-847f-04f7590e9615-kube-api-access-dt8d4\") pod \"aodh-db-sync-zbhvb\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:24 crc kubenswrapper[4743]: I1122 10:06:24.754743 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:25 crc kubenswrapper[4743]: I1122 10:06:25.327456 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-zbhvb"] Nov 22 10:06:25 crc kubenswrapper[4743]: I1122 10:06:25.840356 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zbhvb" event={"ID":"3f172421-21a5-46a3-847f-04f7590e9615","Type":"ContainerStarted","Data":"97b2b4c69f1535205658ac87b8561c0521909f8033d484276db5cd05271c9421"} Nov 22 10:06:30 crc kubenswrapper[4743]: I1122 10:06:30.903365 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zbhvb" event={"ID":"3f172421-21a5-46a3-847f-04f7590e9615","Type":"ContainerStarted","Data":"4e06356717f4dd0511f005510a94de0df4a596b48645ae695ff638e7e1c00071"} Nov 22 10:06:30 crc kubenswrapper[4743]: I1122 10:06:30.929303 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-zbhvb" podStartSLOduration=2.48586012 podStartE2EDuration="6.929265875s" podCreationTimestamp="2025-11-22 10:06:24 +0000 UTC" firstStartedPulling="2025-11-22 10:06:25.328550269 +0000 UTC m=+6259.034911321" lastFinishedPulling="2025-11-22 10:06:29.771956024 +0000 UTC m=+6263.478317076" observedRunningTime="2025-11-22 10:06:30.918837446 +0000 UTC m=+6264.625198518" watchObservedRunningTime="2025-11-22 10:06:30.929265875 +0000 UTC m=+6264.635626947" Nov 22 10:06:31 crc kubenswrapper[4743]: I1122 10:06:31.055071 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f58b-account-create-gt4nm"] Nov 22 10:06:31 crc kubenswrapper[4743]: I1122 10:06:31.067816 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-9dn66"] Nov 22 10:06:31 crc kubenswrapper[4743]: I1122 10:06:31.075501 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-9dn66"] Nov 22 10:06:31 crc kubenswrapper[4743]: I1122 10:06:31.083800 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f58b-account-create-gt4nm"] Nov 22 10:06:31 crc kubenswrapper[4743]: I1122 10:06:31.162810 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f87462c-1792-4b68-82f0-3bbcb314686d" path="/var/lib/kubelet/pods/8f87462c-1792-4b68-82f0-3bbcb314686d/volumes" Nov 22 10:06:31 crc kubenswrapper[4743]: I1122 10:06:31.164417 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4a2ede5-5abd-4899-a65b-45c0eac279c4" path="/var/lib/kubelet/pods/d4a2ede5-5abd-4899-a65b-45c0eac279c4/volumes" Nov 22 10:06:31 crc kubenswrapper[4743]: I1122 10:06:31.241337 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:06:31 crc kubenswrapper[4743]: I1122 10:06:31.241661 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:06:32 crc kubenswrapper[4743]: I1122 10:06:32.921667 4743 generic.go:334] "Generic (PLEG): container finished" podID="3f172421-21a5-46a3-847f-04f7590e9615" containerID="4e06356717f4dd0511f005510a94de0df4a596b48645ae695ff638e7e1c00071" exitCode=0
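
The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (10:06:30.929265875 - 10:06:24 = 6.929265875s), and podStartSLOduration additionally excludes the image-pull window, lastFinishedPulling - firstStartedPulling = 4.443405755s, leaving 2.48586012s. A small Go check of that arithmetic, with the timestamps copied from the entry (the monotonic m=+ offsets are dropped):

```go
package main

import (
	"fmt"
	"time"
)

func must(t time.Time, err error) time.Time {
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Layout matching the "2025-11-22 10:06:24 +0000 UTC" form in the log.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created := must(time.Parse(layout, "2025-11-22 10:06:24 +0000 UTC"))
	firstPull := must(time.Parse(layout, "2025-11-22 10:06:25.328550269 +0000 UTC"))
	lastPull := must(time.Parse(layout, "2025-11-22 10:06:29.771956024 +0000 UTC"))
	running := must(time.Parse(layout, "2025-11-22 10:06:30.929265875 +0000 UTC"))

	e2e := running.Sub(created)          // podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // minus time spent pulling images
	fmt.Println(e2e, slo)                // 6.929265875s 2.48586012s
}
```

The same relation holds for the aodh-0 and ceilometer-0 startup entries later in the log.
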
containerID="4e06356717f4dd0511f005510a94de0df4a596b48645ae695ff638e7e1c00071" exitCode=0 Nov 22 10:06:32 crc kubenswrapper[4743]: I1122 10:06:32.921768 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zbhvb" event={"ID":"3f172421-21a5-46a3-847f-04f7590e9615","Type":"ContainerDied","Data":"4e06356717f4dd0511f005510a94de0df4a596b48645ae695ff638e7e1c00071"} Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.345057 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.518774 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-scripts\") pod \"3f172421-21a5-46a3-847f-04f7590e9615\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.518869 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt8d4\" (UniqueName: \"kubernetes.io/projected/3f172421-21a5-46a3-847f-04f7590e9615-kube-api-access-dt8d4\") pod \"3f172421-21a5-46a3-847f-04f7590e9615\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.518920 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-combined-ca-bundle\") pod \"3f172421-21a5-46a3-847f-04f7590e9615\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.519001 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-config-data\") pod \"3f172421-21a5-46a3-847f-04f7590e9615\" (UID: \"3f172421-21a5-46a3-847f-04f7590e9615\") " Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.524143 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f172421-21a5-46a3-847f-04f7590e9615-kube-api-access-dt8d4" (OuterVolumeSpecName: "kube-api-access-dt8d4") pod "3f172421-21a5-46a3-847f-04f7590e9615" (UID: "3f172421-21a5-46a3-847f-04f7590e9615"). InnerVolumeSpecName "kube-api-access-dt8d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.524789 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-scripts" (OuterVolumeSpecName: "scripts") pod "3f172421-21a5-46a3-847f-04f7590e9615" (UID: "3f172421-21a5-46a3-847f-04f7590e9615"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.548768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-config-data" (OuterVolumeSpecName: "config-data") pod "3f172421-21a5-46a3-847f-04f7590e9615" (UID: "3f172421-21a5-46a3-847f-04f7590e9615"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.550795 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f172421-21a5-46a3-847f-04f7590e9615" (UID: "3f172421-21a5-46a3-847f-04f7590e9615"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.621964 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt8d4\" (UniqueName: \"kubernetes.io/projected/3f172421-21a5-46a3-847f-04f7590e9615-kube-api-access-dt8d4\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.621998 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.622008 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.622017 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f172421-21a5-46a3-847f-04f7590e9615-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.938295 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-zbhvb" event={"ID":"3f172421-21a5-46a3-847f-04f7590e9615","Type":"ContainerDied","Data":"97b2b4c69f1535205658ac87b8561c0521909f8033d484276db5cd05271c9421"} Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.938331 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97b2b4c69f1535205658ac87b8561c0521909f8033d484276db5cd05271c9421" Nov 22 10:06:34 crc kubenswrapper[4743]: I1122 10:06:34.938373 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-zbhvb" Nov 22 10:06:35 crc kubenswrapper[4743]: I1122 10:06:35.734206 4743 scope.go:117] "RemoveContainer" containerID="41d0b842029ab69023af12098c71b7d6acb9d47dab746e781b16764d3f1d21c1" Nov 22 10:06:35 crc kubenswrapper[4743]: I1122 10:06:35.761134 4743 scope.go:117] "RemoveContainer" containerID="a180f17750eabcbf6c96aaf8eb5febb2fca99436fec7bf3fc6aef77fce152b8f" Nov 22 10:06:35 crc kubenswrapper[4743]: I1122 10:06:35.804997 4743 scope.go:117] "RemoveContainer" containerID="50556b3083165646da7ba173acd2dd8a9feb2e072e3ec9c4233ae40e94b54641" Nov 22 10:06:35 crc kubenswrapper[4743]: I1122 10:06:35.851363 4743 scope.go:117] "RemoveContainer" containerID="9dc1ca414f493e4915e2648211e37abf27205e7a9436de039ad5286bfb54d5a4" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.245800 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.868049 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Nov 22 10:06:38 crc kubenswrapper[4743]: E1122 10:06:38.868512 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f172421-21a5-46a3-847f-04f7590e9615" containerName="aodh-db-sync" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.868529 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f172421-21a5-46a3-847f-04f7590e9615" containerName="aodh-db-sync" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.868749 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f172421-21a5-46a3-847f-04f7590e9615" containerName="aodh-db-sync" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.870573 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.873299 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-5qgft" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.873373 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.873884 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.887275 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.922100 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b402e4ee-d154-408c-9b02-b5966ebda7f1-config-data\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.922143 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkqdd\" (UniqueName: \"kubernetes.io/projected/b402e4ee-d154-408c-9b02-b5966ebda7f1-kube-api-access-fkqdd\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.922181 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b402e4ee-d154-408c-9b02-b5966ebda7f1-scripts\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:38 crc kubenswrapper[4743]: I1122 10:06:38.922271 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b402e4ee-d154-408c-9b02-b5966ebda7f1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:39 crc kubenswrapper[4743]: I1122 10:06:39.024358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b402e4ee-d154-408c-9b02-b5966ebda7f1-config-data\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:39 crc kubenswrapper[4743]: I1122 10:06:39.024747 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkqdd\" (UniqueName: \"kubernetes.io/projected/b402e4ee-d154-408c-9b02-b5966ebda7f1-kube-api-access-fkqdd\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:39 crc kubenswrapper[4743]: I1122 10:06:39.025244 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b402e4ee-d154-408c-9b02-b5966ebda7f1-scripts\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:39 crc kubenswrapper[4743]: I1122 10:06:39.025997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b402e4ee-d154-408c-9b02-b5966ebda7f1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:39 crc kubenswrapper[4743]: 
I1122 10:06:39.032284 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b402e4ee-d154-408c-9b02-b5966ebda7f1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:39 crc kubenswrapper[4743]: I1122 10:06:39.033201 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b402e4ee-d154-408c-9b02-b5966ebda7f1-scripts\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:39 crc kubenswrapper[4743]: I1122 10:06:39.033759 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b402e4ee-d154-408c-9b02-b5966ebda7f1-config-data\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:39 crc kubenswrapper[4743]: I1122 10:06:39.047730 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkqdd\" (UniqueName: \"kubernetes.io/projected/b402e4ee-d154-408c-9b02-b5966ebda7f1-kube-api-access-fkqdd\") pod \"aodh-0\" (UID: \"b402e4ee-d154-408c-9b02-b5966ebda7f1\") " pod="openstack/aodh-0" Nov 22 10:06:39 crc kubenswrapper[4743]: I1122 10:06:39.191965 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Nov 22 10:06:39 crc kubenswrapper[4743]: W1122 10:06:39.721128 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb402e4ee_d154_408c_9b02_b5966ebda7f1.slice/crio-a656e25b679781ab6fcf1dda245bce5d6e604eb39bbcd841e40373aa0accae65 WatchSource:0}: Error finding container a656e25b679781ab6fcf1dda245bce5d6e604eb39bbcd841e40373aa0accae65: Status 404 returned error can't find the container with id a656e25b679781ab6fcf1dda245bce5d6e604eb39bbcd841e40373aa0accae65 Nov 22 10:06:39 crc kubenswrapper[4743]: I1122 10:06:39.731191 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Nov 22 10:06:39 crc kubenswrapper[4743]: I1122 10:06:39.991942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b402e4ee-d154-408c-9b02-b5966ebda7f1","Type":"ContainerStarted","Data":"a656e25b679781ab6fcf1dda245bce5d6e604eb39bbcd841e40373aa0accae65"} Nov 22 10:06:40 crc kubenswrapper[4743]: I1122 10:06:40.533653 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:06:40 crc kubenswrapper[4743]: I1122 10:06:40.534281 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="ceilometer-central-agent" containerID="cri-o://b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3" gracePeriod=30 Nov 22 10:06:40 crc kubenswrapper[4743]: I1122 10:06:40.534362 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="sg-core" containerID="cri-o://5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e" gracePeriod=30 Nov 22 10:06:40 crc kubenswrapper[4743]: I1122 10:06:40.534343 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="proxy-httpd" 
containerID="cri-o://4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b" gracePeriod=30 Nov 22 10:06:40 crc kubenswrapper[4743]: I1122 10:06:40.534384 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="ceilometer-notification-agent" containerID="cri-o://7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19" gracePeriod=30 Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.010167 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b402e4ee-d154-408c-9b02-b5966ebda7f1","Type":"ContainerStarted","Data":"3cdbc8abcf30e17c1aae65d3a61e944b3456e8df270dbfb7ea66f33cb96f4d47"} Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.014451 4743 generic.go:334] "Generic (PLEG): container finished" podID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerID="4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b" exitCode=0 Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.014485 4743 generic.go:334] "Generic (PLEG): container finished" podID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerID="5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e" exitCode=2 Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.014495 4743 generic.go:334] "Generic (PLEG): container finished" podID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerID="7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19" exitCode=0 Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.014528 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801c1bcb-23de-4bf4-b1b0-774a6188c187","Type":"ContainerDied","Data":"4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b"} Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.014561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801c1bcb-23de-4bf4-b1b0-774a6188c187","Type":"ContainerDied","Data":"5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e"} Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.014587 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801c1bcb-23de-4bf4-b1b0-774a6188c187","Type":"ContainerDied","Data":"7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19"} Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.784508 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.981798 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-run-httpd\") pod \"801c1bcb-23de-4bf4-b1b0-774a6188c187\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.982189 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "801c1bcb-23de-4bf4-b1b0-774a6188c187" (UID: "801c1bcb-23de-4bf4-b1b0-774a6188c187"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.982356 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvqcd\" (UniqueName: \"kubernetes.io/projected/801c1bcb-23de-4bf4-b1b0-774a6188c187-kube-api-access-jvqcd\") pod \"801c1bcb-23de-4bf4-b1b0-774a6188c187\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.982434 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-combined-ca-bundle\") pod \"801c1bcb-23de-4bf4-b1b0-774a6188c187\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.982507 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-sg-core-conf-yaml\") pod \"801c1bcb-23de-4bf4-b1b0-774a6188c187\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.982567 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-scripts\") pod \"801c1bcb-23de-4bf4-b1b0-774a6188c187\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.982706 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-log-httpd\") pod \"801c1bcb-23de-4bf4-b1b0-774a6188c187\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.982807 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-config-data\") pod \"801c1bcb-23de-4bf4-b1b0-774a6188c187\" (UID: \"801c1bcb-23de-4bf4-b1b0-774a6188c187\") " Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.983114 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "801c1bcb-23de-4bf4-b1b0-774a6188c187" (UID: "801c1bcb-23de-4bf4-b1b0-774a6188c187"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.983908 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.983930 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/801c1bcb-23de-4bf4-b1b0-774a6188c187-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.986670 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801c1bcb-23de-4bf4-b1b0-774a6188c187-kube-api-access-jvqcd" (OuterVolumeSpecName: "kube-api-access-jvqcd") pod "801c1bcb-23de-4bf4-b1b0-774a6188c187" (UID: "801c1bcb-23de-4bf4-b1b0-774a6188c187"). InnerVolumeSpecName "kube-api-access-jvqcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:06:41 crc kubenswrapper[4743]: I1122 10:06:41.987788 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-scripts" (OuterVolumeSpecName: "scripts") pod "801c1bcb-23de-4bf4-b1b0-774a6188c187" (UID: "801c1bcb-23de-4bf4-b1b0-774a6188c187"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.031802 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "801c1bcb-23de-4bf4-b1b0-774a6188c187" (UID: "801c1bcb-23de-4bf4-b1b0-774a6188c187"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.031887 4743 generic.go:334] "Generic (PLEG): container finished" podID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerID="b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3" exitCode=0 Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.031953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801c1bcb-23de-4bf4-b1b0-774a6188c187","Type":"ContainerDied","Data":"b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3"} Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.031975 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.032005 4743 scope.go:117] "RemoveContainer" containerID="4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.031982 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"801c1bcb-23de-4bf4-b1b0-774a6188c187","Type":"ContainerDied","Data":"7af5ee585882f63f5814c5bd08415ab90153e59737739a9c9ef3840a2e2de9bc"} Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.035313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b402e4ee-d154-408c-9b02-b5966ebda7f1","Type":"ContainerStarted","Data":"bdce2da0e29c012161cd267e7fd5d29f22f4aac1d2c84d019cde794f289809a3"} Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.054345 4743 scope.go:117] "RemoveContainer" containerID="5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.076206 4743 scope.go:117] "RemoveContainer" containerID="7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.085795 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvqcd\" (UniqueName: \"kubernetes.io/projected/801c1bcb-23de-4bf4-b1b0-774a6188c187-kube-api-access-jvqcd\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.085823 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.085837 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-scripts\") on node \"crc\" 
DevicePath \"\"" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.094710 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "801c1bcb-23de-4bf4-b1b0-774a6188c187" (UID: "801c1bcb-23de-4bf4-b1b0-774a6188c187"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.096399 4743 scope.go:117] "RemoveContainer" containerID="b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.115827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-config-data" (OuterVolumeSpecName: "config-data") pod "801c1bcb-23de-4bf4-b1b0-774a6188c187" (UID: "801c1bcb-23de-4bf4-b1b0-774a6188c187"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.123954 4743 scope.go:117] "RemoveContainer" containerID="4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b" Nov 22 10:06:42 crc kubenswrapper[4743]: E1122 10:06:42.125105 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b\": container with ID starting with 4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b not found: ID does not exist" containerID="4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.125154 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b"} err="failed to get container status \"4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b\": rpc error: code = NotFound desc = could not find container \"4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b\": container with ID starting with 4e112789c4c79c55332174b506214f1245171a79bec7ca9f5d38b07e4e043d2b not found: ID does not exist" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.125177 4743 scope.go:117] "RemoveContainer" containerID="5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e" Nov 22 10:06:42 crc kubenswrapper[4743]: E1122 10:06:42.126855 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e\": container with ID starting with 5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e not found: ID does not exist" containerID="5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.126922 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e"} err="failed to get container status \"5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e\": rpc error: code = NotFound desc = could not find container \"5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e\": container with ID starting with 5ccd8fb2b045a485bc10e184031d089576bf2dafed5ecd16a8b038bf4df1d87e not found: ID does not exist" Nov 22 
10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.126949 4743 scope.go:117] "RemoveContainer" containerID="7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19" Nov 22 10:06:42 crc kubenswrapper[4743]: E1122 10:06:42.128267 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19\": container with ID starting with 7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19 not found: ID does not exist" containerID="7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.128297 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19"} err="failed to get container status \"7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19\": rpc error: code = NotFound desc = could not find container \"7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19\": container with ID starting with 7eb86f1ca12aa119c4fcdd7899d237c46939cca8aa6a0cabbc72b65c08d8de19 not found: ID does not exist" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.128320 4743 scope.go:117] "RemoveContainer" containerID="b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3" Nov 22 10:06:42 crc kubenswrapper[4743]: E1122 10:06:42.128783 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3\": container with ID starting with b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3 not found: ID does not exist" containerID="b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.128800 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3"} err="failed to get container status \"b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3\": rpc error: code = NotFound desc = could not find container \"b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3\": container with ID starting with b90192b813d6255dc30bd6222293e1207af36e0370b02737ddfb6d406f3608c3 not found: ID does not exist" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.187641 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.187663 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801c1bcb-23de-4bf4-b1b0-774a6188c187-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.387838 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.413201 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.430674 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:06:42 crc kubenswrapper[4743]: E1122 10:06:42.431435 4743 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="sg-core" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.431456 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="sg-core" Nov 22 10:06:42 crc kubenswrapper[4743]: E1122 10:06:42.431492 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="ceilometer-notification-agent" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.431502 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="ceilometer-notification-agent" Nov 22 10:06:42 crc kubenswrapper[4743]: E1122 10:06:42.431544 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="ceilometer-central-agent" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.431553 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="ceilometer-central-agent" Nov 22 10:06:42 crc kubenswrapper[4743]: E1122 10:06:42.431598 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="proxy-httpd" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.431608 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="proxy-httpd" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.431923 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="ceilometer-notification-agent" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.431955 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="proxy-httpd" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.431972 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="ceilometer-central-agent" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.431985 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" containerName="sg-core" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.442496 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.442765 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
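
The "ContainerStatus from runtime service failed" NotFound errors a few entries back are a benign race: the kubelet asks the runtime for the status of containers it has just removed, receives gRPC NotFound, logs the "DeleteContainer returned error" line, and moves on, because "already gone" is success for a removal. The RemoveStaleState and "Deleted CPUSet assignment" entries above are the CPU and memory managers dropping per-container accounting for the old ceilometer-0 pod UID before the replacement is admitted. A sketch of the tolerate-NotFound idiom, using the real google.golang.org/grpc status and codes packages (removeContainer here is a hypothetical stand-in for the CRI call):

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer stands in for a CRI RemoveContainer RPC that races
// with a removal that already happened.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound, "could not find container %q", id)
}

// removeIfPresent treats NotFound as success: the container is already gone.
func removeIfPresent(id string) error {
	if err := removeContainer(id); status.Code(err) != codes.NotFound {
		return err
	}
	return nil
}

func main() {
	fmt.Println(removeIfPresent("4e112789c4c7")) // <nil>
}
```
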
Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.447853 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.447983 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.595134 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.595532 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-run-httpd\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.595678 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px2gc\" (UniqueName: \"kubernetes.io/projected/35cc2975-92ac-45da-b001-d7c40ffdb8fa-kube-api-access-px2gc\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.595736 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-log-httpd\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.595775 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.595917 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-config-data\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.596237 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-scripts\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.698711 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-run-httpd\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.699165 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-run-httpd\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.699327 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px2gc\" (UniqueName: \"kubernetes.io/projected/35cc2975-92ac-45da-b001-d7c40ffdb8fa-kube-api-access-px2gc\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.699714 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-log-httpd\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.699972 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-log-httpd\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.700011 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.700038 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-config-data\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.700090 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-scripts\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.700143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.707198 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.711635 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.722305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-config-data\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.723541 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-scripts\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.725370 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px2gc\" (UniqueName: \"kubernetes.io/projected/35cc2975-92ac-45da-b001-d7c40ffdb8fa-kube-api-access-px2gc\") pod \"ceilometer-0\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " pod="openstack/ceilometer-0" Nov 22 10:06:42 crc kubenswrapper[4743]: I1122 10:06:42.761933 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:06:43 crc kubenswrapper[4743]: I1122 10:06:43.030498 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fgqpl"] Nov 22 10:06:43 crc kubenswrapper[4743]: I1122 10:06:43.038828 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fgqpl"] Nov 22 10:06:43 crc kubenswrapper[4743]: I1122 10:06:43.180748 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61988351-c22b-4c17-ad07-e5fdfd3edea0" path="/var/lib/kubelet/pods/61988351-c22b-4c17-ad07-e5fdfd3edea0/volumes" Nov 22 10:06:43 crc kubenswrapper[4743]: I1122 10:06:43.182139 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801c1bcb-23de-4bf4-b1b0-774a6188c187" path="/var/lib/kubelet/pods/801c1bcb-23de-4bf4-b1b0-774a6188c187/volumes" Nov 22 10:06:43 crc kubenswrapper[4743]: I1122 10:06:43.485700 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:06:43 crc kubenswrapper[4743]: W1122 10:06:43.488863 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35cc2975_92ac_45da_b001_d7c40ffdb8fa.slice/crio-78ea778c2416eaa8d6b3d9b7fe3e71abf1ad7827b079054307ca9f92e3601c0b WatchSource:0}: Error finding container 78ea778c2416eaa8d6b3d9b7fe3e71abf1ad7827b079054307ca9f92e3601c0b: Status 404 returned error can't find the container with id 78ea778c2416eaa8d6b3d9b7fe3e71abf1ad7827b079054307ca9f92e3601c0b Nov 22 10:06:44 crc kubenswrapper[4743]: I1122 10:06:44.062721 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35cc2975-92ac-45da-b001-d7c40ffdb8fa","Type":"ContainerStarted","Data":"78ea778c2416eaa8d6b3d9b7fe3e71abf1ad7827b079054307ca9f92e3601c0b"} Nov 22 10:06:44 crc kubenswrapper[4743]: I1122 10:06:44.065598 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b402e4ee-d154-408c-9b02-b5966ebda7f1","Type":"ContainerStarted","Data":"bb7ff8956c720eb44f298d8d7adab9e776a2d7c1c92e88880c327f5862815d90"} Nov 22 10:06:45 crc kubenswrapper[4743]: I1122 10:06:45.085859 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35cc2975-92ac-45da-b001-d7c40ffdb8fa","Type":"ContainerStarted","Data":"1bab0c5f048eb28a1d5b09723635391121840be3358353f90be2c92c543ac44b"} Nov 22 10:06:46 crc kubenswrapper[4743]: I1122 10:06:46.096057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"35cc2975-92ac-45da-b001-d7c40ffdb8fa","Type":"ContainerStarted","Data":"50d71dacbea76e8a7c160a3ce24d9a9cca5f46778f51a251277c0e6d91b94f66"} Nov 22 10:06:46 crc kubenswrapper[4743]: I1122 10:06:46.099304 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"b402e4ee-d154-408c-9b02-b5966ebda7f1","Type":"ContainerStarted","Data":"bdb26b8b4e8118657e8753e07b02ddbc698767cafd138885f28153c5f5e786bf"} Nov 22 10:06:46 crc kubenswrapper[4743]: I1122 10:06:46.119520 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.031812973 podStartE2EDuration="8.119497099s" podCreationTimestamp="2025-11-22 10:06:38 +0000 UTC" firstStartedPulling="2025-11-22 10:06:39.725140001 +0000 UTC m=+6273.431501053" lastFinishedPulling="2025-11-22 10:06:44.812824127 +0000 UTC m=+6278.519185179" observedRunningTime="2025-11-22 10:06:46.116596705 +0000 UTC m=+6279.822957757" watchObservedRunningTime="2025-11-22 10:06:46.119497099 +0000 UTC m=+6279.825858161" Nov 22 10:06:47 crc kubenswrapper[4743]: I1122 10:06:47.165290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35cc2975-92ac-45da-b001-d7c40ffdb8fa","Type":"ContainerStarted","Data":"286ff6d7d709ec4fc4c9d197d1bfd5049f705c9389b082d7c36129ab0783c901"} Nov 22 10:06:49 crc kubenswrapper[4743]: I1122 10:06:49.176894 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35cc2975-92ac-45da-b001-d7c40ffdb8fa","Type":"ContainerStarted","Data":"0dd8fd6a384d89e03c514fca98b9d2817c709f5eae051b724a7c5ec55a8a9c0f"} Nov 22 10:06:49 crc kubenswrapper[4743]: I1122 10:06:49.177452 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 10:06:49 crc kubenswrapper[4743]: I1122 10:06:49.205204 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.28951776 podStartE2EDuration="7.205183174s" podCreationTimestamp="2025-11-22 10:06:42 +0000 UTC" firstStartedPulling="2025-11-22 10:06:43.490968368 +0000 UTC m=+6277.197329420" lastFinishedPulling="2025-11-22 10:06:48.406633782 +0000 UTC m=+6282.112994834" observedRunningTime="2025-11-22 10:06:49.19632137 +0000 UTC m=+6282.902682422" watchObservedRunningTime="2025-11-22 10:06:49.205183174 +0000 UTC m=+6282.911544226" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.758706 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-qpjd6"] Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.761736 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-qpjd6" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.762343 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-operator-scripts\") pod \"manila-db-create-qpjd6\" (UID: \"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4\") " pod="openstack/manila-db-create-qpjd6" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.762424 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbncn\" (UniqueName: \"kubernetes.io/projected/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-kube-api-access-mbncn\") pod \"manila-db-create-qpjd6\" (UID: \"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4\") " pod="openstack/manila-db-create-qpjd6" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.777465 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-qpjd6"] Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.864466 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-operator-scripts\") pod \"manila-db-create-qpjd6\" (UID: \"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4\") " pod="openstack/manila-db-create-qpjd6" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.873625 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbncn\" (UniqueName: \"kubernetes.io/projected/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-kube-api-access-mbncn\") pod \"manila-db-create-qpjd6\" (UID: \"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4\") " pod="openstack/manila-db-create-qpjd6" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.875317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-operator-scripts\") pod \"manila-db-create-qpjd6\" (UID: \"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4\") " pod="openstack/manila-db-create-qpjd6" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.882811 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-3f39-account-create-2x5s8"] Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.884290 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3f39-account-create-2x5s8" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.887515 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.903702 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbncn\" (UniqueName: \"kubernetes.io/projected/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-kube-api-access-mbncn\") pod \"manila-db-create-qpjd6\" (UID: \"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4\") " pod="openstack/manila-db-create-qpjd6" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.910345 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3f39-account-create-2x5s8"] Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.976087 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a118f6bb-d0e9-4297-85f5-b579dc740759-operator-scripts\") pod \"manila-3f39-account-create-2x5s8\" (UID: \"a118f6bb-d0e9-4297-85f5-b579dc740759\") " pod="openstack/manila-3f39-account-create-2x5s8" Nov 22 10:06:53 crc kubenswrapper[4743]: I1122 10:06:53.976308 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnzc7\" (UniqueName: \"kubernetes.io/projected/a118f6bb-d0e9-4297-85f5-b579dc740759-kube-api-access-rnzc7\") pod \"manila-3f39-account-create-2x5s8\" (UID: \"a118f6bb-d0e9-4297-85f5-b579dc740759\") " pod="openstack/manila-3f39-account-create-2x5s8" Nov 22 10:06:54 crc kubenswrapper[4743]: I1122 10:06:54.077737 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnzc7\" (UniqueName: \"kubernetes.io/projected/a118f6bb-d0e9-4297-85f5-b579dc740759-kube-api-access-rnzc7\") pod \"manila-3f39-account-create-2x5s8\" (UID: \"a118f6bb-d0e9-4297-85f5-b579dc740759\") " pod="openstack/manila-3f39-account-create-2x5s8" Nov 22 10:06:54 crc kubenswrapper[4743]: I1122 10:06:54.077815 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a118f6bb-d0e9-4297-85f5-b579dc740759-operator-scripts\") pod \"manila-3f39-account-create-2x5s8\" (UID: \"a118f6bb-d0e9-4297-85f5-b579dc740759\") " pod="openstack/manila-3f39-account-create-2x5s8" Nov 22 10:06:54 crc kubenswrapper[4743]: I1122 10:06:54.078505 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a118f6bb-d0e9-4297-85f5-b579dc740759-operator-scripts\") pod \"manila-3f39-account-create-2x5s8\" (UID: \"a118f6bb-d0e9-4297-85f5-b579dc740759\") " pod="openstack/manila-3f39-account-create-2x5s8" Nov 22 10:06:54 crc kubenswrapper[4743]: I1122 10:06:54.093264 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnzc7\" (UniqueName: \"kubernetes.io/projected/a118f6bb-d0e9-4297-85f5-b579dc740759-kube-api-access-rnzc7\") pod \"manila-3f39-account-create-2x5s8\" (UID: \"a118f6bb-d0e9-4297-85f5-b579dc740759\") " pod="openstack/manila-3f39-account-create-2x5s8" Nov 22 10:06:54 crc kubenswrapper[4743]: I1122 10:06:54.131446 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-qpjd6" Nov 22 10:06:54 crc kubenswrapper[4743]: I1122 10:06:54.261379 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3f39-account-create-2x5s8" Nov 22 10:06:54 crc kubenswrapper[4743]: I1122 10:06:54.678131 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-qpjd6"] Nov 22 10:06:54 crc kubenswrapper[4743]: W1122 10:06:54.688257 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e8e40fc_5b7d_40dd_81a9_3e230f4db2b4.slice/crio-11baebc1738d23d8625b857eaf5fdd3e236c450d9585f51dd03a6d734fe210ae WatchSource:0}: Error finding container 11baebc1738d23d8625b857eaf5fdd3e236c450d9585f51dd03a6d734fe210ae: Status 404 returned error can't find the container with id 11baebc1738d23d8625b857eaf5fdd3e236c450d9585f51dd03a6d734fe210ae Nov 22 10:06:54 crc kubenswrapper[4743]: I1122 10:06:54.794989 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-3f39-account-create-2x5s8"] Nov 22 10:06:54 crc kubenswrapper[4743]: W1122 10:06:54.797827 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda118f6bb_d0e9_4297_85f5_b579dc740759.slice/crio-d495919d39e908e34950ac938f08e7858b36d4e39db6ecdf76f93f21e527c037 WatchSource:0}: Error finding container d495919d39e908e34950ac938f08e7858b36d4e39db6ecdf76f93f21e527c037: Status 404 returned error can't find the container with id d495919d39e908e34950ac938f08e7858b36d4e39db6ecdf76f93f21e527c037 Nov 22 10:06:55 crc kubenswrapper[4743]: I1122 10:06:55.240449 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-qpjd6" event={"ID":"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4","Type":"ContainerStarted","Data":"ceca97e764fa576d858225fcf3bbdd07291418b769b76169581d4f154d20487e"} Nov 22 10:06:55 crc kubenswrapper[4743]: I1122 10:06:55.240806 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-qpjd6" event={"ID":"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4","Type":"ContainerStarted","Data":"11baebc1738d23d8625b857eaf5fdd3e236c450d9585f51dd03a6d734fe210ae"} Nov 22 10:06:55 crc kubenswrapper[4743]: I1122 10:06:55.243990 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3f39-account-create-2x5s8" event={"ID":"a118f6bb-d0e9-4297-85f5-b579dc740759","Type":"ContainerStarted","Data":"a14f738a065f4c7defdb2f542e6fe42fa6b15240b1e72be98929afc8a13777ac"} Nov 22 10:06:55 crc kubenswrapper[4743]: I1122 10:06:55.244066 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3f39-account-create-2x5s8" event={"ID":"a118f6bb-d0e9-4297-85f5-b579dc740759","Type":"ContainerStarted","Data":"d495919d39e908e34950ac938f08e7858b36d4e39db6ecdf76f93f21e527c037"} Nov 22 10:06:55 crc kubenswrapper[4743]: I1122 10:06:55.267054 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-create-qpjd6" podStartSLOduration=2.267036079 podStartE2EDuration="2.267036079s" podCreationTimestamp="2025-11-22 10:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:06:55.25835115 +0000 UTC m=+6288.964712212" watchObservedRunningTime="2025-11-22 10:06:55.267036079 +0000 UTC m=+6288.973397131" Nov 22 10:06:55 crc kubenswrapper[4743]: I1122 10:06:55.285165 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-3f39-account-create-2x5s8" podStartSLOduration=2.2851471500000002 
podStartE2EDuration="2.28514715s" podCreationTimestamp="2025-11-22 10:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:06:55.272491656 +0000 UTC m=+6288.978852708" watchObservedRunningTime="2025-11-22 10:06:55.28514715 +0000 UTC m=+6288.991508202" Nov 22 10:06:56 crc kubenswrapper[4743]: I1122 10:06:56.253447 4743 generic.go:334] "Generic (PLEG): container finished" podID="1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4" containerID="ceca97e764fa576d858225fcf3bbdd07291418b769b76169581d4f154d20487e" exitCode=0 Nov 22 10:06:56 crc kubenswrapper[4743]: I1122 10:06:56.253506 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-qpjd6" event={"ID":"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4","Type":"ContainerDied","Data":"ceca97e764fa576d858225fcf3bbdd07291418b769b76169581d4f154d20487e"} Nov 22 10:06:56 crc kubenswrapper[4743]: I1122 10:06:56.257068 4743 generic.go:334] "Generic (PLEG): container finished" podID="a118f6bb-d0e9-4297-85f5-b579dc740759" containerID="a14f738a065f4c7defdb2f542e6fe42fa6b15240b1e72be98929afc8a13777ac" exitCode=0 Nov 22 10:06:56 crc kubenswrapper[4743]: I1122 10:06:56.257121 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3f39-account-create-2x5s8" event={"ID":"a118f6bb-d0e9-4297-85f5-b579dc740759","Type":"ContainerDied","Data":"a14f738a065f4c7defdb2f542e6fe42fa6b15240b1e72be98929afc8a13777ac"} Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.734050 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-qpjd6" Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.742831 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-3f39-account-create-2x5s8" Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.757303 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-operator-scripts\") pod \"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4\" (UID: \"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4\") " Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.757801 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a118f6bb-d0e9-4297-85f5-b579dc740759-operator-scripts\") pod \"a118f6bb-d0e9-4297-85f5-b579dc740759\" (UID: \"a118f6bb-d0e9-4297-85f5-b579dc740759\") " Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.757936 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnzc7\" (UniqueName: \"kubernetes.io/projected/a118f6bb-d0e9-4297-85f5-b579dc740759-kube-api-access-rnzc7\") pod \"a118f6bb-d0e9-4297-85f5-b579dc740759\" (UID: \"a118f6bb-d0e9-4297-85f5-b579dc740759\") " Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.758090 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbncn\" (UniqueName: \"kubernetes.io/projected/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-kube-api-access-mbncn\") pod \"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4\" (UID: \"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4\") " Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.758306 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a118f6bb-d0e9-4297-85f5-b579dc740759-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a118f6bb-d0e9-4297-85f5-b579dc740759" (UID: "a118f6bb-d0e9-4297-85f5-b579dc740759"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.758317 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4" (UID: "1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.758830 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a118f6bb-d0e9-4297-85f5-b579dc740759-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.758860 4743 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.763456 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-kube-api-access-mbncn" (OuterVolumeSpecName: "kube-api-access-mbncn") pod "1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4" (UID: "1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4"). InnerVolumeSpecName "kube-api-access-mbncn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.764486 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a118f6bb-d0e9-4297-85f5-b579dc740759-kube-api-access-rnzc7" (OuterVolumeSpecName: "kube-api-access-rnzc7") pod "a118f6bb-d0e9-4297-85f5-b579dc740759" (UID: "a118f6bb-d0e9-4297-85f5-b579dc740759"). InnerVolumeSpecName "kube-api-access-rnzc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.860833 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnzc7\" (UniqueName: \"kubernetes.io/projected/a118f6bb-d0e9-4297-85f5-b579dc740759-kube-api-access-rnzc7\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:57 crc kubenswrapper[4743]: I1122 10:06:57.860882 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbncn\" (UniqueName: \"kubernetes.io/projected/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4-kube-api-access-mbncn\") on node \"crc\" DevicePath \"\"" Nov 22 10:06:58 crc kubenswrapper[4743]: I1122 10:06:58.285985 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-3f39-account-create-2x5s8" event={"ID":"a118f6bb-d0e9-4297-85f5-b579dc740759","Type":"ContainerDied","Data":"d495919d39e908e34950ac938f08e7858b36d4e39db6ecdf76f93f21e527c037"} Nov 22 10:06:58 crc kubenswrapper[4743]: I1122 10:06:58.286317 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d495919d39e908e34950ac938f08e7858b36d4e39db6ecdf76f93f21e527c037" Nov 22 10:06:58 crc kubenswrapper[4743]: I1122 10:06:58.285986 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-3f39-account-create-2x5s8" Nov 22 10:06:58 crc kubenswrapper[4743]: I1122 10:06:58.291683 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-qpjd6" Nov 22 10:06:58 crc kubenswrapper[4743]: I1122 10:06:58.291703 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-qpjd6" event={"ID":"1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4","Type":"ContainerDied","Data":"11baebc1738d23d8625b857eaf5fdd3e236c450d9585f51dd03a6d734fe210ae"} Nov 22 10:06:58 crc kubenswrapper[4743]: I1122 10:06:58.291750 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11baebc1738d23d8625b857eaf5fdd3e236c450d9585f51dd03a6d734fe210ae" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.259662 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-xthfh"] Nov 22 10:06:59 crc kubenswrapper[4743]: E1122 10:06:59.260113 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a118f6bb-d0e9-4297-85f5-b579dc740759" containerName="mariadb-account-create" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.260127 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a118f6bb-d0e9-4297-85f5-b579dc740759" containerName="mariadb-account-create" Nov 22 10:06:59 crc kubenswrapper[4743]: E1122 10:06:59.260136 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4" containerName="mariadb-database-create" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.260142 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4" containerName="mariadb-database-create" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.260359 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4" containerName="mariadb-database-create" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.260380 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a118f6bb-d0e9-4297-85f5-b579dc740759" containerName="mariadb-account-create" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.261391 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.263605 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.263834 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-6z6jf" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.275611 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-xthfh"] Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.287563 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-config-data\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.287653 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-job-config-data\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.287731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-combined-ca-bundle\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.287758 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7n2j\" (UniqueName: \"kubernetes.io/projected/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-kube-api-access-s7n2j\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.389228 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-combined-ca-bundle\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.389790 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7n2j\" (UniqueName: \"kubernetes.io/projected/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-kube-api-access-s7n2j\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.389915 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-config-data\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.389948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-job-config-data\") pod \"manila-db-sync-xthfh\" (UID: 
\"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.396022 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-combined-ca-bundle\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.401702 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-config-data\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.403042 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-job-config-data\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.405058 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7n2j\" (UniqueName: \"kubernetes.io/projected/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-kube-api-access-s7n2j\") pod \"manila-db-sync-xthfh\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " pod="openstack/manila-db-sync-xthfh" Nov 22 10:06:59 crc kubenswrapper[4743]: I1122 10:06:59.595698 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-xthfh" Nov 22 10:07:00 crc kubenswrapper[4743]: I1122 10:07:00.417855 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-xthfh"] Nov 22 10:07:00 crc kubenswrapper[4743]: W1122 10:07:00.425497 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-80ef48719e05ebd85838dd60cfd626fa63c68fbcb3fce45d3c074f1f09654aaf WatchSource:0}: Error finding container 80ef48719e05ebd85838dd60cfd626fa63c68fbcb3fce45d3c074f1f09654aaf: Status 404 returned error can't find the container with id 80ef48719e05ebd85838dd60cfd626fa63c68fbcb3fce45d3c074f1f09654aaf Nov 22 10:07:01 crc kubenswrapper[4743]: I1122 10:07:01.241642 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:07:01 crc kubenswrapper[4743]: I1122 10:07:01.242382 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:07:01 crc kubenswrapper[4743]: I1122 10:07:01.242467 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 10:07:01 crc kubenswrapper[4743]: I1122 10:07:01.243989 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"6b404b343b65af21aff6649872514335dd4038e7612309804bf70aeba3bcb920"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:07:01 crc kubenswrapper[4743]: I1122 10:07:01.244119 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://6b404b343b65af21aff6649872514335dd4038e7612309804bf70aeba3bcb920" gracePeriod=600 Nov 22 10:07:01 crc kubenswrapper[4743]: I1122 10:07:01.323871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-xthfh" event={"ID":"a4ae49b4-51e1-4ade-a974-4ff8c96ab104","Type":"ContainerStarted","Data":"80ef48719e05ebd85838dd60cfd626fa63c68fbcb3fce45d3c074f1f09654aaf"} Nov 22 10:07:02 crc kubenswrapper[4743]: I1122 10:07:02.338920 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="6b404b343b65af21aff6649872514335dd4038e7612309804bf70aeba3bcb920" exitCode=0 Nov 22 10:07:02 crc kubenswrapper[4743]: I1122 10:07:02.339142 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"6b404b343b65af21aff6649872514335dd4038e7612309804bf70aeba3bcb920"} Nov 22 10:07:02 crc kubenswrapper[4743]: I1122 10:07:02.339764 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8"} Nov 22 10:07:02 crc kubenswrapper[4743]: I1122 10:07:02.339810 4743 scope.go:117] "RemoveContainer" containerID="fd7baf739caaffc5109b2cb11d7f34b3aef83fb58b3f8ff7d273785b162e7b68" Nov 22 10:07:06 crc kubenswrapper[4743]: I1122 10:07:06.397197 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-xthfh" event={"ID":"a4ae49b4-51e1-4ade-a974-4ff8c96ab104","Type":"ContainerStarted","Data":"b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1"} Nov 22 10:07:06 crc kubenswrapper[4743]: I1122 10:07:06.419215 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-xthfh" podStartSLOduration=2.741906411 podStartE2EDuration="7.419195485s" podCreationTimestamp="2025-11-22 10:06:59 +0000 UTC" firstStartedPulling="2025-11-22 10:07:00.428593217 +0000 UTC m=+6294.134954269" lastFinishedPulling="2025-11-22 10:07:05.105882291 +0000 UTC m=+6298.812243343" observedRunningTime="2025-11-22 10:07:06.414114259 +0000 UTC m=+6300.120475321" watchObservedRunningTime="2025-11-22 10:07:06.419195485 +0000 UTC m=+6300.125556537" Nov 22 10:07:07 crc kubenswrapper[4743]: I1122 10:07:07.407838 4743 generic.go:334] "Generic (PLEG): container finished" podID="a4ae49b4-51e1-4ade-a974-4ff8c96ab104" containerID="b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1" exitCode=0 Nov 22 10:07:07 crc kubenswrapper[4743]: I1122 10:07:07.407931 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-xthfh" 
event={"ID":"a4ae49b4-51e1-4ade-a974-4ff8c96ab104","Type":"ContainerDied","Data":"b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1"} Nov 22 10:07:08 crc kubenswrapper[4743]: E1122 10:07:08.514611 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-conmon-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache]" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.050460 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-xthfh" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.190373 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-config-data\") pod \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.190804 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-job-config-data\") pod \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.190997 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7n2j\" (UniqueName: \"kubernetes.io/projected/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-kube-api-access-s7n2j\") pod \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.191097 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-combined-ca-bundle\") pod \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\" (UID: \"a4ae49b4-51e1-4ade-a974-4ff8c96ab104\") " Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.196106 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "a4ae49b4-51e1-4ade-a974-4ff8c96ab104" (UID: "a4ae49b4-51e1-4ade-a974-4ff8c96ab104"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.196156 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-kube-api-access-s7n2j" (OuterVolumeSpecName: "kube-api-access-s7n2j") pod "a4ae49b4-51e1-4ade-a974-4ff8c96ab104" (UID: "a4ae49b4-51e1-4ade-a974-4ff8c96ab104"). InnerVolumeSpecName "kube-api-access-s7n2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.200315 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-config-data" (OuterVolumeSpecName: "config-data") pod "a4ae49b4-51e1-4ade-a974-4ff8c96ab104" (UID: "a4ae49b4-51e1-4ade-a974-4ff8c96ab104"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.228254 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4ae49b4-51e1-4ade-a974-4ff8c96ab104" (UID: "a4ae49b4-51e1-4ade-a974-4ff8c96ab104"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.293953 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.293982 4743 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-job-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.293993 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7n2j\" (UniqueName: \"kubernetes.io/projected/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-kube-api-access-s7n2j\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.294003 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4ae49b4-51e1-4ade-a974-4ff8c96ab104-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.428435 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-xthfh" event={"ID":"a4ae49b4-51e1-4ade-a974-4ff8c96ab104","Type":"ContainerDied","Data":"80ef48719e05ebd85838dd60cfd626fa63c68fbcb3fce45d3c074f1f09654aaf"} Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.428473 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ef48719e05ebd85838dd60cfd626fa63c68fbcb3fce45d3c074f1f09654aaf" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.428505 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-xthfh" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.702432 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 10:07:09 crc kubenswrapper[4743]: E1122 10:07:09.703064 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4ae49b4-51e1-4ade-a974-4ff8c96ab104" containerName="manila-db-sync" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.703075 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4ae49b4-51e1-4ade-a974-4ff8c96ab104" containerName="manila-db-sync" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.703283 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4ae49b4-51e1-4ade-a974-4ff8c96ab104" containerName="manila-db-sync" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.704379 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.706760 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.707092 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-6z6jf" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.708686 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.712280 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.716472 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.797733 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.805784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/053bd588-f677-48d5-b22d-93b3a70e8c4c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.805844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-config-data\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.805973 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-scripts\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.806017 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmw8d\" (UniqueName: \"kubernetes.io/projected/053bd588-f677-48d5-b22d-93b3a70e8c4c-kube-api-access-xmw8d\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 
10:07:09.806058 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.806093 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.828452 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.828570 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.835267 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.895323 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55586cc989-mzvsh"] Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.900252 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.913822 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-scripts\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.913913 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-scripts\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.913934 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d8vz\" (UniqueName: \"kubernetes.io/projected/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-kube-api-access-9d8vz\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.913956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmw8d\" (UniqueName: \"kubernetes.io/projected/053bd588-f677-48d5-b22d-93b3a70e8c4c-kube-api-access-xmw8d\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.913989 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:09 crc kubenswrapper[4743]: 
I1122 10:07:09.914011 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.914042 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.914221 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/053bd588-f677-48d5-b22d-93b3a70e8c4c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.914268 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.914300 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-config-data\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.914561 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-config-data\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.914611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.914635 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.914722 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-ceph\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.919016 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/053bd588-f677-48d5-b22d-93b3a70e8c4c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.923168 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-scripts\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.925080 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-config-data\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.925552 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.927256 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55586cc989-mzvsh"] Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.929694 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/053bd588-f677-48d5-b22d-93b3a70e8c4c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:09 crc kubenswrapper[4743]: I1122 10:07:09.942086 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmw8d\" (UniqueName: \"kubernetes.io/projected/053bd588-f677-48d5-b22d-93b3a70e8c4c-kube-api-access-xmw8d\") pod \"manila-scheduler-0\" (UID: \"053bd588-f677-48d5-b22d-93b3a70e8c4c\") " pod="openstack/manila-scheduler-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.019741 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020091 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-config-data\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020130 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-ceph\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-sb\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020257 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4zhh\" (UniqueName: \"kubernetes.io/projected/94e9031c-368e-4df8-98ed-2dff38276a65-kube-api-access-b4zhh\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020285 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-scripts\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020301 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d8vz\" (UniqueName: \"kubernetes.io/projected/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-kube-api-access-9d8vz\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020323 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-dns-svc\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 
10:07:10.020350 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020410 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020440 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-nb\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020461 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-config\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.020719 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.021037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.026266 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-scripts\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.026294 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.026496 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-ceph\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.027208 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-config-data\") pod 
\"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.030594 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.039105 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d8vz\" (UniqueName: \"kubernetes.io/projected/1e8ba118-c440-473d-a783-ff6a8e2e8ee5-kube-api-access-9d8vz\") pod \"manila-share-share1-0\" (UID: \"1e8ba118-c440-473d-a783-ff6a8e2e8ee5\") " pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.104866 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.110703 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.117010 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.128988 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-nb\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.129027 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-config\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.129140 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-sb\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.129180 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4zhh\" (UniqueName: \"kubernetes.io/projected/94e9031c-368e-4df8-98ed-2dff38276a65-kube-api-access-b4zhh\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.129211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-dns-svc\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.130269 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-dns-svc\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: 
\"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.130839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-nb\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.131384 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-config\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.131805 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.131895 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-sb\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.158326 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4zhh\" (UniqueName: \"kubernetes.io/projected/94e9031c-368e-4df8-98ed-2dff38276a65-kube-api-access-b4zhh\") pod \"dnsmasq-dns-55586cc989-mzvsh\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.192625 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.230461 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.231532 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221a89ef-fff9-464c-a3db-61deeb85a20b-logs\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.231680 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46x95\" (UniqueName: \"kubernetes.io/projected/221a89ef-fff9-464c-a3db-61deeb85a20b-kube-api-access-46x95\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.231717 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/221a89ef-fff9-464c-a3db-61deeb85a20b-etc-machine-id\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.231746 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.232005 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-config-data-custom\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.232062 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-config-data\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.232142 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-scripts\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.334898 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46x95\" (UniqueName: \"kubernetes.io/projected/221a89ef-fff9-464c-a3db-61deeb85a20b-kube-api-access-46x95\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.334965 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/221a89ef-fff9-464c-a3db-61deeb85a20b-etc-machine-id\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.334984 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.335028 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-config-data-custom\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.335072 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-config-data\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.335162 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-scripts\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.335224 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221a89ef-fff9-464c-a3db-61deeb85a20b-logs\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.338963 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/221a89ef-fff9-464c-a3db-61deeb85a20b-etc-machine-id\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.344047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-scripts\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.344258 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/221a89ef-fff9-464c-a3db-61deeb85a20b-logs\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.346646 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.349510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-config-data-custom\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.361450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/221a89ef-fff9-464c-a3db-61deeb85a20b-config-data\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.364298 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46x95\" (UniqueName: \"kubernetes.io/projected/221a89ef-fff9-464c-a3db-61deeb85a20b-kube-api-access-46x95\") pod \"manila-api-0\" (UID: \"221a89ef-fff9-464c-a3db-61deeb85a20b\") " pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.543406 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.699471 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Nov 22 10:07:10 crc kubenswrapper[4743]: I1122 10:07:10.966876 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Nov 22 10:07:11 crc kubenswrapper[4743]: I1122 10:07:11.067383 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55586cc989-mzvsh"] Nov 22 10:07:11 crc kubenswrapper[4743]: W1122 10:07:11.071945 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94e9031c_368e_4df8_98ed_2dff38276a65.slice/crio-ef821cee2abf29542098bdd235023be5c75dd051d0fcb34a8aba5b5d6480af13 WatchSource:0}: Error finding container ef821cee2abf29542098bdd235023be5c75dd051d0fcb34a8aba5b5d6480af13: Status 404 returned error can't find the container with id ef821cee2abf29542098bdd235023be5c75dd051d0fcb34a8aba5b5d6480af13 Nov 22 10:07:11 crc kubenswrapper[4743]: I1122 10:07:11.280494 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Nov 22 10:07:11 crc kubenswrapper[4743]: I1122 10:07:11.464563 4743 generic.go:334] "Generic (PLEG): container finished" podID="94e9031c-368e-4df8-98ed-2dff38276a65" containerID="8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d" exitCode=0 Nov 22 10:07:11 crc kubenswrapper[4743]: I1122 10:07:11.465048 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" event={"ID":"94e9031c-368e-4df8-98ed-2dff38276a65","Type":"ContainerDied","Data":"8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d"} Nov 22 10:07:11 crc kubenswrapper[4743]: I1122 10:07:11.465084 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" event={"ID":"94e9031c-368e-4df8-98ed-2dff38276a65","Type":"ContainerStarted","Data":"ef821cee2abf29542098bdd235023be5c75dd051d0fcb34a8aba5b5d6480af13"} Nov 22 10:07:11 crc kubenswrapper[4743]: I1122 10:07:11.473261 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"221a89ef-fff9-464c-a3db-61deeb85a20b","Type":"ContainerStarted","Data":"535b0e9ab559890c09ba7b30b696d383d86723551f3fdd8c51f58a0cde5ceafd"} Nov 22 10:07:11 crc kubenswrapper[4743]: I1122 10:07:11.486815 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1e8ba118-c440-473d-a783-ff6a8e2e8ee5","Type":"ContainerStarted","Data":"176982dace0de00501dec6bd159ab44bdabcb13fe90e2f2989e561624be16f07"} Nov 22 10:07:11 crc kubenswrapper[4743]: I1122 10:07:11.502313 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"053bd588-f677-48d5-b22d-93b3a70e8c4c","Type":"ContainerStarted","Data":"fcc92884529b0904cd04c979536dcc2bd0bf77ae70afde71ea40fbd29f97f2dd"} Nov 22 10:07:12 crc kubenswrapper[4743]: I1122 10:07:12.576476 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"221a89ef-fff9-464c-a3db-61deeb85a20b","Type":"ContainerStarted","Data":"1ce2494b5ad6866f005c6fca96bf8461fe8a8cf75b2ce88b963663ac2a30d2b7"} Nov 22 10:07:12 crc kubenswrapper[4743]: I1122 10:07:12.579881 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"053bd588-f677-48d5-b22d-93b3a70e8c4c","Type":"ContainerStarted","Data":"5207ec7b6df2ab2864a5d2853ce2a4101e9c66feae91a58fa9026f0ebff15a30"} Nov 22 10:07:12 crc kubenswrapper[4743]: I1122 10:07:12.579942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"053bd588-f677-48d5-b22d-93b3a70e8c4c","Type":"ContainerStarted","Data":"84914cd80ea3693cb3b40d1a1091245acb53a702882f873fdca1376869c8d64f"} Nov 22 10:07:12 crc kubenswrapper[4743]: I1122 10:07:12.582365 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" event={"ID":"94e9031c-368e-4df8-98ed-2dff38276a65","Type":"ContainerStarted","Data":"4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5"} Nov 22 10:07:12 crc kubenswrapper[4743]: I1122 10:07:12.582483 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:12 crc kubenswrapper[4743]: I1122 10:07:12.600730 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.880484654 podStartE2EDuration="3.600708847s" podCreationTimestamp="2025-11-22 10:07:09 +0000 UTC" firstStartedPulling="2025-11-22 10:07:10.728441554 +0000 UTC m=+6304.434802606" lastFinishedPulling="2025-11-22 10:07:11.448665737 +0000 UTC m=+6305.155026799" observedRunningTime="2025-11-22 10:07:12.597413812 +0000 UTC m=+6306.303774864" watchObservedRunningTime="2025-11-22 10:07:12.600708847 +0000 UTC m=+6306.307069899" Nov 22 10:07:12 crc kubenswrapper[4743]: I1122 10:07:12.628467 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" podStartSLOduration=3.628445254 podStartE2EDuration="3.628445254s" podCreationTimestamp="2025-11-22 10:07:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:07:12.624201052 +0000 UTC m=+6306.330562104" watchObservedRunningTime="2025-11-22 10:07:12.628445254 +0000 UTC m=+6306.334806306" Nov 22 10:07:12 crc kubenswrapper[4743]: I1122 10:07:12.770417 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Nov 22 10:07:13 crc kubenswrapper[4743]: I1122 10:07:13.444051 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:13 crc kubenswrapper[4743]: I1122 10:07:13.601631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"221a89ef-fff9-464c-a3db-61deeb85a20b","Type":"ContainerStarted","Data":"5930dca9b77f0d43c28841663cf2b3e4812e01472a63e6eb0bd059e323cea180"} Nov 22 10:07:13 crc kubenswrapper[4743]: I1122 10:07:13.601679 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="ceilometer-central-agent" containerID="cri-o://1bab0c5f048eb28a1d5b09723635391121840be3358353f90be2c92c543ac44b" gracePeriod=30 Nov 22 10:07:13 crc kubenswrapper[4743]: I1122 10:07:13.601783 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="proxy-httpd" containerID="cri-o://0dd8fd6a384d89e03c514fca98b9d2817c709f5eae051b724a7c5ec55a8a9c0f" gracePeriod=30 Nov 22 10:07:13 crc kubenswrapper[4743]: I1122 10:07:13.601839 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="sg-core" containerID="cri-o://286ff6d7d709ec4fc4c9d197d1bfd5049f705c9389b082d7c36129ab0783c901" gracePeriod=30 Nov 22 10:07:13 crc kubenswrapper[4743]: I1122 10:07:13.601888 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="ceilometer-notification-agent" containerID="cri-o://50d71dacbea76e8a7c160a3ce24d9a9cca5f46778f51a251277c0e6d91b94f66" gracePeriod=30 Nov 22 10:07:13 crc kubenswrapper[4743]: I1122 10:07:13.602570 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Nov 22 10:07:13 crc kubenswrapper[4743]: I1122 10:07:13.625892 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.6258719619999997 podStartE2EDuration="3.625871962s" podCreationTimestamp="2025-11-22 10:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:07:13.624340068 +0000 UTC m=+6307.330701140" watchObservedRunningTime="2025-11-22 10:07:13.625871962 +0000 UTC m=+6307.332233014" Nov 22 10:07:14 crc kubenswrapper[4743]: I1122 10:07:14.618353 4743 generic.go:334] "Generic (PLEG): container finished" podID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerID="0dd8fd6a384d89e03c514fca98b9d2817c709f5eae051b724a7c5ec55a8a9c0f" exitCode=0 Nov 22 10:07:14 crc kubenswrapper[4743]: I1122 10:07:14.618726 4743 generic.go:334] "Generic (PLEG): container finished" podID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerID="286ff6d7d709ec4fc4c9d197d1bfd5049f705c9389b082d7c36129ab0783c901" exitCode=2 Nov 22 10:07:14 crc kubenswrapper[4743]: I1122 10:07:14.618739 4743 generic.go:334] "Generic (PLEG): container finished" podID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerID="1bab0c5f048eb28a1d5b09723635391121840be3358353f90be2c92c543ac44b" exitCode=0 Nov 22 10:07:14 crc kubenswrapper[4743]: I1122 10:07:14.618673 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35cc2975-92ac-45da-b001-d7c40ffdb8fa","Type":"ContainerDied","Data":"0dd8fd6a384d89e03c514fca98b9d2817c709f5eae051b724a7c5ec55a8a9c0f"} Nov 22 10:07:14 crc kubenswrapper[4743]: I1122 10:07:14.618942 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35cc2975-92ac-45da-b001-d7c40ffdb8fa","Type":"ContainerDied","Data":"286ff6d7d709ec4fc4c9d197d1bfd5049f705c9389b082d7c36129ab0783c901"} Nov 22 10:07:14 crc kubenswrapper[4743]: I1122 10:07:14.618973 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"35cc2975-92ac-45da-b001-d7c40ffdb8fa","Type":"ContainerDied","Data":"1bab0c5f048eb28a1d5b09723635391121840be3358353f90be2c92c543ac44b"} Nov 22 10:07:16 crc kubenswrapper[4743]: I1122 10:07:16.676434 4743 generic.go:334] "Generic (PLEG): container finished" podID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerID="50d71dacbea76e8a7c160a3ce24d9a9cca5f46778f51a251277c0e6d91b94f66" exitCode=0 Nov 22 10:07:16 crc kubenswrapper[4743]: I1122 10:07:16.676503 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35cc2975-92ac-45da-b001-d7c40ffdb8fa","Type":"ContainerDied","Data":"50d71dacbea76e8a7c160a3ce24d9a9cca5f46778f51a251277c0e6d91b94f66"} Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.692492 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35cc2975-92ac-45da-b001-d7c40ffdb8fa","Type":"ContainerDied","Data":"78ea778c2416eaa8d6b3d9b7fe3e71abf1ad7827b079054307ca9f92e3601c0b"} Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.693710 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78ea778c2416eaa8d6b3d9b7fe3e71abf1ad7827b079054307ca9f92e3601c0b" Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.862470 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.911927 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-px2gc\" (UniqueName: \"kubernetes.io/projected/35cc2975-92ac-45da-b001-d7c40ffdb8fa-kube-api-access-px2gc\") pod \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.912021 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-sg-core-conf-yaml\") pod \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.912079 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-combined-ca-bundle\") pod \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.912109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-log-httpd\") pod \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.912146 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-scripts\") pod \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.912173 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-config-data\") pod \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 
10:07:17.912266 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-run-httpd\") pod \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\" (UID: \"35cc2975-92ac-45da-b001-d7c40ffdb8fa\") " Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.913108 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "35cc2975-92ac-45da-b001-d7c40ffdb8fa" (UID: "35cc2975-92ac-45da-b001-d7c40ffdb8fa"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.913666 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "35cc2975-92ac-45da-b001-d7c40ffdb8fa" (UID: "35cc2975-92ac-45da-b001-d7c40ffdb8fa"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.916692 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-scripts" (OuterVolumeSpecName: "scripts") pod "35cc2975-92ac-45da-b001-d7c40ffdb8fa" (UID: "35cc2975-92ac-45da-b001-d7c40ffdb8fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.920774 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35cc2975-92ac-45da-b001-d7c40ffdb8fa-kube-api-access-px2gc" (OuterVolumeSpecName: "kube-api-access-px2gc") pod "35cc2975-92ac-45da-b001-d7c40ffdb8fa" (UID: "35cc2975-92ac-45da-b001-d7c40ffdb8fa"). InnerVolumeSpecName "kube-api-access-px2gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:07:17 crc kubenswrapper[4743]: I1122 10:07:17.948388 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "35cc2975-92ac-45da-b001-d7c40ffdb8fa" (UID: "35cc2975-92ac-45da-b001-d7c40ffdb8fa"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.012566 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35cc2975-92ac-45da-b001-d7c40ffdb8fa" (UID: "35cc2975-92ac-45da-b001-d7c40ffdb8fa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.015027 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.015055 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.015064 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.015072 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35cc2975-92ac-45da-b001-d7c40ffdb8fa-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.015082 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-px2gc\" (UniqueName: \"kubernetes.io/projected/35cc2975-92ac-45da-b001-d7c40ffdb8fa-kube-api-access-px2gc\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.015091 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.025880 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-config-data" (OuterVolumeSpecName: "config-data") pod "35cc2975-92ac-45da-b001-d7c40ffdb8fa" (UID: "35cc2975-92ac-45da-b001-d7c40ffdb8fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.116765 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35cc2975-92ac-45da-b001-d7c40ffdb8fa-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.703805 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.704124 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1e8ba118-c440-473d-a783-ff6a8e2e8ee5","Type":"ContainerStarted","Data":"00b05b9578e1ac0770978300fcbf4282a0310d3a2977091bd5183d9cb55939cc"} Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.705133 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"1e8ba118-c440-473d-a783-ff6a8e2e8ee5","Type":"ContainerStarted","Data":"c12a8a409e8895bd028360cfc1532b9c106e98b9f3d19fcc1401f99e4ba13696"} Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.741601 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.0168049 podStartE2EDuration="9.741564191s" podCreationTimestamp="2025-11-22 10:07:09 +0000 UTC" firstStartedPulling="2025-11-22 10:07:10.971410435 +0000 UTC m=+6304.677771487" lastFinishedPulling="2025-11-22 10:07:17.696169716 +0000 UTC m=+6311.402530778" observedRunningTime="2025-11-22 10:07:18.723748 +0000 UTC m=+6312.430109052" watchObservedRunningTime="2025-11-22 10:07:18.741564191 +0000 UTC m=+6312.447925243" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.769958 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.781158 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.818726 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:18 crc kubenswrapper[4743]: E1122 10:07:18.819924 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="ceilometer-notification-agent" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.819952 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="ceilometer-notification-agent" Nov 22 10:07:18 crc kubenswrapper[4743]: E1122 10:07:18.819987 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="sg-core" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.819998 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="sg-core" Nov 22 10:07:18 crc kubenswrapper[4743]: E1122 10:07:18.820027 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="ceilometer-central-agent" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.820037 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="ceilometer-central-agent" Nov 22 10:07:18 crc kubenswrapper[4743]: E1122 10:07:18.820100 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="proxy-httpd" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.820110 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="proxy-httpd" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.820837 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="sg-core" Nov 22 10:07:18 crc 
kubenswrapper[4743]: I1122 10:07:18.820878 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="proxy-httpd" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.820905 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="ceilometer-central-agent" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.820931 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" containerName="ceilometer-notification-agent" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.826450 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.831716 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.832900 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.853728 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:18 crc kubenswrapper[4743]: E1122 10:07:18.877776 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod35cc2975_92ac_45da_b001_d7c40ffdb8fa.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-conmon-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache]" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.937552 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-config-data\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.938115 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-run-httpd\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.938164 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-log-httpd\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.938275 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " 
pod="openstack/ceilometer-0" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.938404 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.938504 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8c9z\" (UniqueName: \"kubernetes.io/projected/c49cf484-8011-4ce4-8185-074116b2326f-kube-api-access-r8c9z\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:18 crc kubenswrapper[4743]: I1122 10:07:18.938542 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-scripts\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.040806 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8c9z\" (UniqueName: \"kubernetes.io/projected/c49cf484-8011-4ce4-8185-074116b2326f-kube-api-access-r8c9z\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.040860 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-scripts\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.040936 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-config-data\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.040997 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-run-httpd\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.041020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-log-httpd\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.041076 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.041163 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.041655 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-log-httpd\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.041658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-run-httpd\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.047303 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.047699 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-config-data\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.047839 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-scripts\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.048047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.058563 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8c9z\" (UniqueName: \"kubernetes.io/projected/c49cf484-8011-4ce4-8185-074116b2326f-kube-api-access-r8c9z\") pod \"ceilometer-0\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.159474 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.164645 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35cc2975-92ac-45da-b001-d7c40ffdb8fa" path="/var/lib/kubelet/pods/35cc2975-92ac-45da-b001-d7c40ffdb8fa/volumes" Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.629155 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:19 crc kubenswrapper[4743]: I1122 10:07:19.714731 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c49cf484-8011-4ce4-8185-074116b2326f","Type":"ContainerStarted","Data":"b6f2c6eebe5204e304147a8bfa09c841d8a7310ff1d577aa1c453e8b3c2b6867"} Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.020791 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.193267 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.231791 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.330873 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-qxqsl"] Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.331108 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" podUID="06eb355f-ad73-42ff-8549-9a2d8a71b5f8" containerName="dnsmasq-dns" containerID="cri-o://286f64e0c4317765d7f018c71499c21348eb7d8936b03824d2a454948f4a7504" gracePeriod=10 Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.730947 4743 generic.go:334] "Generic (PLEG): container finished" podID="06eb355f-ad73-42ff-8549-9a2d8a71b5f8" containerID="286f64e0c4317765d7f018c71499c21348eb7d8936b03824d2a454948f4a7504" exitCode=0 Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.731118 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" event={"ID":"06eb355f-ad73-42ff-8549-9a2d8a71b5f8","Type":"ContainerDied","Data":"286f64e0c4317765d7f018c71499c21348eb7d8936b03824d2a454948f4a7504"} Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.735042 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c49cf484-8011-4ce4-8185-074116b2326f","Type":"ContainerStarted","Data":"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5"} Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.848917 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.885129 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-dns-svc\") pod \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.885248 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-config\") pod \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.885364 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxz7r\" (UniqueName: \"kubernetes.io/projected/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-kube-api-access-pxz7r\") pod \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.885384 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-sb\") pod \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.885429 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-nb\") pod \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\" (UID: \"06eb355f-ad73-42ff-8549-9a2d8a71b5f8\") " Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.904397 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-kube-api-access-pxz7r" (OuterVolumeSpecName: "kube-api-access-pxz7r") pod "06eb355f-ad73-42ff-8549-9a2d8a71b5f8" (UID: "06eb355f-ad73-42ff-8549-9a2d8a71b5f8"). InnerVolumeSpecName "kube-api-access-pxz7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.948681 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-config" (OuterVolumeSpecName: "config") pod "06eb355f-ad73-42ff-8549-9a2d8a71b5f8" (UID: "06eb355f-ad73-42ff-8549-9a2d8a71b5f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.968258 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06eb355f-ad73-42ff-8549-9a2d8a71b5f8" (UID: "06eb355f-ad73-42ff-8549-9a2d8a71b5f8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.988359 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-config\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.988426 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxz7r\" (UniqueName: \"kubernetes.io/projected/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-kube-api-access-pxz7r\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.988437 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:20 crc kubenswrapper[4743]: I1122 10:07:20.990175 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06eb355f-ad73-42ff-8549-9a2d8a71b5f8" (UID: "06eb355f-ad73-42ff-8549-9a2d8a71b5f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:07:21 crc kubenswrapper[4743]: I1122 10:07:21.000035 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06eb355f-ad73-42ff-8549-9a2d8a71b5f8" (UID: "06eb355f-ad73-42ff-8549-9a2d8a71b5f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:07:21 crc kubenswrapper[4743]: I1122 10:07:21.090321 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:21 crc kubenswrapper[4743]: I1122 10:07:21.090365 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06eb355f-ad73-42ff-8549-9a2d8a71b5f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:21 crc kubenswrapper[4743]: I1122 10:07:21.746612 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" event={"ID":"06eb355f-ad73-42ff-8549-9a2d8a71b5f8","Type":"ContainerDied","Data":"d248483764fc1c420e6fda4dd9b9eef1dbdd40fe72223d9986b6c5dfaa52ee7f"} Nov 22 10:07:21 crc kubenswrapper[4743]: I1122 10:07:21.746964 4743 scope.go:117] "RemoveContainer" containerID="286f64e0c4317765d7f018c71499c21348eb7d8936b03824d2a454948f4a7504" Nov 22 10:07:21 crc kubenswrapper[4743]: I1122 10:07:21.747139 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dc68b6c7-qxqsl" Nov 22 10:07:21 crc kubenswrapper[4743]: I1122 10:07:21.754493 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c49cf484-8011-4ce4-8185-074116b2326f","Type":"ContainerStarted","Data":"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd"} Nov 22 10:07:21 crc kubenswrapper[4743]: I1122 10:07:21.858735 4743 scope.go:117] "RemoveContainer" containerID="628653c160e74df425da8386b159f78e50c01cfd9a66c71526c46757b2f93862" Nov 22 10:07:21 crc kubenswrapper[4743]: I1122 10:07:21.879664 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-qxqsl"] Nov 22 10:07:21 crc kubenswrapper[4743]: I1122 10:07:21.885447 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dc68b6c7-qxqsl"] Nov 22 10:07:22 crc kubenswrapper[4743]: I1122 10:07:22.765892 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c49cf484-8011-4ce4-8185-074116b2326f","Type":"ContainerStarted","Data":"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828"} Nov 22 10:07:23 crc kubenswrapper[4743]: I1122 10:07:23.163124 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06eb355f-ad73-42ff-8549-9a2d8a71b5f8" path="/var/lib/kubelet/pods/06eb355f-ad73-42ff-8549-9a2d8a71b5f8/volumes" Nov 22 10:07:24 crc kubenswrapper[4743]: I1122 10:07:24.626331 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:25 crc kubenswrapper[4743]: I1122 10:07:25.798879 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c49cf484-8011-4ce4-8185-074116b2326f","Type":"ContainerStarted","Data":"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655"} Nov 22 10:07:25 crc kubenswrapper[4743]: I1122 10:07:25.799005 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="ceilometer-central-agent" containerID="cri-o://e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5" gracePeriod=30 Nov 22 10:07:25 crc kubenswrapper[4743]: I1122 10:07:25.799048 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="sg-core" containerID="cri-o://e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828" gracePeriod=30 Nov 22 10:07:25 crc kubenswrapper[4743]: I1122 10:07:25.799062 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="proxy-httpd" containerID="cri-o://76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655" gracePeriod=30 Nov 22 10:07:25 crc kubenswrapper[4743]: I1122 10:07:25.799154 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="ceilometer-notification-agent" containerID="cri-o://59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd" gracePeriod=30 Nov 22 10:07:25 crc kubenswrapper[4743]: I1122 10:07:25.799450 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 10:07:25 crc kubenswrapper[4743]: I1122 10:07:25.827465 4743 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.60634789 podStartE2EDuration="7.827443518s" podCreationTimestamp="2025-11-22 10:07:18 +0000 UTC" firstStartedPulling="2025-11-22 10:07:19.639715677 +0000 UTC m=+6313.346076729" lastFinishedPulling="2025-11-22 10:07:24.860811265 +0000 UTC m=+6318.567172357" observedRunningTime="2025-11-22 10:07:25.822911007 +0000 UTC m=+6319.529272059" watchObservedRunningTime="2025-11-22 10:07:25.827443518 +0000 UTC m=+6319.533804580" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.475269 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.597512 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-config-data\") pod \"c49cf484-8011-4ce4-8185-074116b2326f\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.597669 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-sg-core-conf-yaml\") pod \"c49cf484-8011-4ce4-8185-074116b2326f\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.597706 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8c9z\" (UniqueName: \"kubernetes.io/projected/c49cf484-8011-4ce4-8185-074116b2326f-kube-api-access-r8c9z\") pod \"c49cf484-8011-4ce4-8185-074116b2326f\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.597887 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-run-httpd\") pod \"c49cf484-8011-4ce4-8185-074116b2326f\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.597913 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-scripts\") pod \"c49cf484-8011-4ce4-8185-074116b2326f\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.597939 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-combined-ca-bundle\") pod \"c49cf484-8011-4ce4-8185-074116b2326f\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.597986 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-log-httpd\") pod \"c49cf484-8011-4ce4-8185-074116b2326f\" (UID: \"c49cf484-8011-4ce4-8185-074116b2326f\") " Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.598531 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c49cf484-8011-4ce4-8185-074116b2326f" (UID: "c49cf484-8011-4ce4-8185-074116b2326f"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.599053 4743 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-run-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.599166 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c49cf484-8011-4ce4-8185-074116b2326f" (UID: "c49cf484-8011-4ce4-8185-074116b2326f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.606428 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-scripts" (OuterVolumeSpecName: "scripts") pod "c49cf484-8011-4ce4-8185-074116b2326f" (UID: "c49cf484-8011-4ce4-8185-074116b2326f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.606484 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c49cf484-8011-4ce4-8185-074116b2326f-kube-api-access-r8c9z" (OuterVolumeSpecName: "kube-api-access-r8c9z") pod "c49cf484-8011-4ce4-8185-074116b2326f" (UID: "c49cf484-8011-4ce4-8185-074116b2326f"). InnerVolumeSpecName "kube-api-access-r8c9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.626969 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c49cf484-8011-4ce4-8185-074116b2326f" (UID: "c49cf484-8011-4ce4-8185-074116b2326f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.674452 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c49cf484-8011-4ce4-8185-074116b2326f" (UID: "c49cf484-8011-4ce4-8185-074116b2326f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.693845 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-config-data" (OuterVolumeSpecName: "config-data") pod "c49cf484-8011-4ce4-8185-074116b2326f" (UID: "c49cf484-8011-4ce4-8185-074116b2326f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.701790 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.701823 4743 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.701839 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8c9z\" (UniqueName: \"kubernetes.io/projected/c49cf484-8011-4ce4-8185-074116b2326f-kube-api-access-r8c9z\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.701852 4743 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-scripts\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.701863 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c49cf484-8011-4ce4-8185-074116b2326f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.701874 4743 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c49cf484-8011-4ce4-8185-074116b2326f-log-httpd\") on node \"crc\" DevicePath \"\"" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.813917 4743 generic.go:334] "Generic (PLEG): container finished" podID="c49cf484-8011-4ce4-8185-074116b2326f" containerID="76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655" exitCode=0 Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.813950 4743 generic.go:334] "Generic (PLEG): container finished" podID="c49cf484-8011-4ce4-8185-074116b2326f" containerID="e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828" exitCode=2 Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.813957 4743 generic.go:334] "Generic (PLEG): container finished" podID="c49cf484-8011-4ce4-8185-074116b2326f" containerID="59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd" exitCode=0 Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.813963 4743 generic.go:334] "Generic (PLEG): container finished" podID="c49cf484-8011-4ce4-8185-074116b2326f" containerID="e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5" exitCode=0 Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.813980 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.813992 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c49cf484-8011-4ce4-8185-074116b2326f","Type":"ContainerDied","Data":"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655"} Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.814063 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c49cf484-8011-4ce4-8185-074116b2326f","Type":"ContainerDied","Data":"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828"} Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.814091 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c49cf484-8011-4ce4-8185-074116b2326f","Type":"ContainerDied","Data":"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd"} Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.814117 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c49cf484-8011-4ce4-8185-074116b2326f","Type":"ContainerDied","Data":"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5"} Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.814145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c49cf484-8011-4ce4-8185-074116b2326f","Type":"ContainerDied","Data":"b6f2c6eebe5204e304147a8bfa09c841d8a7310ff1d577aa1c453e8b3c2b6867"} Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.814122 4743 scope.go:117] "RemoveContainer" containerID="76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.852811 4743 scope.go:117] "RemoveContainer" containerID="e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.868972 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.881755 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.901026 4743 scope.go:117] "RemoveContainer" containerID="59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.902004 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:26 crc kubenswrapper[4743]: E1122 10:07:26.907721 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06eb355f-ad73-42ff-8549-9a2d8a71b5f8" containerName="dnsmasq-dns" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.907760 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="06eb355f-ad73-42ff-8549-9a2d8a71b5f8" containerName="dnsmasq-dns" Nov 22 10:07:26 crc kubenswrapper[4743]: E1122 10:07:26.907816 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="ceilometer-notification-agent" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.907827 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="ceilometer-notification-agent" Nov 22 10:07:26 crc kubenswrapper[4743]: E1122 10:07:26.907860 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="sg-core" Nov 22 10:07:26 crc 
kubenswrapper[4743]: I1122 10:07:26.907868 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="sg-core" Nov 22 10:07:26 crc kubenswrapper[4743]: E1122 10:07:26.907890 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="proxy-httpd" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.907897 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="proxy-httpd" Nov 22 10:07:26 crc kubenswrapper[4743]: E1122 10:07:26.907908 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="ceilometer-central-agent" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.907917 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="ceilometer-central-agent" Nov 22 10:07:26 crc kubenswrapper[4743]: E1122 10:07:26.907940 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06eb355f-ad73-42ff-8549-9a2d8a71b5f8" containerName="init" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.907948 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="06eb355f-ad73-42ff-8549-9a2d8a71b5f8" containerName="init" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.908345 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="ceilometer-central-agent" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.908374 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="sg-core" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.908389 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="proxy-httpd" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.908413 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="06eb355f-ad73-42ff-8549-9a2d8a71b5f8" containerName="dnsmasq-dns" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.908428 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c49cf484-8011-4ce4-8185-074116b2326f" containerName="ceilometer-notification-agent"
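
The paired cpu_manager/memory_manager entries above show the kubelet scrubbing its resource-manager checkpoints: CPU and memory assignments recorded for containers of pods that no longer exist (the old ceilometer-0 UID c49cf484... and the dnsmasq UID 06eb355f...) are dropped before the replacement pod is admitted, so the new pod UID starts from clean state. A hedged Go sketch of that general pattern (types and names are illustrative, not kubelet source):

    package main

    import "fmt"

    // removeStaleState drops checkpointed per-container assignments whose pod UID
    // is no longer active, mirroring the "RemoveStaleState: removing container" /
    // "Deleted CPUSet assignment" pairs in the entries above.
    func removeStaleState(assignments map[string]map[string]string, active map[string]bool) {
        for uid, containers := range assignments {
            if active[uid] {
                continue
            }
            for name := range containers {
                fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", uid, name)
            }
            delete(assignments, uid)
        }
    }

    func main() {
        assignments := map[string]map[string]string{
            "06eb355f-ad73-42ff-8549-9a2d8a71b5f8": {"init": "0-3", "dnsmasq-dns": "0-3"},
        }
        removeStaleState(assignments, map[string]bool{}) // no active pods: both entries scrubbed
    }
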
Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.911028 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.912901 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.914089 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Nov 22 10:07:26 crc kubenswrapper[4743]: I1122 10:07:26.914279 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.001172 4743 scope.go:117] "RemoveContainer" containerID="e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.007198 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206dab94-6e44-48a4-8ed8-888e77d0ccd8-log-httpd\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.007274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.007350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-scripts\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.007414 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206dab94-6e44-48a4-8ed8-888e77d0ccd8-run-httpd\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.007438 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-config-data\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.007476 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.007496 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4skg\" (UniqueName: \"kubernetes.io/projected/206dab94-6e44-48a4-8ed8-888e77d0ccd8-kube-api-access-s4skg\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.019090 4743 scope.go:117] "RemoveContainer" containerID="76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655" Nov 22 10:07:27 crc kubenswrapper[4743]: E1122 
10:07:27.020912 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655\": container with ID starting with 76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655 not found: ID does not exist" containerID="76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.020940 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655"} err="failed to get container status \"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655\": rpc error: code = NotFound desc = could not find container \"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655\": container with ID starting with 76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.020960 4743 scope.go:117] "RemoveContainer" containerID="e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828" Nov 22 10:07:27 crc kubenswrapper[4743]: E1122 10:07:27.021161 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828\": container with ID starting with e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828 not found: ID does not exist" containerID="e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.021184 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828"} err="failed to get container status \"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828\": rpc error: code = NotFound desc = could not find container \"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828\": container with ID starting with e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.021197 4743 scope.go:117] "RemoveContainer" containerID="59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd" Nov 22 10:07:27 crc kubenswrapper[4743]: E1122 10:07:27.021365 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd\": container with ID starting with 59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd not found: ID does not exist" containerID="59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.021385 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd"} err="failed to get container status \"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd\": rpc error: code = NotFound desc = could not find container \"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd\": container with ID starting with 59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd not found: ID does not exist"
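
The E-level "ContainerStatus from runtime service failed ... NotFound" entries here are benign: the containers were already deleted, so when cleanup retries RemoveContainer, CRI-O answers with gRPC NotFound and the kubelet records "DeleteContainer returned error". The repeats that follow are further retries over the same four container IDs, not new failures. A small sketch of how such an error can be classified (real google.golang.org/grpc APIs; the helper name is mine):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // alreadyRemoved reports whether a CRI delete/status error only means the
    // container is gone -- the benign case seen in the entries above.
    func alreadyRemoved(err error) bool {
        return status.Code(err) == codes.NotFound
    }

    func main() {
        // Simulate the runtime's response for an unknown container ID.
        err := status.Errorf(codes.NotFound,
            "could not find container %q: ID does not exist", "76ba4a36ac60c0...")
        fmt.Println(alreadyRemoved(err)) // true: treat the removal as already complete
    }
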
Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.021396 4743 scope.go:117] "RemoveContainer" containerID="e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5" Nov 22 10:07:27 crc kubenswrapper[4743]: E1122 10:07:27.021565 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5\": container with ID starting with e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5 not found: ID does not exist" containerID="e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.021603 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5"} err="failed to get container status \"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5\": rpc error: code = NotFound desc = could not find container \"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5\": container with ID starting with e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.021620 4743 scope.go:117] "RemoveContainer" containerID="76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.021824 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655"} err="failed to get container status \"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655\": rpc error: code = NotFound desc = could not find container \"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655\": container with ID starting with 76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.021851 4743 scope.go:117] "RemoveContainer" containerID="e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.022037 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828"} err="failed to get container status \"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828\": rpc error: code = NotFound desc = could not find container \"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828\": container with ID starting with e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.022055 4743 scope.go:117] "RemoveContainer" containerID="59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.022215 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd"} err="failed to get container status \"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd\": rpc error: code = NotFound desc = could not find container \"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd\": container with ID starting with 59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.022233 4743 
scope.go:117] "RemoveContainer" containerID="e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.022384 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5"} err="failed to get container status \"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5\": rpc error: code = NotFound desc = could not find container \"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5\": container with ID starting with e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.022402 4743 scope.go:117] "RemoveContainer" containerID="76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.022636 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655"} err="failed to get container status \"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655\": rpc error: code = NotFound desc = could not find container \"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655\": container with ID starting with 76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.022660 4743 scope.go:117] "RemoveContainer" containerID="e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.022952 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828"} err="failed to get container status \"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828\": rpc error: code = NotFound desc = could not find container \"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828\": container with ID starting with e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.022982 4743 scope.go:117] "RemoveContainer" containerID="59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.023245 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd"} err="failed to get container status \"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd\": rpc error: code = NotFound desc = could not find container \"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd\": container with ID starting with 59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.023293 4743 scope.go:117] "RemoveContainer" containerID="e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.023553 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5"} err="failed to get container status \"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5\": rpc error: code = 
NotFound desc = could not find container \"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5\": container with ID starting with e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.023596 4743 scope.go:117] "RemoveContainer" containerID="76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.023800 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655"} err="failed to get container status \"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655\": rpc error: code = NotFound desc = could not find container \"76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655\": container with ID starting with 76ba4a36ac60c044f3332046c4a49c3bfb2de8d55ee3c07cc7b5f0fd46000655 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.023828 4743 scope.go:117] "RemoveContainer" containerID="e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.024119 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828"} err="failed to get container status \"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828\": rpc error: code = NotFound desc = could not find container \"e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828\": container with ID starting with e0d909879408b49a51f9ffe6ab63bf8c6bb08898a3c5f2293cf30a41ebea6828 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.024147 4743 scope.go:117] "RemoveContainer" containerID="59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.024338 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd"} err="failed to get container status \"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd\": rpc error: code = NotFound desc = could not find container \"59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd\": container with ID starting with 59a7c4c17785e2138d6ecbf648d35c50e7e02ea9e1a921caf453ea6af43032cd not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.024364 4743 scope.go:117] "RemoveContainer" containerID="e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.024671 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5"} err="failed to get container status \"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5\": rpc error: code = NotFound desc = could not find container \"e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5\": container with ID starting with e8ecb30d8df903b3df1f7a6724764ccbe6afc292c66e2b7765bcd19e7ff3cba5 not found: ID does not exist" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.109238 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/206dab94-6e44-48a4-8ed8-888e77d0ccd8-log-httpd\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.109302 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.109369 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-scripts\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.109397 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206dab94-6e44-48a4-8ed8-888e77d0ccd8-run-httpd\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.109415 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-config-data\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.109453 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.109473 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4skg\" (UniqueName: \"kubernetes.io/projected/206dab94-6e44-48a4-8ed8-888e77d0ccd8-kube-api-access-s4skg\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.110376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206dab94-6e44-48a4-8ed8-888e77d0ccd8-run-httpd\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.110711 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/206dab94-6e44-48a4-8ed8-888e77d0ccd8-log-httpd\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.114186 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.114629 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-scripts\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.114947 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.116082 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/206dab94-6e44-48a4-8ed8-888e77d0ccd8-config-data\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.128671 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4skg\" (UniqueName: \"kubernetes.io/projected/206dab94-6e44-48a4-8ed8-888e77d0ccd8-kube-api-access-s4skg\") pod \"ceilometer-0\" (UID: \"206dab94-6e44-48a4-8ed8-888e77d0ccd8\") " pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.165751 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c49cf484-8011-4ce4-8185-074116b2326f" path="/var/lib/kubelet/pods/c49cf484-8011-4ce4-8185-074116b2326f/volumes" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.292356 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.756595 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Nov 22 10:07:27 crc kubenswrapper[4743]: I1122 10:07:27.833489 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206dab94-6e44-48a4-8ed8-888e77d0ccd8","Type":"ContainerStarted","Data":"85672e34acde801b9a8cd56c98553d64db652e677af1240465d5531bab3e02a5"} Nov 22 10:07:28 crc kubenswrapper[4743]: I1122 10:07:28.845490 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206dab94-6e44-48a4-8ed8-888e77d0ccd8","Type":"ContainerStarted","Data":"5f42b378d8578925f6151db46b687d39634bb10f212f2fd17a4d28352056f349"} Nov 22 10:07:29 crc kubenswrapper[4743]: E1122 10:07:29.189935 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-conmon-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache]" Nov 22 10:07:29 crc kubenswrapper[4743]: I1122 10:07:29.858772 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206dab94-6e44-48a4-8ed8-888e77d0ccd8","Type":"ContainerStarted","Data":"cfb0d84d526bab3a01fc77f71c6c70a0a17f9df67a99aa435fdb5cec5edc7e73"} Nov 22 10:07:30 crc kubenswrapper[4743]: I1122 10:07:30.875052 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"206dab94-6e44-48a4-8ed8-888e77d0ccd8","Type":"ContainerStarted","Data":"1e8d6efe6aaa15ad6ea9e9e2516ecec5f2015413c382d615ee329d3792519e31"} Nov 22 10:07:31 crc kubenswrapper[4743]: I1122 10:07:31.626885 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Nov 22 10:07:31 crc kubenswrapper[4743]: I1122 10:07:31.759334 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Nov 22 10:07:31 crc kubenswrapper[4743]: I1122 10:07:31.886814 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"206dab94-6e44-48a4-8ed8-888e77d0ccd8","Type":"ContainerStarted","Data":"e9fd096e3986b8ec7b02044ce8c062cf14c77d80651ea963bd0de764f45904e1"} Nov 22 10:07:31 crc kubenswrapper[4743]: I1122 10:07:31.887003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Nov 22 10:07:31 crc kubenswrapper[4743]: I1122 10:07:31.907433 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8392484700000002 podStartE2EDuration="5.907413402s" podCreationTimestamp="2025-11-22 10:07:26 +0000 UTC" firstStartedPulling="2025-11-22 10:07:27.755061481 +0000 UTC m=+6321.461422563" lastFinishedPulling="2025-11-22 10:07:30.823226433 +0000 UTC m=+6324.529587495" observedRunningTime="2025-11-22 10:07:31.906112945 +0000 UTC m=+6325.612473997" watchObservedRunningTime="2025-11-22 10:07:31.907413402 +0000 UTC m=+6325.613774454" Nov 22 10:07:31 crc kubenswrapper[4743]: I1122 10:07:31.965621 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Nov 22 10:07:35 crc kubenswrapper[4743]: I1122 10:07:35.978345 4743 scope.go:117] "RemoveContainer" containerID="0fe17c12f93974794d3e3628703864a4451a452538ac160835e8ff3c5b1a9673" Nov 22 10:07:39 crc kubenswrapper[4743]: E1122 10:07:39.513556 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-conmon-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache]" Nov 22 10:07:49 crc kubenswrapper[4743]: E1122 10:07:49.796114 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-conmon-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache]" Nov 22 10:07:57 crc kubenswrapper[4743]: I1122 10:07:57.298234 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
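
The recurring cadvisor_stats_provider "Partial failure" errors (10:07:29, 10:07:39, 10:07:49, and again at 10:08:00 below) all name the same two cgroup scopes, crio-b6973c3bee8f... and its crio-conmon wrapper, under besteffort pod a4ae49b4-51e1-4ade-a974-4ff8c96ab104: cadvisor simply has no recent samples cached for a container that has just exited, and the message repeats on each stats pass until the cgroup is cleaned up. Recovering the pod UID from such a path is plain string work, as in this sketch (helper name is mine, illustrative only):

    package main

    import (
        "fmt"
        "strings"
    )

    // podUIDFromCgroup extracts the pod UID from a kubepods cgroup path like the
    // ones quoted in the cadvisor errors above (underscores stand in for dashes).
    func podUIDFromCgroup(path string) (string, bool) {
        for _, seg := range strings.Split(path, "/") {
            if strings.HasPrefix(seg, "kubepods-besteffort-pod") && strings.HasSuffix(seg, ".slice") {
                uid := strings.TrimSuffix(strings.TrimPrefix(seg, "kubepods-besteffort-pod"), ".slice")
                return strings.ReplaceAll(uid, "_", "-"), true
            }
        }
        return "", false
    }

    func main() {
        p := "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope"
        fmt.Println(podUIDFromCgroup(p)) // a4ae49b4-51e1-4ade-a974-4ff8c96ab104 true
    }
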
Nov 22 10:08:00 crc kubenswrapper[4743]: E1122 10:08:00.063109 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-conmon-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4ae49b4_51e1_4ade_a974_4ff8c96ab104.slice/crio-b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1.scope\": RecentStats: unable to find data in memory cache]" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.290224 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-c58bf"] Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.292758 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.294951 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.301471 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-c58bf"] Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.422733 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9w6h\" (UniqueName: \"kubernetes.io/projected/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-kube-api-access-h9w6h\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.422784 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-dns-svc\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.423130 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-config\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.423805 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-nb\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.424019 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-sb\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.424168 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-openstack-cell1\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: 
\"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.525903 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-config\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.525951 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-nb\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.526003 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-sb\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.526060 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-openstack-cell1\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.526123 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9w6h\" (UniqueName: \"kubernetes.io/projected/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-kube-api-access-h9w6h\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.526150 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-dns-svc\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.526947 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-sb\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.527029 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-nb\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.527037 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-config\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 
crc kubenswrapper[4743]: I1122 10:08:17.527502 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-openstack-cell1\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.527522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-dns-svc\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.550410 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9w6h\" (UniqueName: \"kubernetes.io/projected/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-kube-api-access-h9w6h\") pod \"dnsmasq-dns-7b54c866bc-c58bf\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:17 crc kubenswrapper[4743]: I1122 10:08:17.633364 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:18 crc kubenswrapper[4743]: I1122 10:08:18.094228 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-c58bf"] Nov 22 10:08:18 crc kubenswrapper[4743]: W1122 10:08:18.094536 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod476d4d3f_70b8_4808_a1b8_5c5da6a2fac8.slice/crio-06fa2edbf6d85fe5c408642fa367272a40897598f8033fb6bf912d036d493a0a WatchSource:0}: Error finding container 06fa2edbf6d85fe5c408642fa367272a40897598f8033fb6bf912d036d493a0a: Status 404 returned error can't find the container with id 06fa2edbf6d85fe5c408642fa367272a40897598f8033fb6bf912d036d493a0a Nov 22 10:08:18 crc kubenswrapper[4743]: I1122 10:08:18.367686 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" event={"ID":"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8","Type":"ContainerStarted","Data":"4d5bc90d70eea8dc8657d5d209304b1f79f8ada27885038e73e3e2da2730c587"} Nov 22 10:08:18 crc kubenswrapper[4743]: I1122 10:08:18.367953 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" event={"ID":"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8","Type":"ContainerStarted","Data":"06fa2edbf6d85fe5c408642fa367272a40897598f8033fb6bf912d036d493a0a"} Nov 22 10:08:19 crc kubenswrapper[4743]: I1122 10:08:19.377084 4743 generic.go:334] "Generic (PLEG): container finished" podID="476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" containerID="4d5bc90d70eea8dc8657d5d209304b1f79f8ada27885038e73e3e2da2730c587" exitCode=0 Nov 22 10:08:19 crc kubenswrapper[4743]: I1122 10:08:19.377145 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" event={"ID":"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8","Type":"ContainerDied","Data":"4d5bc90d70eea8dc8657d5d209304b1f79f8ada27885038e73e3e2da2730c587"} Nov 22 10:08:19 crc kubenswrapper[4743]: I1122 10:08:19.377447 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" event={"ID":"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8","Type":"ContainerStarted","Data":"2e4d1f0ff8bd1b5412f37a140b8e0da0cdc231282c39ecf987875c1f92cc1a53"}
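
This is the standard init-container handoff for dnsmasq-dns-7b54c866bc-c58bf: the sandbox 06fa2edb... comes up, the init container 4d5bc90d... runs and exits 0 (ContainerDied), and only then does the main container 2e4d1f0f... start; the pod then reports readiness status "" until the probe passes (10:08:27 in the entries that follow), after which the API deletes the superseded 55586cc989 pod. The same ADD/UPDATE/DELETE and readiness transitions can be observed from the API side with client-go, roughly as below (the label selector is an assumption about how these pods are labeled):

    package main

    import (
        "context"
        "fmt"

        corev1 "k8s.io/api/core/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // ADDED/MODIFIED/DELETED events mirror the kubelet's
        // "SyncLoop ADD/UPDATE/DELETE source=api" lines in this log.
        w, err := cs.CoreV1().Pods("openstack").Watch(context.Background(),
            metav1.ListOptions{LabelSelector: "service=dnsmasq-dns"}) // assumed label
        if err != nil {
            panic(err)
        }
        defer w.Stop()
        for ev := range w.ResultChan() {
            pod, ok := ev.Object.(*corev1.Pod)
            if !ok {
                continue
            }
            ready := false
            for _, c := range pod.Status.Conditions {
                if c.Type == corev1.PodReady && c.Status == corev1.ConditionTrue {
                    ready = true
                }
            }
            fmt.Printf("%s %s ready=%v\n", ev.Type, pod.Name, ready)
        }
    }
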
Nov 22 10:08:19 crc kubenswrapper[4743]: I1122 10:08:19.377590 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:19 crc kubenswrapper[4743]: I1122 10:08:19.407206 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" podStartSLOduration=2.40715689 podStartE2EDuration="2.40715689s" podCreationTimestamp="2025-11-22 10:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:08:19.402530337 +0000 UTC m=+6373.108891389" watchObservedRunningTime="2025-11-22 10:08:19.40715689 +0000 UTC m=+6373.113517942" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.634760 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.693961 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55586cc989-mzvsh"] Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.694560 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" podUID="94e9031c-368e-4df8-98ed-2dff38276a65" containerName="dnsmasq-dns" containerID="cri-o://4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5" gracePeriod=10 Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.847535 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d6cd869d9-wrqfx"] Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.849447 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.870003 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6cd869d9-wrqfx"] Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.878428 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-dns-svc\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.878492 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khfmt\" (UniqueName: \"kubernetes.io/projected/62f45629-8f43-4b4c-a775-b49b0ed27106-kube-api-access-khfmt\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.878536 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-openstack-cell1\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.878644 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-config\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 
10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.878704 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-ovsdbserver-nb\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.878769 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-ovsdbserver-sb\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.980166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-openstack-cell1\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.980289 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-config\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.980646 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-ovsdbserver-nb\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.981047 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-openstack-cell1\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.981258 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-config\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.981701 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-ovsdbserver-sb\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.981862 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-ovsdbserver-nb\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.981366 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-ovsdbserver-sb\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.982052 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-dns-svc\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.982078 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khfmt\" (UniqueName: \"kubernetes.io/projected/62f45629-8f43-4b4c-a775-b49b0ed27106-kube-api-access-khfmt\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:27 crc kubenswrapper[4743]: I1122 10:08:27.982789 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62f45629-8f43-4b4c-a775-b49b0ed27106-dns-svc\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.016247 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khfmt\" (UniqueName: \"kubernetes.io/projected/62f45629-8f43-4b4c-a775-b49b0ed27106-kube-api-access-khfmt\") pod \"dnsmasq-dns-d6cd869d9-wrqfx\" (UID: \"62f45629-8f43-4b4c-a775-b49b0ed27106\") " pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.191673 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.308245 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.389971 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-nb\") pod \"94e9031c-368e-4df8-98ed-2dff38276a65\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.390032 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-dns-svc\") pod \"94e9031c-368e-4df8-98ed-2dff38276a65\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.390109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-config\") pod \"94e9031c-368e-4df8-98ed-2dff38276a65\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.390126 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4zhh\" (UniqueName: \"kubernetes.io/projected/94e9031c-368e-4df8-98ed-2dff38276a65-kube-api-access-b4zhh\") pod \"94e9031c-368e-4df8-98ed-2dff38276a65\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.390155 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-sb\") pod \"94e9031c-368e-4df8-98ed-2dff38276a65\" (UID: \"94e9031c-368e-4df8-98ed-2dff38276a65\") " Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.396116 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e9031c-368e-4df8-98ed-2dff38276a65-kube-api-access-b4zhh" (OuterVolumeSpecName: "kube-api-access-b4zhh") pod "94e9031c-368e-4df8-98ed-2dff38276a65" (UID: "94e9031c-368e-4df8-98ed-2dff38276a65"). InnerVolumeSpecName "kube-api-access-b4zhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.464725 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "94e9031c-368e-4df8-98ed-2dff38276a65" (UID: "94e9031c-368e-4df8-98ed-2dff38276a65"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.465884 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "94e9031c-368e-4df8-98ed-2dff38276a65" (UID: "94e9031c-368e-4df8-98ed-2dff38276a65"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.468022 4743 generic.go:334] "Generic (PLEG): container finished" podID="94e9031c-368e-4df8-98ed-2dff38276a65" containerID="4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5" exitCode=0 Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.468057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" event={"ID":"94e9031c-368e-4df8-98ed-2dff38276a65","Type":"ContainerDied","Data":"4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5"} Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.468082 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" event={"ID":"94e9031c-368e-4df8-98ed-2dff38276a65","Type":"ContainerDied","Data":"ef821cee2abf29542098bdd235023be5c75dd051d0fcb34a8aba5b5d6480af13"} Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.468098 4743 scope.go:117] "RemoveContainer" containerID="4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.468114 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55586cc989-mzvsh" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.476551 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "94e9031c-368e-4df8-98ed-2dff38276a65" (UID: "94e9031c-368e-4df8-98ed-2dff38276a65"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.477157 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-config" (OuterVolumeSpecName: "config") pod "94e9031c-368e-4df8-98ed-2dff38276a65" (UID: "94e9031c-368e-4df8-98ed-2dff38276a65"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.489647 4743 scope.go:117] "RemoveContainer" containerID="8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.492772 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.492798 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.492808 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-config\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.492817 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4zhh\" (UniqueName: \"kubernetes.io/projected/94e9031c-368e-4df8-98ed-2dff38276a65-kube-api-access-b4zhh\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.492826 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94e9031c-368e-4df8-98ed-2dff38276a65-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.513903 4743 scope.go:117] "RemoveContainer" containerID="4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5" Nov 22 10:08:28 crc kubenswrapper[4743]: E1122 10:08:28.514325 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5\": container with ID starting with 4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5 not found: ID does not exist" containerID="4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.514369 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5"} err="failed to get container status \"4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5\": rpc error: code = NotFound desc = could not find container \"4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5\": container with ID starting with 4c236ca80f97fbb080021edcb66c1474683b1ef485853715847f2472d2e928c5 not found: ID does not exist" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.514393 4743 scope.go:117] "RemoveContainer" containerID="8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d" Nov 22 10:08:28 crc kubenswrapper[4743]: E1122 10:08:28.514767 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d\": container with ID starting with 8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d not found: ID does not exist" containerID="8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.514801 4743 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d"} err="failed to get container status \"8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d\": rpc error: code = NotFound desc = could not find container \"8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d\": container with ID starting with 8ea42655542fc05f67168c20aecba85ea0e021f0500ecdec50bac18b0b6c3b3d not found: ID does not exist" Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.689239 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6cd869d9-wrqfx"] Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.823631 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55586cc989-mzvsh"] Nov 22 10:08:28 crc kubenswrapper[4743]: I1122 10:08:28.832390 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55586cc989-mzvsh"] Nov 22 10:08:29 crc kubenswrapper[4743]: I1122 10:08:29.163645 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94e9031c-368e-4df8-98ed-2dff38276a65" path="/var/lib/kubelet/pods/94e9031c-368e-4df8-98ed-2dff38276a65/volumes" Nov 22 10:08:29 crc kubenswrapper[4743]: I1122 10:08:29.479243 4743 generic.go:334] "Generic (PLEG): container finished" podID="62f45629-8f43-4b4c-a775-b49b0ed27106" containerID="effe5d0a06a73a25ed854d799339f6feb273b079bb9906b3169b24cab8a126d7" exitCode=0 Nov 22 10:08:29 crc kubenswrapper[4743]: I1122 10:08:29.479352 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" event={"ID":"62f45629-8f43-4b4c-a775-b49b0ed27106","Type":"ContainerDied","Data":"effe5d0a06a73a25ed854d799339f6feb273b079bb9906b3169b24cab8a126d7"} Nov 22 10:08:29 crc kubenswrapper[4743]: I1122 10:08:29.479911 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" event={"ID":"62f45629-8f43-4b4c-a775-b49b0ed27106","Type":"ContainerStarted","Data":"303ded3f45ddabfd7a577a14d53fe1a7ff30670a4bc57bde275e84e276f65956"} Nov 22 10:08:30 crc kubenswrapper[4743]: I1122 10:08:30.493333 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" event={"ID":"62f45629-8f43-4b4c-a775-b49b0ed27106","Type":"ContainerStarted","Data":"17f21c8c4bec7ad1d7ecd84df2137e9032beba2202dcfb22ec7fb0a11e7fa8f1"} Nov 22 10:08:30 crc kubenswrapper[4743]: I1122 10:08:30.493583 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:30 crc kubenswrapper[4743]: I1122 10:08:30.518989 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" podStartSLOduration=3.518968526 podStartE2EDuration="3.518968526s" podCreationTimestamp="2025-11-22 10:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:08:30.512077508 +0000 UTC m=+6384.218438570" watchObservedRunningTime="2025-11-22 10:08:30.518968526 +0000 UTC m=+6384.225329578" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.133129 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2"] Nov 22 10:08:34 crc kubenswrapper[4743]: E1122 10:08:34.135133 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e9031c-368e-4df8-98ed-2dff38276a65" 
containerName="dnsmasq-dns" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.135233 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e9031c-368e-4df8-98ed-2dff38276a65" containerName="dnsmasq-dns" Nov 22 10:08:34 crc kubenswrapper[4743]: E1122 10:08:34.135331 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e9031c-368e-4df8-98ed-2dff38276a65" containerName="init" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.135405 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e9031c-368e-4df8-98ed-2dff38276a65" containerName="init" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.135707 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e9031c-368e-4df8-98ed-2dff38276a65" containerName="dnsmasq-dns" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.136534 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.140292 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.140750 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.140923 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.141076 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.154563 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2"] Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.245466 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.245564 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrrxg\" (UniqueName: \"kubernetes.io/projected/54840e46-1eea-45a3-8028-b05dc2bb08e0-kube-api-access-hrrxg\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.245787 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.245821 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.245901 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.347396 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.347449 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.347529 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.347677 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.347730 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrrxg\" (UniqueName: \"kubernetes.io/projected/54840e46-1eea-45a3-8028-b05dc2bb08e0-kube-api-access-hrrxg\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.352985 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 
Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.353085 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2"
Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.353446 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2"
Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.359181 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2"
Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.367894 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrrxg\" (UniqueName: \"kubernetes.io/projected/54840e46-1eea-45a3-8028-b05dc2bb08e0-kube-api-access-hrrxg\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2"
Nov 22 10:08:34 crc kubenswrapper[4743]: I1122 10:08:34.463039 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2"
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:08:35 crc kubenswrapper[4743]: I1122 10:08:35.091456 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2"] Nov 22 10:08:35 crc kubenswrapper[4743]: I1122 10:08:35.546796 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" event={"ID":"54840e46-1eea-45a3-8028-b05dc2bb08e0","Type":"ContainerStarted","Data":"32eb8698db2912ecad38f5caea6c8b52705ce1d4b5d11da3ec1e56bd93116c8e"} Nov 22 10:08:38 crc kubenswrapper[4743]: I1122 10:08:38.193892 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d6cd869d9-wrqfx" Nov 22 10:08:38 crc kubenswrapper[4743]: I1122 10:08:38.276669 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-c58bf"] Nov 22 10:08:38 crc kubenswrapper[4743]: I1122 10:08:38.277867 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" podUID="476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" containerName="dnsmasq-dns" containerID="cri-o://2e4d1f0ff8bd1b5412f37a140b8e0da0cdc231282c39ecf987875c1f92cc1a53" gracePeriod=10 Nov 22 10:08:38 crc kubenswrapper[4743]: I1122 10:08:38.597040 4743 generic.go:334] "Generic (PLEG): container finished" podID="476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" containerID="2e4d1f0ff8bd1b5412f37a140b8e0da0cdc231282c39ecf987875c1f92cc1a53" exitCode=0 Nov 22 10:08:38 crc kubenswrapper[4743]: I1122 10:08:38.597351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" event={"ID":"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8","Type":"ContainerDied","Data":"2e4d1f0ff8bd1b5412f37a140b8e0da0cdc231282c39ecf987875c1f92cc1a53"} Nov 22 10:08:42 crc kubenswrapper[4743]: I1122 10:08:42.634711 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" podUID="476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.149:5353: connect: connection refused" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.141820 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.209015 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-sb\") pod \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.209068 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-dns-svc\") pod \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.209204 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-config\") pod \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.209302 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-openstack-cell1\") pod \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.209384 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9w6h\" (UniqueName: \"kubernetes.io/projected/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-kube-api-access-h9w6h\") pod \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.209412 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-nb\") pod \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\" (UID: \"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8\") " Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.240966 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-kube-api-access-h9w6h" (OuterVolumeSpecName: "kube-api-access-h9w6h") pod "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" (UID: "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8"). InnerVolumeSpecName "kube-api-access-h9w6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.280694 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" (UID: "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.284183 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-config" (OuterVolumeSpecName: "config") pod "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" (UID: "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.296714 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" (UID: "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.304638 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" (UID: "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.312098 4743 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-dns-svc\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.312129 4743 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-config\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.312139 4743 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-openstack-cell1\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.312150 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9w6h\" (UniqueName: \"kubernetes.io/projected/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-kube-api-access-h9w6h\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.312159 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.312529 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" (UID: "476d4d3f-70b8-4808-a1b8-5c5da6a2fac8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.414366 4743 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.684814 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" event={"ID":"54840e46-1eea-45a3-8028-b05dc2bb08e0","Type":"ContainerStarted","Data":"e70eb05f8c73a8e9cd587ed37673802574f469a0c12209d1830890d099a37cc4"} Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.687116 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" event={"ID":"476d4d3f-70b8-4808-a1b8-5c5da6a2fac8","Type":"ContainerDied","Data":"06fa2edbf6d85fe5c408642fa367272a40897598f8033fb6bf912d036d493a0a"} Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.687154 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b54c866bc-c58bf" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.687173 4743 scope.go:117] "RemoveContainer" containerID="2e4d1f0ff8bd1b5412f37a140b8e0da0cdc231282c39ecf987875c1f92cc1a53" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.711596 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" podStartSLOduration=1.987106773 podStartE2EDuration="11.711562759s" podCreationTimestamp="2025-11-22 10:08:34 +0000 UTC" firstStartedPulling="2025-11-22 10:08:35.099087889 +0000 UTC m=+6388.805448941" lastFinishedPulling="2025-11-22 10:08:44.823543875 +0000 UTC m=+6398.529904927" observedRunningTime="2025-11-22 10:08:45.707122001 +0000 UTC m=+6399.413483053" watchObservedRunningTime="2025-11-22 10:08:45.711562759 +0000 UTC m=+6399.417923811" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.717917 4743 scope.go:117] "RemoveContainer" containerID="4d5bc90d70eea8dc8657d5d209304b1f79f8ada27885038e73e3e2da2730c587" Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.740667 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-c58bf"] Nov 22 10:08:45 crc kubenswrapper[4743]: I1122 10:08:45.751087 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b54c866bc-c58bf"] Nov 22 10:08:47 crc kubenswrapper[4743]: I1122 10:08:47.165138 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" path="/var/lib/kubelet/pods/476d4d3f-70b8-4808-a1b8-5c5da6a2fac8/volumes" Nov 22 10:08:58 crc kubenswrapper[4743]: I1122 10:08:58.814173 4743 generic.go:334] "Generic (PLEG): container finished" podID="54840e46-1eea-45a3-8028-b05dc2bb08e0" containerID="e70eb05f8c73a8e9cd587ed37673802574f469a0c12209d1830890d099a37cc4" exitCode=0 Nov 22 10:08:58 crc kubenswrapper[4743]: I1122 10:08:58.814259 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" event={"ID":"54840e46-1eea-45a3-8028-b05dc2bb08e0","Type":"ContainerDied","Data":"e70eb05f8c73a8e9cd587ed37673802574f469a0c12209d1830890d099a37cc4"} Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.455682 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.548521 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-pre-adoption-validation-combined-ca-bundle\") pod \"54840e46-1eea-45a3-8028-b05dc2bb08e0\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.548597 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrrxg\" (UniqueName: \"kubernetes.io/projected/54840e46-1eea-45a3-8028-b05dc2bb08e0-kube-api-access-hrrxg\") pod \"54840e46-1eea-45a3-8028-b05dc2bb08e0\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.548635 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ceph\") pod \"54840e46-1eea-45a3-8028-b05dc2bb08e0\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.548660 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ssh-key\") pod \"54840e46-1eea-45a3-8028-b05dc2bb08e0\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.548695 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-inventory\") pod \"54840e46-1eea-45a3-8028-b05dc2bb08e0\" (UID: \"54840e46-1eea-45a3-8028-b05dc2bb08e0\") " Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.556774 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ceph" (OuterVolumeSpecName: "ceph") pod "54840e46-1eea-45a3-8028-b05dc2bb08e0" (UID: "54840e46-1eea-45a3-8028-b05dc2bb08e0"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.557546 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "54840e46-1eea-45a3-8028-b05dc2bb08e0" (UID: "54840e46-1eea-45a3-8028-b05dc2bb08e0"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.558821 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54840e46-1eea-45a3-8028-b05dc2bb08e0-kube-api-access-hrrxg" (OuterVolumeSpecName: "kube-api-access-hrrxg") pod "54840e46-1eea-45a3-8028-b05dc2bb08e0" (UID: "54840e46-1eea-45a3-8028-b05dc2bb08e0"). InnerVolumeSpecName "kube-api-access-hrrxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.602185 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-inventory" (OuterVolumeSpecName: "inventory") pod "54840e46-1eea-45a3-8028-b05dc2bb08e0" (UID: "54840e46-1eea-45a3-8028-b05dc2bb08e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.603593 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "54840e46-1eea-45a3-8028-b05dc2bb08e0" (UID: "54840e46-1eea-45a3-8028-b05dc2bb08e0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.650423 4743 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.650467 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrrxg\" (UniqueName: \"kubernetes.io/projected/54840e46-1eea-45a3-8028-b05dc2bb08e0-kube-api-access-hrrxg\") on node \"crc\" DevicePath \"\"" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.650477 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.650486 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.650495 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54840e46-1eea-45a3-8028-b05dc2bb08e0-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.841361 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" event={"ID":"54840e46-1eea-45a3-8028-b05dc2bb08e0","Type":"ContainerDied","Data":"32eb8698db2912ecad38f5caea6c8b52705ce1d4b5d11da3ec1e56bd93116c8e"} Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.841939 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32eb8698db2912ecad38f5caea6c8b52705ce1d4b5d11da3ec1e56bd93116c8e" Nov 22 10:09:00 crc kubenswrapper[4743]: I1122 10:09:00.841459 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2" Nov 22 10:09:01 crc kubenswrapper[4743]: I1122 10:09:01.241578 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:09:01 crc kubenswrapper[4743]: I1122 10:09:01.241684 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.758706 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q"] Nov 22 10:09:11 crc kubenswrapper[4743]: E1122 10:09:11.759795 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54840e46-1eea-45a3-8028-b05dc2bb08e0" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.759811 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="54840e46-1eea-45a3-8028-b05dc2bb08e0" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 22 10:09:11 crc kubenswrapper[4743]: E1122 10:09:11.759834 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" containerName="dnsmasq-dns" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.759841 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" containerName="dnsmasq-dns" Nov 22 10:09:11 crc kubenswrapper[4743]: E1122 10:09:11.759866 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" containerName="init" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.759872 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" containerName="init" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.760111 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="476d4d3f-70b8-4808-a1b8-5c5da6a2fac8" containerName="dnsmasq-dns" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.760138 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="54840e46-1eea-45a3-8028-b05dc2bb08e0" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.761045 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.769018 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q"] Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.807027 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.807141 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.807036 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.807378 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.807592 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvrn2\" (UniqueName: \"kubernetes.io/projected/60fc17e1-9296-450c-979c-bd863fb3dce6-kube-api-access-xvrn2\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.807663 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.807840 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.807936 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.807965 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.910340 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.910414 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.910464 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvrn2\" (UniqueName: \"kubernetes.io/projected/60fc17e1-9296-450c-979c-bd863fb3dce6-kube-api-access-xvrn2\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.910515 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.910543 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.916876 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.918129 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.918501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.918919 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ssh-key\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:11 crc kubenswrapper[4743]: I1122 10:09:11.932568 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvrn2\" (UniqueName: \"kubernetes.io/projected/60fc17e1-9296-450c-979c-bd863fb3dce6-kube-api-access-xvrn2\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:12 crc kubenswrapper[4743]: I1122 10:09:12.120713 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:09:12 crc kubenswrapper[4743]: I1122 10:09:12.689786 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 10:09:12 crc kubenswrapper[4743]: I1122 10:09:12.691945 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q"] Nov 22 10:09:12 crc kubenswrapper[4743]: I1122 10:09:12.963003 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" event={"ID":"60fc17e1-9296-450c-979c-bd863fb3dce6","Type":"ContainerStarted","Data":"0660d7154eca5cdaca8f3d286f77f89bb0885b66d8798153e70a05582796d70b"} Nov 22 10:09:13 crc kubenswrapper[4743]: I1122 10:09:13.974595 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" event={"ID":"60fc17e1-9296-450c-979c-bd863fb3dce6","Type":"ContainerStarted","Data":"8d0f44e4fc0e65412461d4778c2dd4a0bafbe24bf4003bae23bbc0861334b601"} Nov 22 10:09:13 crc kubenswrapper[4743]: I1122 10:09:13.993396 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" podStartSLOduration=2.4569150410000002 podStartE2EDuration="2.99337359s" podCreationTimestamp="2025-11-22 10:09:11 +0000 UTC" firstStartedPulling="2025-11-22 10:09:12.689473511 +0000 UTC m=+6426.395834563" lastFinishedPulling="2025-11-22 10:09:13.22593204 +0000 UTC m=+6426.932293112" observedRunningTime="2025-11-22 10:09:13.99095952 +0000 UTC m=+6427.697320572" watchObservedRunningTime="2025-11-22 10:09:13.99337359 +0000 UTC m=+6427.699734642" Nov 22 10:09:26 crc kubenswrapper[4743]: I1122 10:09:26.043941 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-nm4ks"] Nov 22 10:09:26 crc kubenswrapper[4743]: I1122 10:09:26.056654 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-nm4ks"] Nov 22 10:09:27 crc kubenswrapper[4743]: I1122 10:09:27.033249 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-7645-account-create-d5rhk"] Nov 22 10:09:27 crc kubenswrapper[4743]: I1122 10:09:27.043022 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-7645-account-create-d5rhk"] Nov 22 10:09:27 crc kubenswrapper[4743]: I1122 10:09:27.174220 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e2d3a4e-8b93-40b6-80fd-79dc6c707264" path="/var/lib/kubelet/pods/0e2d3a4e-8b93-40b6-80fd-79dc6c707264/volumes" Nov 22 10:09:27 crc kubenswrapper[4743]: I1122 10:09:27.175142 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8fd0777e-d097-4e1a-b0ca-953d57f46f0f" path="/var/lib/kubelet/pods/8fd0777e-d097-4e1a-b0ca-953d57f46f0f/volumes" Nov 22 10:09:31 crc kubenswrapper[4743]: I1122 10:09:31.241654 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:09:31 crc kubenswrapper[4743]: I1122 10:09:31.242157 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:09:33 crc kubenswrapper[4743]: I1122 10:09:33.042973 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-dc4b9"] Nov 22 10:09:33 crc kubenswrapper[4743]: I1122 10:09:33.051038 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-dc4b9"] Nov 22 10:09:33 crc kubenswrapper[4743]: I1122 10:09:33.172546 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ef38b4-5852-4537-94c3-f2cc93dbb21f" path="/var/lib/kubelet/pods/e9ef38b4-5852-4537-94c3-f2cc93dbb21f/volumes" Nov 22 10:09:34 crc kubenswrapper[4743]: I1122 10:09:34.036858 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-70ee-account-create-g8k5f"] Nov 22 10:09:34 crc kubenswrapper[4743]: I1122 10:09:34.047623 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-70ee-account-create-g8k5f"] Nov 22 10:09:35 crc kubenswrapper[4743]: I1122 10:09:35.166745 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77960df9-960d-47e5-851f-4f6a5df2384c" path="/var/lib/kubelet/pods/77960df9-960d-47e5-851f-4f6a5df2384c/volumes" Nov 22 10:09:36 crc kubenswrapper[4743]: I1122 10:09:36.155237 4743 scope.go:117] "RemoveContainer" containerID="631616f9386105f81aba653a922adaef78b96dfb7ce2edbda255565489ba1c32" Nov 22 10:09:36 crc kubenswrapper[4743]: I1122 10:09:36.181954 4743 scope.go:117] "RemoveContainer" containerID="a395112858fa8c07826ad2df2a9b95961ec3008005cc1ec208b942e9be38365e" Nov 22 10:09:36 crc kubenswrapper[4743]: I1122 10:09:36.230600 4743 scope.go:117] "RemoveContainer" containerID="0db14e04cac86c24013d603638b4f73e7a9c7cc262e9e6d3384046a2491e5a77" Nov 22 10:09:36 crc kubenswrapper[4743]: I1122 10:09:36.297062 4743 scope.go:117] "RemoveContainer" containerID="d8d0062a96c27d5409fcbf7ad8b8abe495b2c91bc13e7275d05ec09424884a52" Nov 22 10:10:01 crc kubenswrapper[4743]: I1122 10:10:01.241291 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:10:01 crc kubenswrapper[4743]: I1122 10:10:01.242940 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:10:01 crc kubenswrapper[4743]: 
I1122 10:10:01.243052 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 10:10:01 crc kubenswrapper[4743]: I1122 10:10:01.243924 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:10:01 crc kubenswrapper[4743]: I1122 10:10:01.244056 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" gracePeriod=600 Nov 22 10:10:01 crc kubenswrapper[4743]: E1122 10:10:01.369062 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:10:01 crc kubenswrapper[4743]: I1122 10:10:01.446561 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" exitCode=0 Nov 22 10:10:01 crc kubenswrapper[4743]: I1122 10:10:01.446613 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8"} Nov 22 10:10:01 crc kubenswrapper[4743]: I1122 10:10:01.446954 4743 scope.go:117] "RemoveContainer" containerID="6b404b343b65af21aff6649872514335dd4038e7612309804bf70aeba3bcb920" Nov 22 10:10:01 crc kubenswrapper[4743]: I1122 10:10:01.449161 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:10:01 crc kubenswrapper[4743]: E1122 10:10:01.449882 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:10:03 crc kubenswrapper[4743]: I1122 10:10:03.042018 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-chxkm"] Nov 22 10:10:03 crc kubenswrapper[4743]: I1122 10:10:03.056039 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-chxkm"] Nov 22 10:10:03 crc kubenswrapper[4743]: I1122 10:10:03.164658 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a318d4-4c4c-4125-81e5-346f22bf3073" path="/var/lib/kubelet/pods/b8a318d4-4c4c-4125-81e5-346f22bf3073/volumes" Nov 22 10:10:13 crc kubenswrapper[4743]: I1122 10:10:13.152190 
4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:10:13 crc kubenswrapper[4743]: E1122 10:10:13.152980 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:10:13 crc kubenswrapper[4743]: E1122 10:10:13.968311 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Nov 22 10:10:26 crc kubenswrapper[4743]: I1122 10:10:26.152347 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:10:26 crc kubenswrapper[4743]: E1122 10:10:26.153376 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:10:36 crc kubenswrapper[4743]: I1122 10:10:36.414849 4743 scope.go:117] "RemoveContainer" containerID="59e93eea519761ca46145223b879769b5b62892442c960ff2624fb0357997e6e" Nov 22 10:10:36 crc kubenswrapper[4743]: I1122 10:10:36.444828 4743 scope.go:117] "RemoveContainer" containerID="da6e43ebf416def8a950513c7fe5d6fdb93879cdda990d1e87185bab101f98bf" Nov 22 10:10:39 crc kubenswrapper[4743]: I1122 10:10:39.152699 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:10:39 crc kubenswrapper[4743]: E1122 10:10:39.153721 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:10:53 crc kubenswrapper[4743]: I1122 10:10:53.152728 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:10:53 crc kubenswrapper[4743]: E1122 10:10:53.153567 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:11:08 crc kubenswrapper[4743]: I1122 10:11:08.152653 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:11:08 crc kubenswrapper[4743]: E1122 10:11:08.153962 4743 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.580458 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4nsgr"] Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.583822 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.596964 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4nsgr"] Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.648116 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhv7c\" (UniqueName: \"kubernetes.io/projected/f071ae58-28db-49a2-bf4f-0c43472f9dd7-kube-api-access-mhv7c\") pod \"community-operators-4nsgr\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.648182 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-utilities\") pod \"community-operators-4nsgr\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.648295 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-catalog-content\") pod \"community-operators-4nsgr\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.750536 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-catalog-content\") pod \"community-operators-4nsgr\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.750679 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhv7c\" (UniqueName: \"kubernetes.io/projected/f071ae58-28db-49a2-bf4f-0c43472f9dd7-kube-api-access-mhv7c\") pod \"community-operators-4nsgr\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.750745 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-utilities\") pod \"community-operators-4nsgr\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.751151 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-catalog-content\") pod \"community-operators-4nsgr\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.751208 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-utilities\") pod \"community-operators-4nsgr\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.769661 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhv7c\" (UniqueName: \"kubernetes.io/projected/f071ae58-28db-49a2-bf4f-0c43472f9dd7-kube-api-access-mhv7c\") pod \"community-operators-4nsgr\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:17 crc kubenswrapper[4743]: I1122 10:11:17.924734 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:18 crc kubenswrapper[4743]: I1122 10:11:18.379003 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4nsgr"] Nov 22 10:11:19 crc kubenswrapper[4743]: I1122 10:11:19.152533 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:11:19 crc kubenswrapper[4743]: E1122 10:11:19.153414 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:11:19 crc kubenswrapper[4743]: I1122 10:11:19.207006 4743 generic.go:334] "Generic (PLEG): container finished" podID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerID="de22739006fcb9b5858fad74e12dda1e5f6d2131b591d9fb26416afb6c484fa5" exitCode=0 Nov 22 10:11:19 crc kubenswrapper[4743]: I1122 10:11:19.207056 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nsgr" event={"ID":"f071ae58-28db-49a2-bf4f-0c43472f9dd7","Type":"ContainerDied","Data":"de22739006fcb9b5858fad74e12dda1e5f6d2131b591d9fb26416afb6c484fa5"} Nov 22 10:11:19 crc kubenswrapper[4743]: I1122 10:11:19.207086 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nsgr" event={"ID":"f071ae58-28db-49a2-bf4f-0c43472f9dd7","Type":"ContainerStarted","Data":"f0d30731df7f11fc7e44328158138b20f9cc575bb02e913786574b3c4e5f3fb7"} Nov 22 10:11:20 crc kubenswrapper[4743]: I1122 10:11:20.217022 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nsgr" event={"ID":"f071ae58-28db-49a2-bf4f-0c43472f9dd7","Type":"ContainerStarted","Data":"98a7448b95db23b32355e5642a6fe378afe6d36b42bcb838677a17f8a004b12f"} Nov 22 10:11:22 crc kubenswrapper[4743]: I1122 10:11:22.235332 4743 generic.go:334] "Generic (PLEG): container finished" podID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerID="98a7448b95db23b32355e5642a6fe378afe6d36b42bcb838677a17f8a004b12f" exitCode=0 Nov 22 10:11:22 crc 
kubenswrapper[4743]: I1122 10:11:22.235386 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nsgr" event={"ID":"f071ae58-28db-49a2-bf4f-0c43472f9dd7","Type":"ContainerDied","Data":"98a7448b95db23b32355e5642a6fe378afe6d36b42bcb838677a17f8a004b12f"} Nov 22 10:11:23 crc kubenswrapper[4743]: I1122 10:11:23.251299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nsgr" event={"ID":"f071ae58-28db-49a2-bf4f-0c43472f9dd7","Type":"ContainerStarted","Data":"e00f19022b797cc33513a95da1a3b17eef2d299381463e27c9a0912ae9047505"} Nov 22 10:11:23 crc kubenswrapper[4743]: I1122 10:11:23.271078 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4nsgr" podStartSLOduration=2.752655796 podStartE2EDuration="6.271058112s" podCreationTimestamp="2025-11-22 10:11:17 +0000 UTC" firstStartedPulling="2025-11-22 10:11:19.209483635 +0000 UTC m=+6552.915844687" lastFinishedPulling="2025-11-22 10:11:22.727885951 +0000 UTC m=+6556.434247003" observedRunningTime="2025-11-22 10:11:23.2688955 +0000 UTC m=+6556.975256552" watchObservedRunningTime="2025-11-22 10:11:23.271058112 +0000 UTC m=+6556.977419164" Nov 22 10:11:27 crc kubenswrapper[4743]: I1122 10:11:27.925027 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:27 crc kubenswrapper[4743]: I1122 10:11:27.925750 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:27 crc kubenswrapper[4743]: I1122 10:11:27.973872 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:28 crc kubenswrapper[4743]: I1122 10:11:28.342804 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:28 crc kubenswrapper[4743]: I1122 10:11:28.384371 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4nsgr"] Nov 22 10:11:30 crc kubenswrapper[4743]: I1122 10:11:30.310944 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4nsgr" podUID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerName="registry-server" containerID="cri-o://e00f19022b797cc33513a95da1a3b17eef2d299381463e27c9a0912ae9047505" gracePeriod=2 Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.327798 4743 generic.go:334] "Generic (PLEG): container finished" podID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerID="e00f19022b797cc33513a95da1a3b17eef2d299381463e27c9a0912ae9047505" exitCode=0 Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.328288 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nsgr" event={"ID":"f071ae58-28db-49a2-bf4f-0c43472f9dd7","Type":"ContainerDied","Data":"e00f19022b797cc33513a95da1a3b17eef2d299381463e27c9a0912ae9047505"} Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.495954 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.589718 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-utilities\") pod \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.590040 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhv7c\" (UniqueName: \"kubernetes.io/projected/f071ae58-28db-49a2-bf4f-0c43472f9dd7-kube-api-access-mhv7c\") pod \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.590166 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-catalog-content\") pod \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\" (UID: \"f071ae58-28db-49a2-bf4f-0c43472f9dd7\") " Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.594097 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-utilities" (OuterVolumeSpecName: "utilities") pod "f071ae58-28db-49a2-bf4f-0c43472f9dd7" (UID: "f071ae58-28db-49a2-bf4f-0c43472f9dd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.597861 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f071ae58-28db-49a2-bf4f-0c43472f9dd7-kube-api-access-mhv7c" (OuterVolumeSpecName: "kube-api-access-mhv7c") pod "f071ae58-28db-49a2-bf4f-0c43472f9dd7" (UID: "f071ae58-28db-49a2-bf4f-0c43472f9dd7"). InnerVolumeSpecName "kube-api-access-mhv7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.643847 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f071ae58-28db-49a2-bf4f-0c43472f9dd7" (UID: "f071ae58-28db-49a2-bf4f-0c43472f9dd7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.693130 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.693180 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhv7c\" (UniqueName: \"kubernetes.io/projected/f071ae58-28db-49a2-bf4f-0c43472f9dd7-kube-api-access-mhv7c\") on node \"crc\" DevicePath \"\"" Nov 22 10:11:31 crc kubenswrapper[4743]: I1122 10:11:31.693195 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f071ae58-28db-49a2-bf4f-0c43472f9dd7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:11:32 crc kubenswrapper[4743]: I1122 10:11:32.152498 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:11:32 crc kubenswrapper[4743]: E1122 10:11:32.153139 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:11:32 crc kubenswrapper[4743]: I1122 10:11:32.338366 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4nsgr" event={"ID":"f071ae58-28db-49a2-bf4f-0c43472f9dd7","Type":"ContainerDied","Data":"f0d30731df7f11fc7e44328158138b20f9cc575bb02e913786574b3c4e5f3fb7"} Nov 22 10:11:32 crc kubenswrapper[4743]: I1122 10:11:32.338440 4743 scope.go:117] "RemoveContainer" containerID="e00f19022b797cc33513a95da1a3b17eef2d299381463e27c9a0912ae9047505" Nov 22 10:11:32 crc kubenswrapper[4743]: I1122 10:11:32.338483 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4nsgr" Nov 22 10:11:32 crc kubenswrapper[4743]: I1122 10:11:32.364875 4743 scope.go:117] "RemoveContainer" containerID="98a7448b95db23b32355e5642a6fe378afe6d36b42bcb838677a17f8a004b12f" Nov 22 10:11:32 crc kubenswrapper[4743]: I1122 10:11:32.393799 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4nsgr"] Nov 22 10:11:32 crc kubenswrapper[4743]: I1122 10:11:32.396522 4743 scope.go:117] "RemoveContainer" containerID="de22739006fcb9b5858fad74e12dda1e5f6d2131b591d9fb26416afb6c484fa5" Nov 22 10:11:32 crc kubenswrapper[4743]: I1122 10:11:32.402762 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4nsgr"] Nov 22 10:11:33 crc kubenswrapper[4743]: I1122 10:11:33.163181 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" path="/var/lib/kubelet/pods/f071ae58-28db-49a2-bf4f-0c43472f9dd7/volumes" Nov 22 10:11:44 crc kubenswrapper[4743]: I1122 10:11:44.152243 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:11:44 crc kubenswrapper[4743]: E1122 10:11:44.153418 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.251204 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9sjw5"] Nov 22 10:11:52 crc kubenswrapper[4743]: E1122 10:11:52.252345 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerName="extract-utilities" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.252364 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerName="extract-utilities" Nov 22 10:11:52 crc kubenswrapper[4743]: E1122 10:11:52.252395 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerName="registry-server" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.252403 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerName="registry-server" Nov 22 10:11:52 crc kubenswrapper[4743]: E1122 10:11:52.252422 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerName="extract-content" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.252429 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerName="extract-content" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.252739 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f071ae58-28db-49a2-bf4f-0c43472f9dd7" containerName="registry-server" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.255381 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.262661 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sjw5"] Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.327910 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-catalog-content\") pod \"redhat-marketplace-9sjw5\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.328271 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxgmp\" (UniqueName: \"kubernetes.io/projected/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-kube-api-access-lxgmp\") pod \"redhat-marketplace-9sjw5\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.328390 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-utilities\") pod \"redhat-marketplace-9sjw5\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.430509 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-catalog-content\") pod \"redhat-marketplace-9sjw5\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.430642 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxgmp\" (UniqueName: \"kubernetes.io/projected/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-kube-api-access-lxgmp\") pod \"redhat-marketplace-9sjw5\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.430675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-utilities\") pod \"redhat-marketplace-9sjw5\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.431220 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-catalog-content\") pod \"redhat-marketplace-9sjw5\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.431266 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-utilities\") pod \"redhat-marketplace-9sjw5\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.450661 4743 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lxgmp\" (UniqueName: \"kubernetes.io/projected/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-kube-api-access-lxgmp\") pod \"redhat-marketplace-9sjw5\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:52 crc kubenswrapper[4743]: I1122 10:11:52.589554 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:11:53 crc kubenswrapper[4743]: I1122 10:11:53.032408 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sjw5"] Nov 22 10:11:53 crc kubenswrapper[4743]: I1122 10:11:53.535675 4743 generic.go:334] "Generic (PLEG): container finished" podID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerID="c1d43726fb42a1b7824b9154247eee5224af7af530e3e6b8d7faf3b0b120e354" exitCode=0 Nov 22 10:11:53 crc kubenswrapper[4743]: I1122 10:11:53.535728 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sjw5" event={"ID":"c46958e8-bd57-4131-a89b-d5d9c2ab4dee","Type":"ContainerDied","Data":"c1d43726fb42a1b7824b9154247eee5224af7af530e3e6b8d7faf3b0b120e354"} Nov 22 10:11:53 crc kubenswrapper[4743]: I1122 10:11:53.535975 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sjw5" event={"ID":"c46958e8-bd57-4131-a89b-d5d9c2ab4dee","Type":"ContainerStarted","Data":"d7bd5f1148891bfdb97126dda62e9ee2978adc38760e595a6e228d4991ca4f5c"} Nov 22 10:11:54 crc kubenswrapper[4743]: I1122 10:11:54.547026 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sjw5" event={"ID":"c46958e8-bd57-4131-a89b-d5d9c2ab4dee","Type":"ContainerStarted","Data":"5a82b645213bfde16185a0b4bbe9a6856677b25aa5ddc2bcc981623ed0b82ce8"} Nov 22 10:11:55 crc kubenswrapper[4743]: I1122 10:11:55.558407 4743 generic.go:334] "Generic (PLEG): container finished" podID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerID="5a82b645213bfde16185a0b4bbe9a6856677b25aa5ddc2bcc981623ed0b82ce8" exitCode=0 Nov 22 10:11:55 crc kubenswrapper[4743]: I1122 10:11:55.558461 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sjw5" event={"ID":"c46958e8-bd57-4131-a89b-d5d9c2ab4dee","Type":"ContainerDied","Data":"5a82b645213bfde16185a0b4bbe9a6856677b25aa5ddc2bcc981623ed0b82ce8"} Nov 22 10:11:56 crc kubenswrapper[4743]: I1122 10:11:56.574597 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sjw5" event={"ID":"c46958e8-bd57-4131-a89b-d5d9c2ab4dee","Type":"ContainerStarted","Data":"1e7f80b4a168b53e497a751fcecb364f74032a5d02a7d7802b8a1f8e1bc7b0b9"} Nov 22 10:11:56 crc kubenswrapper[4743]: I1122 10:11:56.600664 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9sjw5" podStartSLOduration=2.140003807 podStartE2EDuration="4.600650081s" podCreationTimestamp="2025-11-22 10:11:52 +0000 UTC" firstStartedPulling="2025-11-22 10:11:53.53916819 +0000 UTC m=+6587.245529242" lastFinishedPulling="2025-11-22 10:11:55.999814464 +0000 UTC m=+6589.706175516" observedRunningTime="2025-11-22 10:11:56.594014471 +0000 UTC m=+6590.300375543" watchObservedRunningTime="2025-11-22 10:11:56.600650081 +0000 UTC m=+6590.307011133" Nov 22 10:11:59 crc kubenswrapper[4743]: I1122 10:11:59.151602 4743 scope.go:117] "RemoveContainer" 
containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:11:59 crc kubenswrapper[4743]: E1122 10:11:59.152297 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:12:02 crc kubenswrapper[4743]: I1122 10:12:02.590434 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:12:02 crc kubenswrapper[4743]: I1122 10:12:02.590931 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:12:02 crc kubenswrapper[4743]: I1122 10:12:02.642026 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:12:02 crc kubenswrapper[4743]: I1122 10:12:02.688543 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.249818 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sjw5"] Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.250728 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9sjw5" podUID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerName="registry-server" containerID="cri-o://1e7f80b4a168b53e497a751fcecb364f74032a5d02a7d7802b8a1f8e1bc7b0b9" gracePeriod=2 Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.658940 4743 generic.go:334] "Generic (PLEG): container finished" podID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerID="1e7f80b4a168b53e497a751fcecb364f74032a5d02a7d7802b8a1f8e1bc7b0b9" exitCode=0 Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.658996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sjw5" event={"ID":"c46958e8-bd57-4131-a89b-d5d9c2ab4dee","Type":"ContainerDied","Data":"1e7f80b4a168b53e497a751fcecb364f74032a5d02a7d7802b8a1f8e1bc7b0b9"} Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.659045 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9sjw5" event={"ID":"c46958e8-bd57-4131-a89b-d5d9c2ab4dee","Type":"ContainerDied","Data":"d7bd5f1148891bfdb97126dda62e9ee2978adc38760e595a6e228d4991ca4f5c"} Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.659060 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7bd5f1148891bfdb97126dda62e9ee2978adc38760e595a6e228d4991ca4f5c" Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.720745 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.847779 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-utilities\") pod \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.848207 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxgmp\" (UniqueName: \"kubernetes.io/projected/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-kube-api-access-lxgmp\") pod \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.848305 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-catalog-content\") pod \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\" (UID: \"c46958e8-bd57-4131-a89b-d5d9c2ab4dee\") " Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.848966 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-utilities" (OuterVolumeSpecName: "utilities") pod "c46958e8-bd57-4131-a89b-d5d9c2ab4dee" (UID: "c46958e8-bd57-4131-a89b-d5d9c2ab4dee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.853459 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-kube-api-access-lxgmp" (OuterVolumeSpecName: "kube-api-access-lxgmp") pod "c46958e8-bd57-4131-a89b-d5d9c2ab4dee" (UID: "c46958e8-bd57-4131-a89b-d5d9c2ab4dee"). InnerVolumeSpecName "kube-api-access-lxgmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.869705 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c46958e8-bd57-4131-a89b-d5d9c2ab4dee" (UID: "c46958e8-bd57-4131-a89b-d5d9c2ab4dee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.951122 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.951172 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxgmp\" (UniqueName: \"kubernetes.io/projected/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-kube-api-access-lxgmp\") on node \"crc\" DevicePath \"\"" Nov 22 10:12:06 crc kubenswrapper[4743]: I1122 10:12:06.951186 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c46958e8-bd57-4131-a89b-d5d9c2ab4dee-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:12:07 crc kubenswrapper[4743]: I1122 10:12:07.668337 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9sjw5" Nov 22 10:12:07 crc kubenswrapper[4743]: I1122 10:12:07.688483 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sjw5"] Nov 22 10:12:07 crc kubenswrapper[4743]: I1122 10:12:07.696900 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9sjw5"] Nov 22 10:12:09 crc kubenswrapper[4743]: I1122 10:12:09.164040 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" path="/var/lib/kubelet/pods/c46958e8-bd57-4131-a89b-d5d9c2ab4dee/volumes" Nov 22 10:12:14 crc kubenswrapper[4743]: I1122 10:12:14.151857 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:12:14 crc kubenswrapper[4743]: E1122 10:12:14.152541 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:12:27 crc kubenswrapper[4743]: I1122 10:12:27.158922 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:12:27 crc kubenswrapper[4743]: E1122 10:12:27.159945 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:12:42 crc kubenswrapper[4743]: I1122 10:12:42.152471 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:12:42 crc kubenswrapper[4743]: E1122 10:12:42.153159 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:12:53 crc kubenswrapper[4743]: I1122 10:12:53.152527 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:12:53 crc kubenswrapper[4743]: E1122 10:12:53.153993 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:13:04 crc kubenswrapper[4743]: I1122 10:13:04.151802 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:13:04 
crc kubenswrapper[4743]: E1122 10:13:04.152599 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:13:16 crc kubenswrapper[4743]: I1122 10:13:16.151376 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:13:16 crc kubenswrapper[4743]: E1122 10:13:16.152185 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:13:30 crc kubenswrapper[4743]: I1122 10:13:30.152213 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:13:30 crc kubenswrapper[4743]: E1122 10:13:30.153081 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:13:36 crc kubenswrapper[4743]: I1122 10:13:36.576920 4743 scope.go:117] "RemoveContainer" containerID="1bab0c5f048eb28a1d5b09723635391121840be3358353f90be2c92c543ac44b" Nov 22 10:13:36 crc kubenswrapper[4743]: I1122 10:13:36.601130 4743 scope.go:117] "RemoveContainer" containerID="286ff6d7d709ec4fc4c9d197d1bfd5049f705c9389b082d7c36129ab0783c901" Nov 22 10:13:36 crc kubenswrapper[4743]: I1122 10:13:36.634892 4743 scope.go:117] "RemoveContainer" containerID="0dd8fd6a384d89e03c514fca98b9d2817c709f5eae051b724a7c5ec55a8a9c0f" Nov 22 10:13:36 crc kubenswrapper[4743]: I1122 10:13:36.654246 4743 scope.go:117] "RemoveContainer" containerID="50d71dacbea76e8a7c160a3ce24d9a9cca5f46778f51a251277c0e6d91b94f66" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.212198 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4hns4"] Nov 22 10:13:44 crc kubenswrapper[4743]: E1122 10:13:44.213230 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerName="registry-server" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.213245 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerName="registry-server" Nov 22 10:13:44 crc kubenswrapper[4743]: E1122 10:13:44.213294 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerName="extract-utilities" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.213300 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerName="extract-utilities" Nov 22 10:13:44 crc kubenswrapper[4743]: E1122 
10:13:44.213315 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerName="extract-content" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.213324 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerName="extract-content" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.213674 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c46958e8-bd57-4131-a89b-d5d9c2ab4dee" containerName="registry-server" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.230099 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.240232 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hns4"] Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.342509 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-catalog-content\") pod \"certified-operators-4hns4\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") " pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.343015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj8lh\" (UniqueName: \"kubernetes.io/projected/3e269958-8479-4bae-b4af-e7011cc18655-kube-api-access-zj8lh\") pod \"certified-operators-4hns4\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") " pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.343178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-utilities\") pod \"certified-operators-4hns4\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") " pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.445632 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj8lh\" (UniqueName: \"kubernetes.io/projected/3e269958-8479-4bae-b4af-e7011cc18655-kube-api-access-zj8lh\") pod \"certified-operators-4hns4\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") " pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.445760 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-utilities\") pod \"certified-operators-4hns4\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") " pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.445818 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-catalog-content\") pod \"certified-operators-4hns4\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") " pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.446344 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-catalog-content\") pod \"certified-operators-4hns4\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") " pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.446364 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-utilities\") pod \"certified-operators-4hns4\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") " pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.470608 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj8lh\" (UniqueName: \"kubernetes.io/projected/3e269958-8479-4bae-b4af-e7011cc18655-kube-api-access-zj8lh\") pod \"certified-operators-4hns4\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") " pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:44 crc kubenswrapper[4743]: I1122 10:13:44.567185 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hns4" Nov 22 10:13:45 crc kubenswrapper[4743]: I1122 10:13:45.151659 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8" Nov 22 10:13:45 crc kubenswrapper[4743]: E1122 10:13:45.152233 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:13:45 crc kubenswrapper[4743]: I1122 10:13:45.170718 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hns4"] Nov 22 10:13:45 crc kubenswrapper[4743]: I1122 10:13:45.661024 4743 generic.go:334] "Generic (PLEG): container finished" podID="3e269958-8479-4bae-b4af-e7011cc18655" containerID="d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1" exitCode=0 Nov 22 10:13:45 crc kubenswrapper[4743]: I1122 10:13:45.661397 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hns4" event={"ID":"3e269958-8479-4bae-b4af-e7011cc18655","Type":"ContainerDied","Data":"d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1"} Nov 22 10:13:45 crc kubenswrapper[4743]: I1122 10:13:45.661424 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hns4" event={"ID":"3e269958-8479-4bae-b4af-e7011cc18655","Type":"ContainerStarted","Data":"91ca02658adf66b1d32a2fd713a94d0f7331b344758becec58431cf61aa09729"} Nov 22 10:13:46 crc kubenswrapper[4743]: I1122 10:13:46.676943 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hns4" event={"ID":"3e269958-8479-4bae-b4af-e7011cc18655","Type":"ContainerStarted","Data":"b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6"} Nov 22 10:13:48 crc kubenswrapper[4743]: I1122 10:13:48.702594 4743 generic.go:334] "Generic (PLEG): container finished" podID="3e269958-8479-4bae-b4af-e7011cc18655" containerID="b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6" exitCode=0 Nov 22 10:13:48 crc 
kubenswrapper[4743]: I1122 10:13:48.702664 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hns4" event={"ID":"3e269958-8479-4bae-b4af-e7011cc18655","Type":"ContainerDied","Data":"b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6"}
Nov 22 10:13:49 crc kubenswrapper[4743]: I1122 10:13:49.714480 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hns4" event={"ID":"3e269958-8479-4bae-b4af-e7011cc18655","Type":"ContainerStarted","Data":"a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca"}
Nov 22 10:13:54 crc kubenswrapper[4743]: I1122 10:13:54.567557 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4hns4"
Nov 22 10:13:54 crc kubenswrapper[4743]: I1122 10:13:54.568082 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4hns4"
Nov 22 10:13:54 crc kubenswrapper[4743]: I1122 10:13:54.625019 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4hns4"
Nov 22 10:13:54 crc kubenswrapper[4743]: I1122 10:13:54.657250 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4hns4" podStartSLOduration=7.17083279 podStartE2EDuration="10.657220767s" podCreationTimestamp="2025-11-22 10:13:44 +0000 UTC" firstStartedPulling="2025-11-22 10:13:45.663313736 +0000 UTC m=+6699.369674778" lastFinishedPulling="2025-11-22 10:13:49.149701703 +0000 UTC m=+6702.856062755" observedRunningTime="2025-11-22 10:13:49.73485764 +0000 UTC m=+6703.441218692" watchObservedRunningTime="2025-11-22 10:13:54.657220767 +0000 UTC m=+6708.363581859"
Nov 22 10:13:54 crc kubenswrapper[4743]: I1122 10:13:54.822730 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4hns4"
Nov 22 10:13:54 crc kubenswrapper[4743]: I1122 10:13:54.874597 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hns4"]
Nov 22 10:13:56 crc kubenswrapper[4743]: I1122 10:13:56.791357 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4hns4" podUID="3e269958-8479-4bae-b4af-e7011cc18655" containerName="registry-server" containerID="cri-o://a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca" gracePeriod=2
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.174859 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8"
Nov 22 10:13:57 crc kubenswrapper[4743]: E1122 10:13:57.175647 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.368969 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hns4"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.558724 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj8lh\" (UniqueName: \"kubernetes.io/projected/3e269958-8479-4bae-b4af-e7011cc18655-kube-api-access-zj8lh\") pod \"3e269958-8479-4bae-b4af-e7011cc18655\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") "
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.558865 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-catalog-content\") pod \"3e269958-8479-4bae-b4af-e7011cc18655\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") "
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.559149 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-utilities\") pod \"3e269958-8479-4bae-b4af-e7011cc18655\" (UID: \"3e269958-8479-4bae-b4af-e7011cc18655\") "
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.560024 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-utilities" (OuterVolumeSpecName: "utilities") pod "3e269958-8479-4bae-b4af-e7011cc18655" (UID: "3e269958-8479-4bae-b4af-e7011cc18655"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.560279 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.565943 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e269958-8479-4bae-b4af-e7011cc18655-kube-api-access-zj8lh" (OuterVolumeSpecName: "kube-api-access-zj8lh") pod "3e269958-8479-4bae-b4af-e7011cc18655" (UID: "3e269958-8479-4bae-b4af-e7011cc18655"). InnerVolumeSpecName "kube-api-access-zj8lh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.607084 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e269958-8479-4bae-b4af-e7011cc18655" (UID: "3e269958-8479-4bae-b4af-e7011cc18655"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.662356 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj8lh\" (UniqueName: \"kubernetes.io/projected/3e269958-8479-4bae-b4af-e7011cc18655-kube-api-access-zj8lh\") on node \"crc\" DevicePath \"\""
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.662401 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e269958-8479-4bae-b4af-e7011cc18655-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.801775 4743 generic.go:334] "Generic (PLEG): container finished" podID="3e269958-8479-4bae-b4af-e7011cc18655" containerID="a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca" exitCode=0
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.801826 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hns4" event={"ID":"3e269958-8479-4bae-b4af-e7011cc18655","Type":"ContainerDied","Data":"a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca"}
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.801856 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hns4" event={"ID":"3e269958-8479-4bae-b4af-e7011cc18655","Type":"ContainerDied","Data":"91ca02658adf66b1d32a2fd713a94d0f7331b344758becec58431cf61aa09729"}
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.801865 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hns4"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.801878 4743 scope.go:117] "RemoveContainer" containerID="a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.830449 4743 scope.go:117] "RemoveContainer" containerID="b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.850858 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hns4"]
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.861636 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4hns4"]
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.863232 4743 scope.go:117] "RemoveContainer" containerID="d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.908200 4743 scope.go:117] "RemoveContainer" containerID="a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca"
Nov 22 10:13:57 crc kubenswrapper[4743]: E1122 10:13:57.908677 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca\": container with ID starting with a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca not found: ID does not exist" containerID="a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.908744 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca"} err="failed to get container status \"a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca\": rpc error: code = NotFound desc = could not find container \"a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca\": container with ID starting with a8637eaee4c46f05f33dbce59f91cb708df5e974434785afe6e338a0546b9fca not found: ID does not exist"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.908786 4743 scope.go:117] "RemoveContainer" containerID="b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6"
Nov 22 10:13:57 crc kubenswrapper[4743]: E1122 10:13:57.909088 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6\": container with ID starting with b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6 not found: ID does not exist" containerID="b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.909129 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6"} err="failed to get container status \"b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6\": rpc error: code = NotFound desc = could not find container \"b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6\": container with ID starting with b6d0e7a84f12b0dfc66a8efcb92b4c11947ea03a7874e77f80ce19126b2c9de6 not found: ID does not exist"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.909155 4743 scope.go:117] "RemoveContainer" containerID="d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1"
Nov 22 10:13:57 crc kubenswrapper[4743]: E1122 10:13:57.909385 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1\": container with ID starting with d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1 not found: ID does not exist" containerID="d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1"
Nov 22 10:13:57 crc kubenswrapper[4743]: I1122 10:13:57.909423 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1"} err="failed to get container status \"d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1\": rpc error: code = NotFound desc = could not find container \"d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1\": container with ID starting with d37b84b634a14a56c2bf9c62cf3ec502f894402a889e6192aa323b8917676aa1 not found: ID does not exist"
Nov 22 10:13:59 crc kubenswrapper[4743]: I1122 10:13:59.163934 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e269958-8479-4bae-b4af-e7011cc18655" path="/var/lib/kubelet/pods/3e269958-8479-4bae-b4af-e7011cc18655/volumes"
Nov 22 10:14:01 crc kubenswrapper[4743]: I1122 10:14:01.038090 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-mzjsv"]
Nov 22 10:14:01 crc kubenswrapper[4743]: I1122 10:14:01.047543 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-mzjsv"]
Nov 22 10:14:01 crc kubenswrapper[4743]: I1122 10:14:01.054963 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-fa27-account-create-bcq27"]
Nov 22 10:14:01 crc kubenswrapper[4743]: I1122 10:14:01.062027 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-fa27-account-create-bcq27"]
Nov 22 10:14:01 crc kubenswrapper[4743]: I1122 10:14:01.165386 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b746268-80b6-4be3-b32c-19b1fe639bef" path="/var/lib/kubelet/pods/8b746268-80b6-4be3-b32c-19b1fe639bef/volumes"
Nov 22 10:14:01 crc kubenswrapper[4743]: I1122 10:14:01.166247 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef7f86b6-ae6d-4428-ba15-014b04aa2a93" path="/var/lib/kubelet/pods/ef7f86b6-ae6d-4428-ba15-014b04aa2a93/volumes"
Nov 22 10:14:12 crc kubenswrapper[4743]: I1122 10:14:12.152177 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8"
Nov 22 10:14:12 crc kubenswrapper[4743]: E1122 10:14:12.153081 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:14:13 crc kubenswrapper[4743]: I1122 10:14:13.030983 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-dddhg"]
Nov 22 10:14:13 crc kubenswrapper[4743]: I1122 10:14:13.041602 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-dddhg"]
Nov 22 10:14:13 crc kubenswrapper[4743]: I1122 10:14:13.164508 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4529808-850b-47b2-adbd-99c4c2e9a0e2" path="/var/lib/kubelet/pods/b4529808-850b-47b2-adbd-99c4c2e9a0e2/volumes"
Nov 22 10:14:26 crc kubenswrapper[4743]: I1122 10:14:26.152077 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8"
Nov 22 10:14:26 crc kubenswrapper[4743]: E1122 10:14:26.152880 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:14:36 crc kubenswrapper[4743]: I1122 10:14:36.702646 4743 scope.go:117] "RemoveContainer" containerID="425989f6b45af1ef2395a839296d880b505e064b07fb63094a0678a3946817b9"
Nov 22 10:14:36 crc kubenswrapper[4743]: I1122 10:14:36.741447 4743 scope.go:117] "RemoveContainer" containerID="9bfea331c50322b595e5152c42d23fe3b7ebaab2a552122deec8ce333449e07b"
Nov 22 10:14:36 crc kubenswrapper[4743]: I1122 10:14:36.817734 4743 scope.go:117] "RemoveContainer" containerID="6cb93f4b69bcf9bfe8c564273cc51e30b612a5c4dfd204dc8193235cf7973539"
Nov 22 10:14:37 crc kubenswrapper[4743]: I1122 10:14:37.158296 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8"
Nov 22 10:14:37 crc kubenswrapper[4743]: E1122 10:14:37.158755 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:14:48 crc kubenswrapper[4743]: I1122 10:14:48.153104 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8"
Nov 22 10:14:48 crc kubenswrapper[4743]: E1122 10:14:48.155021 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:14:50 crc kubenswrapper[4743]: I1122 10:14:50.883629 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-65kbs"]
Nov 22 10:14:50 crc kubenswrapper[4743]: E1122 10:14:50.885354 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e269958-8479-4bae-b4af-e7011cc18655" containerName="registry-server"
Nov 22 10:14:50 crc kubenswrapper[4743]: I1122 10:14:50.885376 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e269958-8479-4bae-b4af-e7011cc18655" containerName="registry-server"
Nov 22 10:14:50 crc kubenswrapper[4743]: E1122 10:14:50.885408 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e269958-8479-4bae-b4af-e7011cc18655" containerName="extract-utilities"
Nov 22 10:14:50 crc kubenswrapper[4743]: I1122 10:14:50.885416 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e269958-8479-4bae-b4af-e7011cc18655" containerName="extract-utilities"
Nov 22 10:14:50 crc kubenswrapper[4743]: E1122 10:14:50.885456 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e269958-8479-4bae-b4af-e7011cc18655" containerName="extract-content"
Nov 22 10:14:50 crc kubenswrapper[4743]: I1122 10:14:50.885464 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e269958-8479-4bae-b4af-e7011cc18655" containerName="extract-content"
Nov 22 10:14:50 crc kubenswrapper[4743]: I1122 10:14:50.885841 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e269958-8479-4bae-b4af-e7011cc18655" containerName="registry-server"
Nov 22 10:14:50 crc kubenswrapper[4743]: I1122 10:14:50.888632 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:50 crc kubenswrapper[4743]: I1122 10:14:50.904412 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-65kbs"]
Nov 22 10:14:50 crc kubenswrapper[4743]: I1122 10:14:50.997227 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-utilities\") pod \"redhat-operators-65kbs\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") " pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:50 crc kubenswrapper[4743]: I1122 10:14:50.997271 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd525\" (UniqueName: \"kubernetes.io/projected/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-kube-api-access-gd525\") pod \"redhat-operators-65kbs\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") " pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:50 crc kubenswrapper[4743]: I1122 10:14:50.997309 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-catalog-content\") pod \"redhat-operators-65kbs\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") " pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:51 crc kubenswrapper[4743]: I1122 10:14:51.099882 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-utilities\") pod \"redhat-operators-65kbs\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") " pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:51 crc kubenswrapper[4743]: I1122 10:14:51.100211 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd525\" (UniqueName: \"kubernetes.io/projected/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-kube-api-access-gd525\") pod \"redhat-operators-65kbs\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") " pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:51 crc kubenswrapper[4743]: I1122 10:14:51.100250 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-catalog-content\") pod \"redhat-operators-65kbs\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") " pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:51 crc kubenswrapper[4743]: I1122 10:14:51.100423 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-utilities\") pod \"redhat-operators-65kbs\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") " pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:51 crc kubenswrapper[4743]: I1122 10:14:51.100650 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-catalog-content\") pod \"redhat-operators-65kbs\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") " pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:51 crc kubenswrapper[4743]: I1122 10:14:51.123250 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd525\" (UniqueName: \"kubernetes.io/projected/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-kube-api-access-gd525\") pod \"redhat-operators-65kbs\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") " pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:51 crc kubenswrapper[4743]: I1122 10:14:51.210491 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:14:51 crc kubenswrapper[4743]: I1122 10:14:51.725672 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-65kbs"]
Nov 22 10:14:52 crc kubenswrapper[4743]: I1122 10:14:52.372300 4743 generic.go:334] "Generic (PLEG): container finished" podID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerID="63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69" exitCode=0
Nov 22 10:14:52 crc kubenswrapper[4743]: I1122 10:14:52.372537 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65kbs" event={"ID":"0bf71ead-91e0-4fb3-8ed8-7738e726a02f","Type":"ContainerDied","Data":"63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69"}
Nov 22 10:14:52 crc kubenswrapper[4743]: I1122 10:14:52.373089 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65kbs" event={"ID":"0bf71ead-91e0-4fb3-8ed8-7738e726a02f","Type":"ContainerStarted","Data":"fc9b9fe8f251cc4f58d1beb97d0419beae33785da9ab0cb558453ee63cc1e13a"}
Nov 22 10:14:52 crc kubenswrapper[4743]: I1122 10:14:52.375088 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 10:14:54 crc kubenswrapper[4743]: I1122 10:14:54.391781 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65kbs" event={"ID":"0bf71ead-91e0-4fb3-8ed8-7738e726a02f","Type":"ContainerStarted","Data":"cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4"}
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.163295 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"]
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.165957 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.170917 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.170938 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.178288 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"]
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.302167 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccrtq\" (UniqueName: \"kubernetes.io/projected/096d6397-d699-489c-91ec-371da2bfc7d6-kube-api-access-ccrtq\") pod \"collect-profiles-29396775-5tknf\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.302232 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/096d6397-d699-489c-91ec-371da2bfc7d6-secret-volume\") pod \"collect-profiles-29396775-5tknf\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.302353 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/096d6397-d699-489c-91ec-371da2bfc7d6-config-volume\") pod \"collect-profiles-29396775-5tknf\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.404674 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/096d6397-d699-489c-91ec-371da2bfc7d6-config-volume\") pod \"collect-profiles-29396775-5tknf\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.404852 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccrtq\" (UniqueName: \"kubernetes.io/projected/096d6397-d699-489c-91ec-371da2bfc7d6-kube-api-access-ccrtq\") pod \"collect-profiles-29396775-5tknf\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.404921 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/096d6397-d699-489c-91ec-371da2bfc7d6-secret-volume\") pod \"collect-profiles-29396775-5tknf\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.406012 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/096d6397-d699-489c-91ec-371da2bfc7d6-config-volume\") pod \"collect-profiles-29396775-5tknf\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.411163 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/096d6397-d699-489c-91ec-371da2bfc7d6-secret-volume\") pod \"collect-profiles-29396775-5tknf\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.425698 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccrtq\" (UniqueName: \"kubernetes.io/projected/096d6397-d699-489c-91ec-371da2bfc7d6-kube-api-access-ccrtq\") pod \"collect-profiles-29396775-5tknf\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.444562 4743 generic.go:334] "Generic (PLEG): container finished" podID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerID="cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4" exitCode=0
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.444628 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65kbs" event={"ID":"0bf71ead-91e0-4fb3-8ed8-7738e726a02f","Type":"ContainerDied","Data":"cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4"}
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.491871 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:00 crc kubenswrapper[4743]: I1122 10:15:00.952047 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"]
Nov 22 10:15:01 crc kubenswrapper[4743]: I1122 10:15:01.459653 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65kbs" event={"ID":"0bf71ead-91e0-4fb3-8ed8-7738e726a02f","Type":"ContainerStarted","Data":"bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc"}
Nov 22 10:15:01 crc kubenswrapper[4743]: I1122 10:15:01.463253 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf" event={"ID":"096d6397-d699-489c-91ec-371da2bfc7d6","Type":"ContainerStarted","Data":"81f0d092ff00bf71679e9441f288cdad51888cc7d2485624aa3145a121358e2b"}
Nov 22 10:15:01 crc kubenswrapper[4743]: I1122 10:15:01.463302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf" event={"ID":"096d6397-d699-489c-91ec-371da2bfc7d6","Type":"ContainerStarted","Data":"9ecf8a7438bf44444209f294a6293779a146471c0a91e3dd9da530f49405c37f"}
Nov 22 10:15:01 crc kubenswrapper[4743]: I1122 10:15:01.486019 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-65kbs" podStartSLOduration=2.947555914 podStartE2EDuration="11.48599381s" podCreationTimestamp="2025-11-22 10:14:50 +0000 UTC" firstStartedPulling="2025-11-22 10:14:52.37473059 +0000 UTC m=+6766.081091652" lastFinishedPulling="2025-11-22 10:15:00.913168486 +0000 UTC m=+6774.619529548" observedRunningTime="2025-11-22 10:15:01.478094583 +0000 UTC m=+6775.184455645" watchObservedRunningTime="2025-11-22 10:15:01.48599381 +0000 UTC m=+6775.192354862"
Nov 22 10:15:01 crc kubenswrapper[4743]: I1122 10:15:01.500994 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf" podStartSLOduration=1.50097524 podStartE2EDuration="1.50097524s" podCreationTimestamp="2025-11-22 10:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:15:01.496001047 +0000 UTC m=+6775.202362109" watchObservedRunningTime="2025-11-22 10:15:01.50097524 +0000 UTC m=+6775.207336312"
Nov 22 10:15:02 crc kubenswrapper[4743]: E1122 10:15:02.097049 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod096d6397_d699_489c_91ec_371da2bfc7d6.slice/crio-conmon-81f0d092ff00bf71679e9441f288cdad51888cc7d2485624aa3145a121358e2b.scope\": RecentStats: unable to find data in memory cache]"
Nov 22 10:15:02 crc kubenswrapper[4743]: I1122 10:15:02.152067 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8"
Nov 22 10:15:02 crc kubenswrapper[4743]: I1122 10:15:02.473920 4743 generic.go:334] "Generic (PLEG): container finished" podID="096d6397-d699-489c-91ec-371da2bfc7d6" containerID="81f0d092ff00bf71679e9441f288cdad51888cc7d2485624aa3145a121358e2b" exitCode=0
Nov 22 10:15:02 crc kubenswrapper[4743]: I1122 10:15:02.474057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf" event={"ID":"096d6397-d699-489c-91ec-371da2bfc7d6","Type":"ContainerDied","Data":"81f0d092ff00bf71679e9441f288cdad51888cc7d2485624aa3145a121358e2b"}
Nov 22 10:15:02 crc kubenswrapper[4743]: I1122 10:15:02.479716 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"d0426b7b56555ab35404fad4ab48c6a869ddcec49b88d7575ac1795d9dd87a05"}
Nov 22 10:15:03 crc kubenswrapper[4743]: I1122 10:15:03.898219 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.000002 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/096d6397-d699-489c-91ec-371da2bfc7d6-config-volume\") pod \"096d6397-d699-489c-91ec-371da2bfc7d6\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") "
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.000289 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccrtq\" (UniqueName: \"kubernetes.io/projected/096d6397-d699-489c-91ec-371da2bfc7d6-kube-api-access-ccrtq\") pod \"096d6397-d699-489c-91ec-371da2bfc7d6\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") "
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.000331 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/096d6397-d699-489c-91ec-371da2bfc7d6-secret-volume\") pod \"096d6397-d699-489c-91ec-371da2bfc7d6\" (UID: \"096d6397-d699-489c-91ec-371da2bfc7d6\") "
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.001837 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/096d6397-d699-489c-91ec-371da2bfc7d6-config-volume" (OuterVolumeSpecName: "config-volume") pod "096d6397-d699-489c-91ec-371da2bfc7d6" (UID: "096d6397-d699-489c-91ec-371da2bfc7d6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.007952 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/096d6397-d699-489c-91ec-371da2bfc7d6-kube-api-access-ccrtq" (OuterVolumeSpecName: "kube-api-access-ccrtq") pod "096d6397-d699-489c-91ec-371da2bfc7d6" (UID: "096d6397-d699-489c-91ec-371da2bfc7d6"). InnerVolumeSpecName "kube-api-access-ccrtq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.008914 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/096d6397-d699-489c-91ec-371da2bfc7d6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "096d6397-d699-489c-91ec-371da2bfc7d6" (UID: "096d6397-d699-489c-91ec-371da2bfc7d6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.104113 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccrtq\" (UniqueName: \"kubernetes.io/projected/096d6397-d699-489c-91ec-371da2bfc7d6-kube-api-access-ccrtq\") on node \"crc\" DevicePath \"\""
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.104329 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/096d6397-d699-489c-91ec-371da2bfc7d6-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.104430 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/096d6397-d699-489c-91ec-371da2bfc7d6-config-volume\") on node \"crc\" DevicePath \"\""
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.499136 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf" event={"ID":"096d6397-d699-489c-91ec-371da2bfc7d6","Type":"ContainerDied","Data":"9ecf8a7438bf44444209f294a6293779a146471c0a91e3dd9da530f49405c37f"}
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.499406 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ecf8a7438bf44444209f294a6293779a146471c0a91e3dd9da530f49405c37f"
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.499198 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.567368 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2"]
Nov 22 10:15:04 crc kubenswrapper[4743]: I1122 10:15:04.575339 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396730-spmx2"]
Nov 22 10:15:05 crc kubenswrapper[4743]: I1122 10:15:05.171922 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="473686fe-38ae-4f9a-bfc0-c7946ebb17bc" path="/var/lib/kubelet/pods/473686fe-38ae-4f9a-bfc0-c7946ebb17bc/volumes"
Nov 22 10:15:11 crc kubenswrapper[4743]: I1122 10:15:11.211139 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:15:11 crc kubenswrapper[4743]: I1122 10:15:11.211586 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:15:12 crc kubenswrapper[4743]: I1122 10:15:12.269002 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-65kbs" podUID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerName="registry-server" probeResult="failure" output=<
Nov 22 10:15:12 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s
Nov 22 10:15:12 crc kubenswrapper[4743]: >
Nov 22 10:15:21 crc kubenswrapper[4743]: I1122 10:15:21.259403 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:15:21 crc kubenswrapper[4743]: I1122 10:15:21.309724 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:15:22 crc kubenswrapper[4743]: I1122 10:15:22.073171 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-65kbs"]
Nov 22 10:15:22 crc kubenswrapper[4743]: I1122 10:15:22.695466 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-65kbs" podUID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerName="registry-server" containerID="cri-o://bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc" gracePeriod=2
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.209971 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.285448 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd525\" (UniqueName: \"kubernetes.io/projected/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-kube-api-access-gd525\") pod \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") "
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.285534 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-catalog-content\") pod \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") "
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.285563 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-utilities\") pod \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\" (UID: \"0bf71ead-91e0-4fb3-8ed8-7738e726a02f\") "
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.286903 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-utilities" (OuterVolumeSpecName: "utilities") pod "0bf71ead-91e0-4fb3-8ed8-7738e726a02f" (UID: "0bf71ead-91e0-4fb3-8ed8-7738e726a02f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.293460 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-kube-api-access-gd525" (OuterVolumeSpecName: "kube-api-access-gd525") pod "0bf71ead-91e0-4fb3-8ed8-7738e726a02f" (UID: "0bf71ead-91e0-4fb3-8ed8-7738e726a02f"). InnerVolumeSpecName "kube-api-access-gd525". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.377799 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bf71ead-91e0-4fb3-8ed8-7738e726a02f" (UID: "0bf71ead-91e0-4fb3-8ed8-7738e726a02f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.388295 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.388343 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.388356 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd525\" (UniqueName: \"kubernetes.io/projected/0bf71ead-91e0-4fb3-8ed8-7738e726a02f-kube-api-access-gd525\") on node \"crc\" DevicePath \"\""
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.720096 4743 generic.go:334] "Generic (PLEG): container finished" podID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerID="bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc" exitCode=0
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.720469 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65kbs" event={"ID":"0bf71ead-91e0-4fb3-8ed8-7738e726a02f","Type":"ContainerDied","Data":"bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc"}
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.720497 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-65kbs" event={"ID":"0bf71ead-91e0-4fb3-8ed8-7738e726a02f","Type":"ContainerDied","Data":"fc9b9fe8f251cc4f58d1beb97d0419beae33785da9ab0cb558453ee63cc1e13a"}
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.720518 4743 scope.go:117] "RemoveContainer" containerID="bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc"
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.720753 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-65kbs"
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.745869 4743 scope.go:117] "RemoveContainer" containerID="cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4"
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.756251 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-65kbs"]
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.764312 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-65kbs"]
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.776268 4743 scope.go:117] "RemoveContainer" containerID="63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69"
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.829493 4743 scope.go:117] "RemoveContainer" containerID="bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc"
Nov 22 10:15:23 crc kubenswrapper[4743]: E1122 10:15:23.830190 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc\": container with ID starting with bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc not found: ID does not exist" containerID="bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc"
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.830220 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc"} err="failed to get container status \"bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc\": rpc error: code = NotFound desc = could not find container \"bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc\": container with ID starting with bba56a38972eb9bd26c7f9e188203b071727422133c6fcdbd00e029c138ff1fc not found: ID does not exist"
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.830246 4743 scope.go:117] "RemoveContainer" containerID="cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4"
Nov 22 10:15:23 crc kubenswrapper[4743]: E1122 10:15:23.830514 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4\": container with ID starting with cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4 not found: ID does not exist" containerID="cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4"
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.830537 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4"} err="failed to get container status \"cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4\": rpc error: code = NotFound desc = could not find container \"cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4\": container with ID starting with cebe6b5e64ee5b51abd3f2872b3bd0e5980fcb395c9d9226cb2b5e627328baa4 not found: ID does not exist"
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.830555 4743 scope.go:117] "RemoveContainer" containerID="63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69"
Nov 22 10:15:23 crc kubenswrapper[4743]: E1122 10:15:23.830792 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69\": container with ID starting with 63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69 not found: ID does not exist" containerID="63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69"
Nov 22 10:15:23 crc kubenswrapper[4743]: I1122 10:15:23.830817 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69"} err="failed to get container status \"63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69\": rpc error: code = NotFound desc = could not find container \"63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69\": container with ID starting with 63f21e45681b96fb8e74224079a3d8a8775deb560fda2c749aad51ba9fed2e69 not found: ID does not exist"
Nov 22 10:15:25 crc kubenswrapper[4743]: I1122 10:15:25.164472 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" path="/var/lib/kubelet/pods/0bf71ead-91e0-4fb3-8ed8-7738e726a02f/volumes"
Nov 22 10:15:36 crc kubenswrapper[4743]: I1122 10:15:36.967751 4743 scope.go:117] "RemoveContainer" containerID="b0b9f55113c4c4681d970884f32060a4d74a46af79a18307032740fd5d878788"
Nov 22 10:16:23 crc kubenswrapper[4743]: I1122 10:16:23.056248 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-78bc-account-create-tfdlp"]
Nov 22 10:16:23 crc kubenswrapper[4743]: I1122 10:16:23.073322 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-dz28f"]
Nov 22 10:16:23 crc kubenswrapper[4743]: I1122 10:16:23.081963 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-78bc-account-create-tfdlp"]
Nov 22 10:16:23 crc kubenswrapper[4743]: I1122 10:16:23.089014 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-dz28f"]
Nov 22 10:16:23 crc kubenswrapper[4743]: I1122 10:16:23.166147 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="727ee333-0f62-4dc1-acc9-2f03f1b9269c" path="/var/lib/kubelet/pods/727ee333-0f62-4dc1-acc9-2f03f1b9269c/volumes"
Nov 22 10:16:23 crc kubenswrapper[4743]: I1122 10:16:23.168684 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b02d6b-89a6-4119-96ff-0a80cb68e437" path="/var/lib/kubelet/pods/d1b02d6b-89a6-4119-96ff-0a80cb68e437/volumes"
Nov 22 10:16:35 crc kubenswrapper[4743]: I1122 10:16:35.039878 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-zbhvb"]
Nov 22 10:16:35 crc kubenswrapper[4743]: I1122 10:16:35.046920 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-zbhvb"]
Nov 22 10:16:35 crc kubenswrapper[4743]: I1122 10:16:35.164097 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f172421-21a5-46a3-847f-04f7590e9615" path="/var/lib/kubelet/pods/3f172421-21a5-46a3-847f-04f7590e9615/volumes"
Nov 22 10:16:37 crc kubenswrapper[4743]: I1122 10:16:37.058417 4743 scope.go:117] "RemoveContainer" containerID="305adb6454c27b78db6c12a014f0fd8a3af5113065c4932ea5b35797429acca6"
Nov 22 10:16:37 crc kubenswrapper[4743]: I1122 10:16:37.096436 4743 scope.go:117] "RemoveContainer" containerID="4e06356717f4dd0511f005510a94de0df4a596b48645ae695ff638e7e1c00071"
Nov 22 10:16:37 crc kubenswrapper[4743]: I1122 10:16:37.137851 4743 scope.go:117] "RemoveContainer" containerID="6f247fd94cfb097d1a6bb0ab87800adfede67fd65b637dc2238e5e9759fec00c"
Nov 22 10:16:58 crc kubenswrapper[4743]: I1122 10:16:58.031456 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-3f39-account-create-2x5s8"]
Nov 22 10:16:58 crc kubenswrapper[4743]: I1122 10:16:58.044061 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-qpjd6"]
Nov 22 10:16:58 crc kubenswrapper[4743]: I1122 10:16:58.052089 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-3f39-account-create-2x5s8"]
Nov 22 10:16:58 crc kubenswrapper[4743]: I1122 10:16:58.059235 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-qpjd6"]
Nov 22 10:16:59 crc kubenswrapper[4743]: I1122 10:16:59.162938 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4" path="/var/lib/kubelet/pods/1e8e40fc-5b7d-40dd-81a9-3e230f4db2b4/volumes"
Nov 22 10:16:59 crc kubenswrapper[4743]: I1122 10:16:59.164001 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a118f6bb-d0e9-4297-85f5-b579dc740759" path="/var/lib/kubelet/pods/a118f6bb-d0e9-4297-85f5-b579dc740759/volumes"
Nov 22 10:17:09 crc kubenswrapper[4743]: I1122 10:17:09.049253 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-xthfh"]
Nov 22 10:17:09 crc kubenswrapper[4743]: I1122 10:17:09.064275 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-xthfh"]
Nov 22 10:17:09 crc kubenswrapper[4743]: I1122 10:17:09.168066 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4ae49b4-51e1-4ade-a974-4ff8c96ab104" path="/var/lib/kubelet/pods/a4ae49b4-51e1-4ade-a974-4ff8c96ab104/volumes"
Nov 22 10:17:31 crc kubenswrapper[4743]: I1122 10:17:31.241711 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:17:31 crc kubenswrapper[4743]: I1122 10:17:31.242248 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:17:37 crc kubenswrapper[4743]: I1122 10:17:37.245881 4743 scope.go:117] "RemoveContainer" containerID="b6973c3bee8f400bd9e9570332545a46be2360cb3d7105ce55aa89dc276d69b1"
Nov 22 10:17:37 crc kubenswrapper[4743]: I1122 10:17:37.279862 4743 scope.go:117] "RemoveContainer" containerID="ceca97e764fa576d858225fcf3bbdd07291418b769b76169581d4f154d20487e"
Nov 22 10:17:37 crc kubenswrapper[4743]: I1122 10:17:37.338546 4743 scope.go:117] "RemoveContainer" containerID="a14f738a065f4c7defdb2f542e6fe42fa6b15240b1e72be98929afc8a13777ac"
Nov 22 10:18:01 crc kubenswrapper[4743]: I1122 10:18:01.241323 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:18:01 crc kubenswrapper[4743]: I1122 10:18:01.241926 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:18:31 crc kubenswrapper[4743]: I1122 10:18:31.241796 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:18:31 crc kubenswrapper[4743]: I1122 10:18:31.242432 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:18:31 crc kubenswrapper[4743]: I1122 10:18:31.242490 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p"
Nov 22 10:18:31 crc kubenswrapper[4743]: I1122 10:18:31.243378 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0426b7b56555ab35404fad4ab48c6a869ddcec49b88d7575ac1795d9dd87a05"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 10:18:31 crc kubenswrapper[4743]: I1122 10:18:31.243425 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://d0426b7b56555ab35404fad4ab48c6a869ddcec49b88d7575ac1795d9dd87a05" gracePeriod=600
Nov 22 10:18:31 crc kubenswrapper[4743]: I1122 10:18:31.781990 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="d0426b7b56555ab35404fad4ab48c6a869ddcec49b88d7575ac1795d9dd87a05" exitCode=0
Nov 22 10:18:31 crc kubenswrapper[4743]: I1122 10:18:31.782062 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"d0426b7b56555ab35404fad4ab48c6a869ddcec49b88d7575ac1795d9dd87a05"}
Nov 22 10:18:31 crc kubenswrapper[4743]: I1122 10:18:31.782505 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307"}
Nov 22 10:18:31 crc kubenswrapper[4743]: I1122 10:18:31.782550 4743 scope.go:117] "RemoveContainer" containerID="2613fe8fbb1670631af320e89f0b9c9eed45e2a88445a55805243ec4a9f1bcb8"
Nov 22 10:18:37 crc kubenswrapper[4743]: I1122 10:18:37.468298 4743 scope.go:117] "RemoveContainer" containerID="5a82b645213bfde16185a0b4bbe9a6856677b25aa5ddc2bcc981623ed0b82ce8"
Nov 22 10:18:37 crc kubenswrapper[4743]: I1122 10:18:37.578690 4743 scope.go:117] "RemoveContainer" containerID="c1d43726fb42a1b7824b9154247eee5224af7af530e3e6b8d7faf3b0b120e354"
Nov 22 10:18:37 crc kubenswrapper[4743]: I1122 10:18:37.649406 4743 scope.go:117] "RemoveContainer" containerID="1e7f80b4a168b53e497a751fcecb364f74032a5d02a7d7802b8a1f8e1bc7b0b9"
Nov 22 10:20:00 crc kubenswrapper[4743]: E1122 10:20:00.272050 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60fc17e1_9296_450c_979c_bd863fb3dce6.slice/crio-conmon-8d0f44e4fc0e65412461d4778c2dd4a0bafbe24bf4003bae23bbc0861334b601.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60fc17e1_9296_450c_979c_bd863fb3dce6.slice/crio-8d0f44e4fc0e65412461d4778c2dd4a0bafbe24bf4003bae23bbc0861334b601.scope\": RecentStats: unable to find data in memory cache]"
Nov 22 10:20:00 crc kubenswrapper[4743]: I1122 10:20:00.602686 4743 generic.go:334] "Generic (PLEG): container finished" podID="60fc17e1-9296-450c-979c-bd863fb3dce6" containerID="8d0f44e4fc0e65412461d4778c2dd4a0bafbe24bf4003bae23bbc0861334b601" exitCode=0
Nov 22 10:20:00 crc kubenswrapper[4743]: I1122 10:20:00.602757 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" event={"ID":"60fc17e1-9296-450c-979c-bd863fb3dce6","Type":"ContainerDied","Data":"8d0f44e4fc0e65412461d4778c2dd4a0bafbe24bf4003bae23bbc0861334b601"}
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.107487 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q"
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.290108 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-inventory\") pod \"60fc17e1-9296-450c-979c-bd863fb3dce6\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") "
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.290260 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvrn2\" (UniqueName: \"kubernetes.io/projected/60fc17e1-9296-450c-979c-bd863fb3dce6-kube-api-access-xvrn2\") pod \"60fc17e1-9296-450c-979c-bd863fb3dce6\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") "
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.290282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-tripleo-cleanup-combined-ca-bundle\") pod \"60fc17e1-9296-450c-979c-bd863fb3dce6\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") "
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.290335 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ssh-key\") pod \"60fc17e1-9296-450c-979c-bd863fb3dce6\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") "
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.290427 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ceph\") pod \"60fc17e1-9296-450c-979c-bd863fb3dce6\" (UID: \"60fc17e1-9296-450c-979c-bd863fb3dce6\") "
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.295685 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60fc17e1-9296-450c-979c-bd863fb3dce6-kube-api-access-xvrn2" (OuterVolumeSpecName: "kube-api-access-xvrn2") pod "60fc17e1-9296-450c-979c-bd863fb3dce6" (UID: "60fc17e1-9296-450c-979c-bd863fb3dce6"). InnerVolumeSpecName "kube-api-access-xvrn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.296138 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ceph" (OuterVolumeSpecName: "ceph") pod "60fc17e1-9296-450c-979c-bd863fb3dce6" (UID: "60fc17e1-9296-450c-979c-bd863fb3dce6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.297896 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "60fc17e1-9296-450c-979c-bd863fb3dce6" (UID: "60fc17e1-9296-450c-979c-bd863fb3dce6"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.321400 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-inventory" (OuterVolumeSpecName: "inventory") pod "60fc17e1-9296-450c-979c-bd863fb3dce6" (UID: "60fc17e1-9296-450c-979c-bd863fb3dce6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.321893 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "60fc17e1-9296-450c-979c-bd863fb3dce6" (UID: "60fc17e1-9296-450c-979c-bd863fb3dce6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.394744 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvrn2\" (UniqueName: \"kubernetes.io/projected/60fc17e1-9296-450c-979c-bd863fb3dce6-kube-api-access-xvrn2\") on node \"crc\" DevicePath \"\""
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.394793 4743 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.394823 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.394845 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-ceph\") on node \"crc\" DevicePath \"\""
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.394862 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/60fc17e1-9296-450c-979c-bd863fb3dce6-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.622600 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" event={"ID":"60fc17e1-9296-450c-979c-bd863fb3dce6","Type":"ContainerDied","Data":"0660d7154eca5cdaca8f3d286f77f89bb0885b66d8798153e70a05582796d70b"}
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.622644 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0660d7154eca5cdaca8f3d286f77f89bb0885b66d8798153e70a05582796d70b"
Nov 22 10:20:02 crc kubenswrapper[4743]: I1122 10:20:02.623108 4743 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.640256 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-zmtf8"] Nov 22 10:20:11 crc kubenswrapper[4743]: E1122 10:20:11.641201 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerName="extract-utilities" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.641214 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerName="extract-utilities" Nov 22 10:20:11 crc kubenswrapper[4743]: E1122 10:20:11.641234 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60fc17e1-9296-450c-979c-bd863fb3dce6" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.641243 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="60fc17e1-9296-450c-979c-bd863fb3dce6" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 22 10:20:11 crc kubenswrapper[4743]: E1122 10:20:11.641273 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerName="registry-server" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.641279 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerName="registry-server" Nov 22 10:20:11 crc kubenswrapper[4743]: E1122 10:20:11.641293 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerName="extract-content" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.641299 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerName="extract-content" Nov 22 10:20:11 crc kubenswrapper[4743]: E1122 10:20:11.641312 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="096d6397-d699-489c-91ec-371da2bfc7d6" containerName="collect-profiles" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.641318 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="096d6397-d699-489c-91ec-371da2bfc7d6" containerName="collect-profiles" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.641504 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="096d6397-d699-489c-91ec-371da2bfc7d6" containerName="collect-profiles" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.641514 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf71ead-91e0-4fb3-8ed8-7738e726a02f" containerName="registry-server" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.641530 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="60fc17e1-9296-450c-979c-bd863fb3dce6" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.642359 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.646479 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.646750 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.646932 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.647043 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.654587 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-zmtf8"] Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.804015 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-inventory\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.804082 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ceph\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.804159 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.804198 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.804248 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w62bm\" (UniqueName: \"kubernetes.io/projected/78397a11-5fa6-4b3d-9c5b-09f32678adca-kube-api-access-w62bm\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.906139 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-inventory\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: 
I1122 10:20:11.906207 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ceph\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.906243 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.906283 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.906334 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w62bm\" (UniqueName: \"kubernetes.io/projected/78397a11-5fa6-4b3d-9c5b-09f32678adca-kube-api-access-w62bm\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.912195 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-inventory\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.912418 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.912781 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.926541 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ceph\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:11 crc kubenswrapper[4743]: I1122 10:20:11.927048 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w62bm\" (UniqueName: \"kubernetes.io/projected/78397a11-5fa6-4b3d-9c5b-09f32678adca-kube-api-access-w62bm\") pod \"bootstrap-openstack-openstack-cell1-zmtf8\" (UID: 
\"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:12 crc kubenswrapper[4743]: I1122 10:20:12.012210 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:20:12 crc kubenswrapper[4743]: I1122 10:20:12.569393 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-zmtf8"] Nov 22 10:20:12 crc kubenswrapper[4743]: I1122 10:20:12.570114 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 10:20:12 crc kubenswrapper[4743]: I1122 10:20:12.743382 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" event={"ID":"78397a11-5fa6-4b3d-9c5b-09f32678adca","Type":"ContainerStarted","Data":"901bcf6b9213f5ad169801ae1a89f38cfdf5eee52ff060f3bc18be7e7c819dcb"} Nov 22 10:20:14 crc kubenswrapper[4743]: I1122 10:20:14.766412 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" event={"ID":"78397a11-5fa6-4b3d-9c5b-09f32678adca","Type":"ContainerStarted","Data":"945e4535fd0a9733288451c57b6064409ec67642b2b0d6f2af17aca5ab004126"} Nov 22 10:20:14 crc kubenswrapper[4743]: I1122 10:20:14.791182 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" podStartSLOduration=2.522714603 podStartE2EDuration="3.791159634s" podCreationTimestamp="2025-11-22 10:20:11 +0000 UTC" firstStartedPulling="2025-11-22 10:20:12.569892163 +0000 UTC m=+7086.276253215" lastFinishedPulling="2025-11-22 10:20:13.838337194 +0000 UTC m=+7087.544698246" observedRunningTime="2025-11-22 10:20:14.783820103 +0000 UTC m=+7088.490181165" watchObservedRunningTime="2025-11-22 10:20:14.791159634 +0000 UTC m=+7088.497520686" Nov 22 10:20:31 crc kubenswrapper[4743]: I1122 10:20:31.241303 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:20:31 crc kubenswrapper[4743]: I1122 10:20:31.241908 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:21:01 crc kubenswrapper[4743]: I1122 10:21:01.241473 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:21:01 crc kubenswrapper[4743]: I1122 10:21:01.242141 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:21:31 crc kubenswrapper[4743]: I1122 10:21:31.241790 4743 patch_prober.go:28] interesting 
pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:21:31 crc kubenswrapper[4743]: I1122 10:21:31.242773 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:21:31 crc kubenswrapper[4743]: I1122 10:21:31.242900 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 10:21:31 crc kubenswrapper[4743]: I1122 10:21:31.244527 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:21:31 crc kubenswrapper[4743]: I1122 10:21:31.244671 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" gracePeriod=600 Nov 22 10:21:31 crc kubenswrapper[4743]: E1122 10:21:31.379916 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:21:31 crc kubenswrapper[4743]: I1122 10:21:31.523332 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" exitCode=0 Nov 22 10:21:31 crc kubenswrapper[4743]: I1122 10:21:31.523412 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307"} Nov 22 10:21:31 crc kubenswrapper[4743]: I1122 10:21:31.523759 4743 scope.go:117] "RemoveContainer" containerID="d0426b7b56555ab35404fad4ab48c6a869ddcec49b88d7575ac1795d9dd87a05" Nov 22 10:21:31 crc kubenswrapper[4743]: I1122 10:21:31.524958 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:21:31 crc kubenswrapper[4743]: E1122 10:21:31.525320 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:21:42 crc kubenswrapper[4743]: I1122 10:21:42.152843 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:21:42 crc kubenswrapper[4743]: E1122 10:21:42.153682 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:21:56 crc kubenswrapper[4743]: I1122 10:21:56.151718 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:21:56 crc kubenswrapper[4743]: E1122 10:21:56.152382 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:22:07 crc kubenswrapper[4743]: I1122 10:22:07.161016 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:22:07 crc kubenswrapper[4743]: E1122 10:22:07.162093 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.054800 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r299w"] Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.060643 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.064830 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r299w"] Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.171811 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-catalog-content\") pod \"community-operators-r299w\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.171946 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-utilities\") pod \"community-operators-r299w\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.172029 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fps9h\" (UniqueName: \"kubernetes.io/projected/93707677-3a77-4e97-a4f6-16a031bdd57b-kube-api-access-fps9h\") pod \"community-operators-r299w\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.273766 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-catalog-content\") pod \"community-operators-r299w\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.273840 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-utilities\") pod \"community-operators-r299w\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.273904 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fps9h\" (UniqueName: \"kubernetes.io/projected/93707677-3a77-4e97-a4f6-16a031bdd57b-kube-api-access-fps9h\") pod \"community-operators-r299w\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.274875 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-catalog-content\") pod \"community-operators-r299w\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.274924 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-utilities\") pod \"community-operators-r299w\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.304872 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fps9h\" (UniqueName: \"kubernetes.io/projected/93707677-3a77-4e97-a4f6-16a031bdd57b-kube-api-access-fps9h\") pod \"community-operators-r299w\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.384602 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:20 crc kubenswrapper[4743]: I1122 10:22:20.941001 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r299w"] Nov 22 10:22:21 crc kubenswrapper[4743]: I1122 10:22:21.056137 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r299w" event={"ID":"93707677-3a77-4e97-a4f6-16a031bdd57b","Type":"ContainerStarted","Data":"fec1c86dc41a53ffbf1e108d96b8266fbf8d93f77788dca52cf1988cf1acc960"} Nov 22 10:22:21 crc kubenswrapper[4743]: I1122 10:22:21.152800 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:22:21 crc kubenswrapper[4743]: E1122 10:22:21.153242 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:22:22 crc kubenswrapper[4743]: I1122 10:22:22.066133 4743 generic.go:334] "Generic (PLEG): container finished" podID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerID="b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab" exitCode=0 Nov 22 10:22:22 crc kubenswrapper[4743]: I1122 10:22:22.066181 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r299w" event={"ID":"93707677-3a77-4e97-a4f6-16a031bdd57b","Type":"ContainerDied","Data":"b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab"} Nov 22 10:22:23 crc kubenswrapper[4743]: I1122 10:22:23.080482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r299w" event={"ID":"93707677-3a77-4e97-a4f6-16a031bdd57b","Type":"ContainerStarted","Data":"babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58"} Nov 22 10:22:25 crc kubenswrapper[4743]: I1122 10:22:25.099916 4743 generic.go:334] "Generic (PLEG): container finished" podID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerID="babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58" exitCode=0 Nov 22 10:22:25 crc kubenswrapper[4743]: I1122 10:22:25.099969 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r299w" event={"ID":"93707677-3a77-4e97-a4f6-16a031bdd57b","Type":"ContainerDied","Data":"babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58"} Nov 22 10:22:26 crc kubenswrapper[4743]: I1122 10:22:26.113400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r299w" event={"ID":"93707677-3a77-4e97-a4f6-16a031bdd57b","Type":"ContainerStarted","Data":"1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508"} Nov 22 10:22:26 crc kubenswrapper[4743]: I1122 10:22:26.140943 4743 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r299w" podStartSLOduration=3.733268855 podStartE2EDuration="7.140918172s" podCreationTimestamp="2025-11-22 10:22:19 +0000 UTC" firstStartedPulling="2025-11-22 10:22:22.068050389 +0000 UTC m=+7215.774411441" lastFinishedPulling="2025-11-22 10:22:25.475699706 +0000 UTC m=+7219.182060758" observedRunningTime="2025-11-22 10:22:26.135166866 +0000 UTC m=+7219.841527928" watchObservedRunningTime="2025-11-22 10:22:26.140918172 +0000 UTC m=+7219.847279224" Nov 22 10:22:30 crc kubenswrapper[4743]: I1122 10:22:30.386035 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:30 crc kubenswrapper[4743]: I1122 10:22:30.386663 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:30 crc kubenswrapper[4743]: I1122 10:22:30.438753 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:31 crc kubenswrapper[4743]: I1122 10:22:31.207061 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:31 crc kubenswrapper[4743]: I1122 10:22:31.284078 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r299w"] Nov 22 10:22:33 crc kubenswrapper[4743]: I1122 10:22:33.175983 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r299w" podUID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerName="registry-server" containerID="cri-o://1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508" gracePeriod=2 Nov 22 10:22:33 crc kubenswrapper[4743]: I1122 10:22:33.777407 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:33 crc kubenswrapper[4743]: I1122 10:22:33.978391 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-catalog-content\") pod \"93707677-3a77-4e97-a4f6-16a031bdd57b\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " Nov 22 10:22:33 crc kubenswrapper[4743]: I1122 10:22:33.978773 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fps9h\" (UniqueName: \"kubernetes.io/projected/93707677-3a77-4e97-a4f6-16a031bdd57b-kube-api-access-fps9h\") pod \"93707677-3a77-4e97-a4f6-16a031bdd57b\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " Nov 22 10:22:33 crc kubenswrapper[4743]: I1122 10:22:33.978820 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-utilities\") pod \"93707677-3a77-4e97-a4f6-16a031bdd57b\" (UID: \"93707677-3a77-4e97-a4f6-16a031bdd57b\") " Nov 22 10:22:33 crc kubenswrapper[4743]: I1122 10:22:33.979792 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-utilities" (OuterVolumeSpecName: "utilities") pod "93707677-3a77-4e97-a4f6-16a031bdd57b" (UID: "93707677-3a77-4e97-a4f6-16a031bdd57b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:22:33 crc kubenswrapper[4743]: I1122 10:22:33.983913 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93707677-3a77-4e97-a4f6-16a031bdd57b-kube-api-access-fps9h" (OuterVolumeSpecName: "kube-api-access-fps9h") pod "93707677-3a77-4e97-a4f6-16a031bdd57b" (UID: "93707677-3a77-4e97-a4f6-16a031bdd57b"). InnerVolumeSpecName "kube-api-access-fps9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.035961 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93707677-3a77-4e97-a4f6-16a031bdd57b" (UID: "93707677-3a77-4e97-a4f6-16a031bdd57b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.080421 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.080462 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fps9h\" (UniqueName: \"kubernetes.io/projected/93707677-3a77-4e97-a4f6-16a031bdd57b-kube-api-access-fps9h\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.080473 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93707677-3a77-4e97-a4f6-16a031bdd57b-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.186444 4743 generic.go:334] "Generic (PLEG): container finished" podID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerID="1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508" exitCode=0 Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.186489 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r299w" event={"ID":"93707677-3a77-4e97-a4f6-16a031bdd57b","Type":"ContainerDied","Data":"1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508"} Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.186503 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r299w" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.186521 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r299w" event={"ID":"93707677-3a77-4e97-a4f6-16a031bdd57b","Type":"ContainerDied","Data":"fec1c86dc41a53ffbf1e108d96b8266fbf8d93f77788dca52cf1988cf1acc960"} Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.186541 4743 scope.go:117] "RemoveContainer" containerID="1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.214447 4743 scope.go:117] "RemoveContainer" containerID="babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.219238 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r299w"] Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.233337 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r299w"] Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.245549 4743 scope.go:117] "RemoveContainer" containerID="b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.287938 4743 scope.go:117] "RemoveContainer" containerID="1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508" Nov 22 10:22:34 crc kubenswrapper[4743]: E1122 10:22:34.288354 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508\": container with ID starting with 1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508 not found: ID does not exist" containerID="1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.288405 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508"} err="failed to get container status \"1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508\": rpc error: code = NotFound desc = could not find container \"1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508\": container with ID starting with 1bb5b41ad34e7108d68ea7f63f60362f4a8fdafe0a52f3c56bae5da14050e508 not found: ID does not exist" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.288431 4743 scope.go:117] "RemoveContainer" containerID="babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58" Nov 22 10:22:34 crc kubenswrapper[4743]: E1122 10:22:34.288973 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58\": container with ID starting with babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58 not found: ID does not exist" containerID="babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.289017 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58"} err="failed to get container status \"babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58\": rpc error: code = NotFound desc = could not find 
container \"babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58\": container with ID starting with babfb765767c52284b3fbba2ced6a8dbcb019e5b477843e7f63bc281471cdd58 not found: ID does not exist" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.289045 4743 scope.go:117] "RemoveContainer" containerID="b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab" Nov 22 10:22:34 crc kubenswrapper[4743]: E1122 10:22:34.289366 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab\": container with ID starting with b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab not found: ID does not exist" containerID="b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab" Nov 22 10:22:34 crc kubenswrapper[4743]: I1122 10:22:34.289431 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab"} err="failed to get container status \"b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab\": rpc error: code = NotFound desc = could not find container \"b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab\": container with ID starting with b0e217808bf9b781f468b4fb66bd350b0a74cdde4505decf6d532fb1756f9aab not found: ID does not exist" Nov 22 10:22:35 crc kubenswrapper[4743]: I1122 10:22:35.162262 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93707677-3a77-4e97-a4f6-16a031bdd57b" path="/var/lib/kubelet/pods/93707677-3a77-4e97-a4f6-16a031bdd57b/volumes" Nov 22 10:22:36 crc kubenswrapper[4743]: I1122 10:22:36.152201 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:22:36 crc kubenswrapper[4743]: E1122 10:22:36.152934 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.007255 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pwsdp"] Nov 22 10:22:43 crc kubenswrapper[4743]: E1122 10:22:43.009743 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerName="extract-content" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.009910 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerName="extract-content" Nov 22 10:22:43 crc kubenswrapper[4743]: E1122 10:22:43.010035 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerName="registry-server" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.010122 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerName="registry-server" Nov 22 10:22:43 crc kubenswrapper[4743]: E1122 10:22:43.010226 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerName="extract-utilities" Nov 22 10:22:43 
crc kubenswrapper[4743]: I1122 10:22:43.010309 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerName="extract-utilities" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.010682 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="93707677-3a77-4e97-a4f6-16a031bdd57b" containerName="registry-server" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.012945 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.017892 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwsdp"] Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.025097 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbknf\" (UniqueName: \"kubernetes.io/projected/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-kube-api-access-vbknf\") pod \"redhat-marketplace-pwsdp\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.025212 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-catalog-content\") pod \"redhat-marketplace-pwsdp\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.025266 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-utilities\") pod \"redhat-marketplace-pwsdp\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.126426 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-catalog-content\") pod \"redhat-marketplace-pwsdp\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.126492 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-utilities\") pod \"redhat-marketplace-pwsdp\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.126626 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbknf\" (UniqueName: \"kubernetes.io/projected/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-kube-api-access-vbknf\") pod \"redhat-marketplace-pwsdp\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.127407 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-catalog-content\") pod \"redhat-marketplace-pwsdp\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " 
pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.127697 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-utilities\") pod \"redhat-marketplace-pwsdp\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.155858 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbknf\" (UniqueName: \"kubernetes.io/projected/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-kube-api-access-vbknf\") pod \"redhat-marketplace-pwsdp\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.344697 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:43 crc kubenswrapper[4743]: I1122 10:22:43.843365 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwsdp"] Nov 22 10:22:44 crc kubenswrapper[4743]: I1122 10:22:44.300561 4743 generic.go:334] "Generic (PLEG): container finished" podID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerID="c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae" exitCode=0 Nov 22 10:22:44 crc kubenswrapper[4743]: I1122 10:22:44.300694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwsdp" event={"ID":"b0298bc8-bfda-438e-a0cb-d0ede0346ce9","Type":"ContainerDied","Data":"c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae"} Nov 22 10:22:44 crc kubenswrapper[4743]: I1122 10:22:44.300885 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwsdp" event={"ID":"b0298bc8-bfda-438e-a0cb-d0ede0346ce9","Type":"ContainerStarted","Data":"1c2a8a44a9af0b9b96c790656c09edb688a8bb68038fe4cae6bf4de533f2552f"} Nov 22 10:22:45 crc kubenswrapper[4743]: I1122 10:22:45.315223 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwsdp" event={"ID":"b0298bc8-bfda-438e-a0cb-d0ede0346ce9","Type":"ContainerStarted","Data":"08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67"} Nov 22 10:22:46 crc kubenswrapper[4743]: I1122 10:22:46.327897 4743 generic.go:334] "Generic (PLEG): container finished" podID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerID="08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67" exitCode=0 Nov 22 10:22:46 crc kubenswrapper[4743]: I1122 10:22:46.328005 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwsdp" event={"ID":"b0298bc8-bfda-438e-a0cb-d0ede0346ce9","Type":"ContainerDied","Data":"08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67"} Nov 22 10:22:47 crc kubenswrapper[4743]: I1122 10:22:47.339555 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwsdp" event={"ID":"b0298bc8-bfda-438e-a0cb-d0ede0346ce9","Type":"ContainerStarted","Data":"e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb"} Nov 22 10:22:47 crc kubenswrapper[4743]: I1122 10:22:47.363965 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pwsdp" podStartSLOduration=2.93092287 
podStartE2EDuration="5.36394758s" podCreationTimestamp="2025-11-22 10:22:42 +0000 UTC" firstStartedPulling="2025-11-22 10:22:44.302635005 +0000 UTC m=+7238.008996057" lastFinishedPulling="2025-11-22 10:22:46.735659715 +0000 UTC m=+7240.442020767" observedRunningTime="2025-11-22 10:22:47.363566729 +0000 UTC m=+7241.069927781" watchObservedRunningTime="2025-11-22 10:22:47.36394758 +0000 UTC m=+7241.070308622" Nov 22 10:22:49 crc kubenswrapper[4743]: I1122 10:22:49.152062 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:22:49 crc kubenswrapper[4743]: E1122 10:22:49.152708 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:22:53 crc kubenswrapper[4743]: I1122 10:22:53.346234 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:53 crc kubenswrapper[4743]: I1122 10:22:53.346762 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:53 crc kubenswrapper[4743]: I1122 10:22:53.401953 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:53 crc kubenswrapper[4743]: I1122 10:22:53.480178 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:53 crc kubenswrapper[4743]: I1122 10:22:53.651955 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwsdp"] Nov 22 10:22:55 crc kubenswrapper[4743]: I1122 10:22:55.424971 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pwsdp" podUID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerName="registry-server" containerID="cri-o://e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb" gracePeriod=2 Nov 22 10:22:55 crc kubenswrapper[4743]: I1122 10:22:55.914000 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.042928 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-utilities\") pod \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.042998 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbknf\" (UniqueName: \"kubernetes.io/projected/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-kube-api-access-vbknf\") pod \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.043309 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-catalog-content\") pod \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\" (UID: \"b0298bc8-bfda-438e-a0cb-d0ede0346ce9\") " Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.045037 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-utilities" (OuterVolumeSpecName: "utilities") pod "b0298bc8-bfda-438e-a0cb-d0ede0346ce9" (UID: "b0298bc8-bfda-438e-a0cb-d0ede0346ce9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.048434 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-kube-api-access-vbknf" (OuterVolumeSpecName: "kube-api-access-vbknf") pod "b0298bc8-bfda-438e-a0cb-d0ede0346ce9" (UID: "b0298bc8-bfda-438e-a0cb-d0ede0346ce9"). InnerVolumeSpecName "kube-api-access-vbknf". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.061545 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0298bc8-bfda-438e-a0cb-d0ede0346ce9" (UID: "b0298bc8-bfda-438e-a0cb-d0ede0346ce9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.145942 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.145968 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.145980 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbknf\" (UniqueName: \"kubernetes.io/projected/b0298bc8-bfda-438e-a0cb-d0ede0346ce9-kube-api-access-vbknf\") on node \"crc\" DevicePath \"\"" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.435027 4743 generic.go:334] "Generic (PLEG): container finished" podID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerID="e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb" exitCode=0 Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.435079 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pwsdp" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.435094 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwsdp" event={"ID":"b0298bc8-bfda-438e-a0cb-d0ede0346ce9","Type":"ContainerDied","Data":"e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb"} Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.436210 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pwsdp" event={"ID":"b0298bc8-bfda-438e-a0cb-d0ede0346ce9","Type":"ContainerDied","Data":"1c2a8a44a9af0b9b96c790656c09edb688a8bb68038fe4cae6bf4de533f2552f"} Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.436229 4743 scope.go:117] "RemoveContainer" containerID="e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.460878 4743 scope.go:117] "RemoveContainer" containerID="08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.484395 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwsdp"] Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.499321 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pwsdp"] Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.510193 4743 scope.go:117] "RemoveContainer" containerID="c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.549592 4743 scope.go:117] "RemoveContainer" containerID="e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb" Nov 22 10:22:56 crc kubenswrapper[4743]: E1122 10:22:56.550072 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb\": container with ID starting with e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb not found: ID does not exist" containerID="e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.550119 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb"} err="failed to get container status \"e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb\": rpc error: code = NotFound desc = could not find container \"e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb\": container with ID starting with e834179a48f2b9c87f3267355992863b4380b37cfb39e4fe45f33f27835337bb not found: ID does not exist" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.550153 4743 scope.go:117] "RemoveContainer" containerID="08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67" Nov 22 10:22:56 crc kubenswrapper[4743]: E1122 10:22:56.550502 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67\": container with ID starting with 08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67 not found: ID does not exist" containerID="08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.550543 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67"} err="failed to get container status \"08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67\": rpc error: code = NotFound desc = could not find container \"08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67\": container with ID starting with 08618c5f425824e4fce3ca16a2c4b7d5089988a343aa3c50f7d8d30b273a3e67 not found: ID does not exist" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.550583 4743 scope.go:117] "RemoveContainer" containerID="c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae" Nov 22 10:22:56 crc kubenswrapper[4743]: E1122 10:22:56.550920 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae\": container with ID starting with c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae not found: ID does not exist" containerID="c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae" Nov 22 10:22:56 crc kubenswrapper[4743]: I1122 10:22:56.550967 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae"} err="failed to get container status \"c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae\": rpc error: code = NotFound desc = could not find container \"c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae\": container with ID starting with c72fb7583cb667fd261ac4aa8333cef19d1fb7ea05a9b8eaa7cc9f683070c5ae not found: ID does not exist" Nov 22 10:22:57 crc kubenswrapper[4743]: I1122 10:22:57.169506 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" path="/var/lib/kubelet/pods/b0298bc8-bfda-438e-a0cb-d0ede0346ce9/volumes" Nov 22 10:23:04 crc kubenswrapper[4743]: I1122 10:23:04.151661 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:23:04 crc kubenswrapper[4743]: E1122 10:23:04.152821 4743 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:23:15 crc kubenswrapper[4743]: I1122 10:23:15.152286 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:23:15 crc kubenswrapper[4743]: E1122 10:23:15.153228 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:23:18 crc kubenswrapper[4743]: I1122 10:23:18.646909 4743 generic.go:334] "Generic (PLEG): container finished" podID="78397a11-5fa6-4b3d-9c5b-09f32678adca" containerID="945e4535fd0a9733288451c57b6064409ec67642b2b0d6f2af17aca5ab004126" exitCode=0 Nov 22 10:23:18 crc kubenswrapper[4743]: I1122 10:23:18.647057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" event={"ID":"78397a11-5fa6-4b3d-9c5b-09f32678adca","Type":"ContainerDied","Data":"945e4535fd0a9733288451c57b6064409ec67642b2b0d6f2af17aca5ab004126"} Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.140844 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.280561 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-inventory\") pod \"78397a11-5fa6-4b3d-9c5b-09f32678adca\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.280788 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ssh-key\") pod \"78397a11-5fa6-4b3d-9c5b-09f32678adca\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.280872 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ceph\") pod \"78397a11-5fa6-4b3d-9c5b-09f32678adca\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.280937 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w62bm\" (UniqueName: \"kubernetes.io/projected/78397a11-5fa6-4b3d-9c5b-09f32678adca-kube-api-access-w62bm\") pod \"78397a11-5fa6-4b3d-9c5b-09f32678adca\" (UID: \"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.281080 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-bootstrap-combined-ca-bundle\") pod \"78397a11-5fa6-4b3d-9c5b-09f32678adca\" (UID: 
\"78397a11-5fa6-4b3d-9c5b-09f32678adca\") " Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.298735 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ceph" (OuterVolumeSpecName: "ceph") pod "78397a11-5fa6-4b3d-9c5b-09f32678adca" (UID: "78397a11-5fa6-4b3d-9c5b-09f32678adca"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.298773 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78397a11-5fa6-4b3d-9c5b-09f32678adca-kube-api-access-w62bm" (OuterVolumeSpecName: "kube-api-access-w62bm") pod "78397a11-5fa6-4b3d-9c5b-09f32678adca" (UID: "78397a11-5fa6-4b3d-9c5b-09f32678adca"). InnerVolumeSpecName "kube-api-access-w62bm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.298883 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "78397a11-5fa6-4b3d-9c5b-09f32678adca" (UID: "78397a11-5fa6-4b3d-9c5b-09f32678adca"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.311409 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-inventory" (OuterVolumeSpecName: "inventory") pod "78397a11-5fa6-4b3d-9c5b-09f32678adca" (UID: "78397a11-5fa6-4b3d-9c5b-09f32678adca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.321274 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "78397a11-5fa6-4b3d-9c5b-09f32678adca" (UID: "78397a11-5fa6-4b3d-9c5b-09f32678adca"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.384111 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.384149 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w62bm\" (UniqueName: \"kubernetes.io/projected/78397a11-5fa6-4b3d-9c5b-09f32678adca-kube-api-access-w62bm\") on node \"crc\" DevicePath \"\"" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.384166 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.384175 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.384184 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78397a11-5fa6-4b3d-9c5b-09f32678adca-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.678820 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" event={"ID":"78397a11-5fa6-4b3d-9c5b-09f32678adca","Type":"ContainerDied","Data":"901bcf6b9213f5ad169801ae1a89f38cfdf5eee52ff060f3bc18be7e7c819dcb"} Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.678864 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="901bcf6b9213f5ad169801ae1a89f38cfdf5eee52ff060f3bc18be7e7c819dcb" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.678939 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-zmtf8" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.749821 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fqxgm"] Nov 22 10:23:20 crc kubenswrapper[4743]: E1122 10:23:20.750416 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerName="registry-server" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.750439 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerName="registry-server" Nov 22 10:23:20 crc kubenswrapper[4743]: E1122 10:23:20.750485 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerName="extract-content" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.750494 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerName="extract-content" Nov 22 10:23:20 crc kubenswrapper[4743]: E1122 10:23:20.750524 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerName="extract-utilities" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.750534 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerName="extract-utilities" Nov 22 10:23:20 crc kubenswrapper[4743]: E1122 10:23:20.750547 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78397a11-5fa6-4b3d-9c5b-09f32678adca" containerName="bootstrap-openstack-openstack-cell1" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.750556 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="78397a11-5fa6-4b3d-9c5b-09f32678adca" containerName="bootstrap-openstack-openstack-cell1" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.750822 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0298bc8-bfda-438e-a0cb-d0ede0346ce9" containerName="registry-server" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.750865 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="78397a11-5fa6-4b3d-9c5b-09f32678adca" containerName="bootstrap-openstack-openstack-cell1" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.751885 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.758508 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.758916 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.759033 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.759051 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.761906 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fqxgm"] Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.892403 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ceph\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.892517 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-inventory\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.892618 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ssh-key\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.892735 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xmnm\" (UniqueName: \"kubernetes.io/projected/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-kube-api-access-8xmnm\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.994499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-inventory\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.994629 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ssh-key\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:20 crc 
kubenswrapper[4743]: I1122 10:23:20.994675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xmnm\" (UniqueName: \"kubernetes.io/projected/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-kube-api-access-8xmnm\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.994740 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ceph\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.998636 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-inventory\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:20 crc kubenswrapper[4743]: I1122 10:23:20.998655 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ssh-key\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:21 crc kubenswrapper[4743]: I1122 10:23:21.004763 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ceph\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:21 crc kubenswrapper[4743]: I1122 10:23:21.016033 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xmnm\" (UniqueName: \"kubernetes.io/projected/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-kube-api-access-8xmnm\") pod \"download-cache-openstack-openstack-cell1-fqxgm\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:21 crc kubenswrapper[4743]: I1122 10:23:21.075173 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:23:21 crc kubenswrapper[4743]: I1122 10:23:21.620977 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-fqxgm"] Nov 22 10:23:21 crc kubenswrapper[4743]: I1122 10:23:21.696337 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" event={"ID":"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a","Type":"ContainerStarted","Data":"8d5658fbf8dea8d912628edebfe30ecc979544d645c7c804d134d249a8528f32"} Nov 22 10:23:22 crc kubenswrapper[4743]: I1122 10:23:22.717038 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" event={"ID":"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a","Type":"ContainerStarted","Data":"071ca548402028fd093501a493554c0df2acbb76acca4a55029aa0f21c39e9e5"} Nov 22 10:23:22 crc kubenswrapper[4743]: I1122 10:23:22.744531 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" podStartSLOduration=2.286584125 podStartE2EDuration="2.74451628s" podCreationTimestamp="2025-11-22 10:23:20 +0000 UTC" firstStartedPulling="2025-11-22 10:23:21.618428525 +0000 UTC m=+7275.324789577" lastFinishedPulling="2025-11-22 10:23:22.07636068 +0000 UTC m=+7275.782721732" observedRunningTime="2025-11-22 10:23:22.739125295 +0000 UTC m=+7276.445486347" watchObservedRunningTime="2025-11-22 10:23:22.74451628 +0000 UTC m=+7276.450877332" Nov 22 10:23:26 crc kubenswrapper[4743]: I1122 10:23:26.152137 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:23:26 crc kubenswrapper[4743]: E1122 10:23:26.152876 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:23:37 crc kubenswrapper[4743]: I1122 10:23:37.159261 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:23:37 crc kubenswrapper[4743]: E1122 10:23:37.160479 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:23:49 crc kubenswrapper[4743]: I1122 10:23:49.151490 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:23:49 crc kubenswrapper[4743]: E1122 10:23:49.152239 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" 
podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.811738 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nhl7r"] Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.814452 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.829107 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-catalog-content\") pod \"certified-operators-nhl7r\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.829229 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-utilities\") pod \"certified-operators-nhl7r\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.829331 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bswjl\" (UniqueName: \"kubernetes.io/projected/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-kube-api-access-bswjl\") pod \"certified-operators-nhl7r\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.838186 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nhl7r"] Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.932196 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bswjl\" (UniqueName: \"kubernetes.io/projected/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-kube-api-access-bswjl\") pod \"certified-operators-nhl7r\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.932363 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-catalog-content\") pod \"certified-operators-nhl7r\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.932423 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-utilities\") pod \"certified-operators-nhl7r\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.932994 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-catalog-content\") pod \"certified-operators-nhl7r\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.932994 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-utilities\") pod \"certified-operators-nhl7r\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:53 crc kubenswrapper[4743]: I1122 10:23:53.958321 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bswjl\" (UniqueName: \"kubernetes.io/projected/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-kube-api-access-bswjl\") pod \"certified-operators-nhl7r\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:54 crc kubenswrapper[4743]: I1122 10:23:54.136794 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:23:54 crc kubenswrapper[4743]: I1122 10:23:54.799259 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nhl7r"] Nov 22 10:23:55 crc kubenswrapper[4743]: I1122 10:23:55.027248 4743 generic.go:334] "Generic (PLEG): container finished" podID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerID="802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b" exitCode=0 Nov 22 10:23:55 crc kubenswrapper[4743]: I1122 10:23:55.027299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhl7r" event={"ID":"53f928b3-cffa-4b56-9253-d3af1e5e6c8d","Type":"ContainerDied","Data":"802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b"} Nov 22 10:23:55 crc kubenswrapper[4743]: I1122 10:23:55.027349 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhl7r" event={"ID":"53f928b3-cffa-4b56-9253-d3af1e5e6c8d","Type":"ContainerStarted","Data":"b2282700c2dbe0d650e59516c3c3f99ad3b1493657059e8a95a2416ff8ac2419"} Nov 22 10:23:56 crc kubenswrapper[4743]: I1122 10:23:56.043299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhl7r" event={"ID":"53f928b3-cffa-4b56-9253-d3af1e5e6c8d","Type":"ContainerStarted","Data":"790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e"} Nov 22 10:23:58 crc kubenswrapper[4743]: I1122 10:23:58.065237 4743 generic.go:334] "Generic (PLEG): container finished" podID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerID="790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e" exitCode=0 Nov 22 10:23:58 crc kubenswrapper[4743]: I1122 10:23:58.065277 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhl7r" event={"ID":"53f928b3-cffa-4b56-9253-d3af1e5e6c8d","Type":"ContainerDied","Data":"790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e"} Nov 22 10:23:59 crc kubenswrapper[4743]: I1122 10:23:59.080522 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhl7r" event={"ID":"53f928b3-cffa-4b56-9253-d3af1e5e6c8d","Type":"ContainerStarted","Data":"5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057"} Nov 22 10:24:04 crc kubenswrapper[4743]: I1122 10:24:04.137353 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:24:04 crc kubenswrapper[4743]: I1122 10:24:04.138000 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:24:04 crc kubenswrapper[4743]: 
I1122 10:24:04.151228 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:24:04 crc kubenswrapper[4743]: E1122 10:24:04.151483 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:24:04 crc kubenswrapper[4743]: I1122 10:24:04.188663 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:24:04 crc kubenswrapper[4743]: I1122 10:24:04.212495 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nhl7r" podStartSLOduration=7.638329984 podStartE2EDuration="11.21246905s" podCreationTimestamp="2025-11-22 10:23:53 +0000 UTC" firstStartedPulling="2025-11-22 10:23:55.029313437 +0000 UTC m=+7308.735674489" lastFinishedPulling="2025-11-22 10:23:58.603452463 +0000 UTC m=+7312.309813555" observedRunningTime="2025-11-22 10:23:59.105080353 +0000 UTC m=+7312.811441415" watchObservedRunningTime="2025-11-22 10:24:04.21246905 +0000 UTC m=+7317.918830102" Nov 22 10:24:05 crc kubenswrapper[4743]: I1122 10:24:05.196281 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:24:05 crc kubenswrapper[4743]: I1122 10:24:05.246403 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nhl7r"] Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.170676 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nhl7r" podUID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerName="registry-server" containerID="cri-o://5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057" gracePeriod=2 Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.688532 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.734043 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-catalog-content\") pod \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.734174 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bswjl\" (UniqueName: \"kubernetes.io/projected/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-kube-api-access-bswjl\") pod \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.735332 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-utilities\") pod \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\" (UID: \"53f928b3-cffa-4b56-9253-d3af1e5e6c8d\") " Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.736285 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-utilities" (OuterVolumeSpecName: "utilities") pod "53f928b3-cffa-4b56-9253-d3af1e5e6c8d" (UID: "53f928b3-cffa-4b56-9253-d3af1e5e6c8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.742073 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-kube-api-access-bswjl" (OuterVolumeSpecName: "kube-api-access-bswjl") pod "53f928b3-cffa-4b56-9253-d3af1e5e6c8d" (UID: "53f928b3-cffa-4b56-9253-d3af1e5e6c8d"). InnerVolumeSpecName "kube-api-access-bswjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.778543 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53f928b3-cffa-4b56-9253-d3af1e5e6c8d" (UID: "53f928b3-cffa-4b56-9253-d3af1e5e6c8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.837779 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.837812 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:24:07 crc kubenswrapper[4743]: I1122 10:24:07.837823 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bswjl\" (UniqueName: \"kubernetes.io/projected/53f928b3-cffa-4b56-9253-d3af1e5e6c8d-kube-api-access-bswjl\") on node \"crc\" DevicePath \"\"" Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.182176 4743 generic.go:334] "Generic (PLEG): container finished" podID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerID="5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057" exitCode=0 Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.182226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhl7r" event={"ID":"53f928b3-cffa-4b56-9253-d3af1e5e6c8d","Type":"ContainerDied","Data":"5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057"} Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.182257 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nhl7r" event={"ID":"53f928b3-cffa-4b56-9253-d3af1e5e6c8d","Type":"ContainerDied","Data":"b2282700c2dbe0d650e59516c3c3f99ad3b1493657059e8a95a2416ff8ac2419"} Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.182259 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nhl7r" Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.182274 4743 scope.go:117] "RemoveContainer" containerID="5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057" Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.203745 4743 scope.go:117] "RemoveContainer" containerID="790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e" Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.235082 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nhl7r"] Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.244192 4743 scope.go:117] "RemoveContainer" containerID="802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b" Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.245798 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nhl7r"] Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.285393 4743 scope.go:117] "RemoveContainer" containerID="5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057" Nov 22 10:24:08 crc kubenswrapper[4743]: E1122 10:24:08.285986 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057\": container with ID starting with 5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057 not found: ID does not exist" containerID="5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057" Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.286020 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057"} err="failed to get container status \"5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057\": rpc error: code = NotFound desc = could not find container \"5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057\": container with ID starting with 5d721b73a1a04a0ff49c5dc71946ba164259c9eafc47735910b595a5f3364057 not found: ID does not exist" Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.286043 4743 scope.go:117] "RemoveContainer" containerID="790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e" Nov 22 10:24:08 crc kubenswrapper[4743]: E1122 10:24:08.286347 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e\": container with ID starting with 790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e not found: ID does not exist" containerID="790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e" Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.286371 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e"} err="failed to get container status \"790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e\": rpc error: code = NotFound desc = could not find container \"790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e\": container with ID starting with 790a628369085c0b7dbaac1299ab557d05d9e289e1682217394499f3712d327e not found: ID does not exist" Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.286387 4743 scope.go:117] "RemoveContainer" 
containerID="802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b" Nov 22 10:24:08 crc kubenswrapper[4743]: E1122 10:24:08.286799 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b\": container with ID starting with 802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b not found: ID does not exist" containerID="802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b" Nov 22 10:24:08 crc kubenswrapper[4743]: I1122 10:24:08.286845 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b"} err="failed to get container status \"802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b\": rpc error: code = NotFound desc = could not find container \"802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b\": container with ID starting with 802bf9252d6cc63078ceb6fe70345224a114afef73d86f7b04c62abc0edb6f1b not found: ID does not exist" Nov 22 10:24:09 crc kubenswrapper[4743]: I1122 10:24:09.164124 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" path="/var/lib/kubelet/pods/53f928b3-cffa-4b56-9253-d3af1e5e6c8d/volumes" Nov 22 10:24:19 crc kubenswrapper[4743]: I1122 10:24:19.152741 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:24:19 crc kubenswrapper[4743]: E1122 10:24:19.157783 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:24:33 crc kubenswrapper[4743]: I1122 10:24:33.152014 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:24:33 crc kubenswrapper[4743]: E1122 10:24:33.152782 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:24:45 crc kubenswrapper[4743]: I1122 10:24:45.151951 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:24:45 crc kubenswrapper[4743]: E1122 10:24:45.152918 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:24:52 crc kubenswrapper[4743]: I1122 10:24:52.639770 4743 generic.go:334] "Generic (PLEG): container finished" 
podID="8fc576b2-ff5d-47bd-bfae-9cbcc92c632a" containerID="071ca548402028fd093501a493554c0df2acbb76acca4a55029aa0f21c39e9e5" exitCode=0 Nov 22 10:24:52 crc kubenswrapper[4743]: I1122 10:24:52.639864 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" event={"ID":"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a","Type":"ContainerDied","Data":"071ca548402028fd093501a493554c0df2acbb76acca4a55029aa0f21c39e9e5"} Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.117259 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.275847 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xmnm\" (UniqueName: \"kubernetes.io/projected/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-kube-api-access-8xmnm\") pod \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.276115 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ceph\") pod \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.276152 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ssh-key\") pod \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.276183 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-inventory\") pod \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\" (UID: \"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a\") " Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.287779 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ceph" (OuterVolumeSpecName: "ceph") pod "8fc576b2-ff5d-47bd-bfae-9cbcc92c632a" (UID: "8fc576b2-ff5d-47bd-bfae-9cbcc92c632a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.287891 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-kube-api-access-8xmnm" (OuterVolumeSpecName: "kube-api-access-8xmnm") pod "8fc576b2-ff5d-47bd-bfae-9cbcc92c632a" (UID: "8fc576b2-ff5d-47bd-bfae-9cbcc92c632a"). InnerVolumeSpecName "kube-api-access-8xmnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.307222 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8fc576b2-ff5d-47bd-bfae-9cbcc92c632a" (UID: "8fc576b2-ff5d-47bd-bfae-9cbcc92c632a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.309640 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-inventory" (OuterVolumeSpecName: "inventory") pod "8fc576b2-ff5d-47bd-bfae-9cbcc92c632a" (UID: "8fc576b2-ff5d-47bd-bfae-9cbcc92c632a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.379124 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.379194 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.379209 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.379221 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xmnm\" (UniqueName: \"kubernetes.io/projected/8fc576b2-ff5d-47bd-bfae-9cbcc92c632a-kube-api-access-8xmnm\") on node \"crc\" DevicePath \"\"" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.661561 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" event={"ID":"8fc576b2-ff5d-47bd-bfae-9cbcc92c632a","Type":"ContainerDied","Data":"8d5658fbf8dea8d912628edebfe30ecc979544d645c7c804d134d249a8528f32"} Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.661628 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d5658fbf8dea8d912628edebfe30ecc979544d645c7c804d134d249a8528f32" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.661640 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-fqxgm" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.744103 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vqx9f"] Nov 22 10:24:54 crc kubenswrapper[4743]: E1122 10:24:54.744538 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerName="registry-server" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.744556 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerName="registry-server" Nov 22 10:24:54 crc kubenswrapper[4743]: E1122 10:24:54.744765 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerName="extract-content" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.744781 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerName="extract-content" Nov 22 10:24:54 crc kubenswrapper[4743]: E1122 10:24:54.744798 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerName="extract-utilities" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.744805 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerName="extract-utilities" Nov 22 10:24:54 crc kubenswrapper[4743]: E1122 10:24:54.744819 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc576b2-ff5d-47bd-bfae-9cbcc92c632a" containerName="download-cache-openstack-openstack-cell1" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.744825 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc576b2-ff5d-47bd-bfae-9cbcc92c632a" containerName="download-cache-openstack-openstack-cell1" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.745054 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc576b2-ff5d-47bd-bfae-9cbcc92c632a" containerName="download-cache-openstack-openstack-cell1" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.745079 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f928b3-cffa-4b56-9253-d3af1e5e6c8d" containerName="registry-server" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.745889 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.748468 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.748901 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.749540 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.753830 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.762619 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vqx9f"] Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.890954 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ceph\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.891016 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfrl6\" (UniqueName: \"kubernetes.io/projected/bac0d1da-40df-4390-a976-bd5e354f7e4e-kube-api-access-xfrl6\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.891052 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-inventory\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.891083 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.993503 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ceph\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.993548 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfrl6\" (UniqueName: \"kubernetes.io/projected/bac0d1da-40df-4390-a976-bd5e354f7e4e-kube-api-access-xfrl6\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " 
pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.993566 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-inventory\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.993603 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.997371 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-inventory\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.997414 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ceph\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:54 crc kubenswrapper[4743]: I1122 10:24:54.998795 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ssh-key\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:55 crc kubenswrapper[4743]: I1122 10:24:55.012544 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfrl6\" (UniqueName: \"kubernetes.io/projected/bac0d1da-40df-4390-a976-bd5e354f7e4e-kube-api-access-xfrl6\") pod \"configure-network-openstack-openstack-cell1-vqx9f\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:55 crc kubenswrapper[4743]: I1122 10:24:55.062101 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:24:55 crc kubenswrapper[4743]: I1122 10:24:55.584616 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-vqx9f"] Nov 22 10:24:55 crc kubenswrapper[4743]: I1122 10:24:55.671519 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" event={"ID":"bac0d1da-40df-4390-a976-bd5e354f7e4e","Type":"ContainerStarted","Data":"6308b616c1a82a22b01e6aba0ed5de93aa105606f8d1ffdd599153d2628ed22f"} Nov 22 10:24:56 crc kubenswrapper[4743]: I1122 10:24:56.684729 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" event={"ID":"bac0d1da-40df-4390-a976-bd5e354f7e4e","Type":"ContainerStarted","Data":"ad0bf4ac445efd565ed0532b5792c2bca79af49103e62e00bca94395e6656f2a"} Nov 22 10:24:56 crc kubenswrapper[4743]: I1122 10:24:56.706555 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" podStartSLOduration=2.20619162 podStartE2EDuration="2.706531373s" podCreationTimestamp="2025-11-22 10:24:54 +0000 UTC" firstStartedPulling="2025-11-22 10:24:55.593946106 +0000 UTC m=+7369.300307158" lastFinishedPulling="2025-11-22 10:24:56.094285859 +0000 UTC m=+7369.800646911" observedRunningTime="2025-11-22 10:24:56.701217961 +0000 UTC m=+7370.407579053" watchObservedRunningTime="2025-11-22 10:24:56.706531373 +0000 UTC m=+7370.412892455" Nov 22 10:24:58 crc kubenswrapper[4743]: I1122 10:24:58.152343 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:24:58 crc kubenswrapper[4743]: E1122 10:24:58.152992 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.836128 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-95nhv"] Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.843868 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.846933 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95nhv"] Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.854205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-utilities\") pod \"redhat-operators-95nhv\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.854271 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-catalog-content\") pod \"redhat-operators-95nhv\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.854311 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlwd4\" (UniqueName: \"kubernetes.io/projected/2386d073-b0b8-47bd-a2fe-765b42e6f76a-kube-api-access-qlwd4\") pod \"redhat-operators-95nhv\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.955995 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-utilities\") pod \"redhat-operators-95nhv\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.956056 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-catalog-content\") pod \"redhat-operators-95nhv\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.956095 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlwd4\" (UniqueName: \"kubernetes.io/projected/2386d073-b0b8-47bd-a2fe-765b42e6f76a-kube-api-access-qlwd4\") pod \"redhat-operators-95nhv\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.956562 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-utilities\") pod \"redhat-operators-95nhv\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.956668 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-catalog-content\") pod \"redhat-operators-95nhv\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:01 crc kubenswrapper[4743]: I1122 10:25:01.975919 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qlwd4\" (UniqueName: \"kubernetes.io/projected/2386d073-b0b8-47bd-a2fe-765b42e6f76a-kube-api-access-qlwd4\") pod \"redhat-operators-95nhv\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:02 crc kubenswrapper[4743]: I1122 10:25:02.179516 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:02 crc kubenswrapper[4743]: I1122 10:25:02.679589 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95nhv"] Nov 22 10:25:02 crc kubenswrapper[4743]: I1122 10:25:02.740816 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95nhv" event={"ID":"2386d073-b0b8-47bd-a2fe-765b42e6f76a","Type":"ContainerStarted","Data":"36f093c98106ba892f0472dd5d6d7bfbe4160e3df8559899c1c89af79ea0b779"} Nov 22 10:25:03 crc kubenswrapper[4743]: I1122 10:25:03.751920 4743 generic.go:334] "Generic (PLEG): container finished" podID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerID="8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc" exitCode=0 Nov 22 10:25:03 crc kubenswrapper[4743]: I1122 10:25:03.751972 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95nhv" event={"ID":"2386d073-b0b8-47bd-a2fe-765b42e6f76a","Type":"ContainerDied","Data":"8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc"} Nov 22 10:25:04 crc kubenswrapper[4743]: I1122 10:25:04.766262 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95nhv" event={"ID":"2386d073-b0b8-47bd-a2fe-765b42e6f76a","Type":"ContainerStarted","Data":"c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017"} Nov 22 10:25:08 crc kubenswrapper[4743]: I1122 10:25:08.808135 4743 generic.go:334] "Generic (PLEG): container finished" podID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerID="c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017" exitCode=0 Nov 22 10:25:08 crc kubenswrapper[4743]: I1122 10:25:08.808198 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95nhv" event={"ID":"2386d073-b0b8-47bd-a2fe-765b42e6f76a","Type":"ContainerDied","Data":"c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017"} Nov 22 10:25:09 crc kubenswrapper[4743]: I1122 10:25:09.824762 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95nhv" event={"ID":"2386d073-b0b8-47bd-a2fe-765b42e6f76a","Type":"ContainerStarted","Data":"0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4"} Nov 22 10:25:09 crc kubenswrapper[4743]: I1122 10:25:09.842943 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-95nhv" podStartSLOduration=3.379442375 podStartE2EDuration="8.842921394s" podCreationTimestamp="2025-11-22 10:25:01 +0000 UTC" firstStartedPulling="2025-11-22 10:25:03.754061473 +0000 UTC m=+7377.460422525" lastFinishedPulling="2025-11-22 10:25:09.217540492 +0000 UTC m=+7382.923901544" observedRunningTime="2025-11-22 10:25:09.840205726 +0000 UTC m=+7383.546566778" watchObservedRunningTime="2025-11-22 10:25:09.842921394 +0000 UTC m=+7383.549282446" Nov 22 10:25:11 crc kubenswrapper[4743]: I1122 10:25:11.157538 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 
10:25:11 crc kubenswrapper[4743]: E1122 10:25:11.158226 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:25:12 crc kubenswrapper[4743]: I1122 10:25:12.180132 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:12 crc kubenswrapper[4743]: I1122 10:25:12.180202 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:13 crc kubenswrapper[4743]: I1122 10:25:13.228746 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-95nhv" podUID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerName="registry-server" probeResult="failure" output=< Nov 22 10:25:13 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 10:25:13 crc kubenswrapper[4743]: > Nov 22 10:25:22 crc kubenswrapper[4743]: I1122 10:25:22.224788 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:22 crc kubenswrapper[4743]: I1122 10:25:22.275482 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:22 crc kubenswrapper[4743]: I1122 10:25:22.460322 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95nhv"] Nov 22 10:25:23 crc kubenswrapper[4743]: I1122 10:25:23.152628 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:25:23 crc kubenswrapper[4743]: E1122 10:25:23.153053 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:25:23 crc kubenswrapper[4743]: I1122 10:25:23.950422 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-95nhv" podUID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerName="registry-server" containerID="cri-o://0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4" gracePeriod=2 Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.424559 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.543380 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-catalog-content\") pod \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.543461 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-utilities\") pod \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.543538 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlwd4\" (UniqueName: \"kubernetes.io/projected/2386d073-b0b8-47bd-a2fe-765b42e6f76a-kube-api-access-qlwd4\") pod \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\" (UID: \"2386d073-b0b8-47bd-a2fe-765b42e6f76a\") " Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.544908 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-utilities" (OuterVolumeSpecName: "utilities") pod "2386d073-b0b8-47bd-a2fe-765b42e6f76a" (UID: "2386d073-b0b8-47bd-a2fe-765b42e6f76a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.550122 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2386d073-b0b8-47bd-a2fe-765b42e6f76a-kube-api-access-qlwd4" (OuterVolumeSpecName: "kube-api-access-qlwd4") pod "2386d073-b0b8-47bd-a2fe-765b42e6f76a" (UID: "2386d073-b0b8-47bd-a2fe-765b42e6f76a"). InnerVolumeSpecName "kube-api-access-qlwd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.641133 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2386d073-b0b8-47bd-a2fe-765b42e6f76a" (UID: "2386d073-b0b8-47bd-a2fe-765b42e6f76a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.646422 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.646477 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2386d073-b0b8-47bd-a2fe-765b42e6f76a-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.646492 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlwd4\" (UniqueName: \"kubernetes.io/projected/2386d073-b0b8-47bd-a2fe-765b42e6f76a-kube-api-access-qlwd4\") on node \"crc\" DevicePath \"\"" Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.968307 4743 generic.go:334] "Generic (PLEG): container finished" podID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerID="0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4" exitCode=0 Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.968404 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95nhv" Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.968413 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95nhv" event={"ID":"2386d073-b0b8-47bd-a2fe-765b42e6f76a","Type":"ContainerDied","Data":"0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4"} Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.968496 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95nhv" event={"ID":"2386d073-b0b8-47bd-a2fe-765b42e6f76a","Type":"ContainerDied","Data":"36f093c98106ba892f0472dd5d6d7bfbe4160e3df8559899c1c89af79ea0b779"} Nov 22 10:25:24 crc kubenswrapper[4743]: I1122 10:25:24.968541 4743 scope.go:117] "RemoveContainer" containerID="0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4" Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.008279 4743 scope.go:117] "RemoveContainer" containerID="c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017" Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.030553 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95nhv"] Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.041927 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-95nhv"] Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.045042 4743 scope.go:117] "RemoveContainer" containerID="8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc" Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.117235 4743 scope.go:117] "RemoveContainer" containerID="0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4" Nov 22 10:25:25 crc kubenswrapper[4743]: E1122 10:25:25.118083 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4\": container with ID starting with 0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4 not found: ID does not exist" containerID="0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4" Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.118231 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4"} err="failed to get container status \"0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4\": rpc error: code = NotFound desc = could not find container \"0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4\": container with ID starting with 0c633cabcecdb99bdb4a51a9014cce575a5627ff687efbc4c8018ec9826a6ea4 not found: ID does not exist" Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.118354 4743 scope.go:117] "RemoveContainer" containerID="c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017" Nov 22 10:25:25 crc kubenswrapper[4743]: E1122 10:25:25.118771 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017\": container with ID starting with c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017 not found: ID does not exist" containerID="c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017" Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.118894 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017"} err="failed to get container status \"c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017\": rpc error: code = NotFound desc = could not find container \"c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017\": container with ID starting with c004c042b0bbeb51af15a06e5ab67443e71d4cab37dd1479130d1eee60bc4017 not found: ID does not exist" Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.118990 4743 scope.go:117] "RemoveContainer" containerID="8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc" Nov 22 10:25:25 crc kubenswrapper[4743]: E1122 10:25:25.119479 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc\": container with ID starting with 8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc not found: ID does not exist" containerID="8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc" Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.119610 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc"} err="failed to get container status \"8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc\": rpc error: code = NotFound desc = could not find container \"8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc\": container with ID starting with 8acfae03e970503c634b3914ab6ed2aaeb476581fd90f9227a82bd2d7bb447cc not found: ID does not exist" Nov 22 10:25:25 crc kubenswrapper[4743]: I1122 10:25:25.173206 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" path="/var/lib/kubelet/pods/2386d073-b0b8-47bd-a2fe-765b42e6f76a/volumes" Nov 22 10:25:37 crc kubenswrapper[4743]: I1122 10:25:37.161410 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:25:37 crc kubenswrapper[4743]: E1122 10:25:37.162232 4743 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:25:51 crc kubenswrapper[4743]: I1122 10:25:51.151770 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:25:51 crc kubenswrapper[4743]: E1122 10:25:51.152663 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:26:02 crc kubenswrapper[4743]: I1122 10:26:02.152241 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:26:02 crc kubenswrapper[4743]: E1122 10:26:02.153008 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:26:14 crc kubenswrapper[4743]: I1122 10:26:14.151492 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:26:14 crc kubenswrapper[4743]: E1122 10:26:14.153319 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:26:15 crc kubenswrapper[4743]: I1122 10:26:15.530339 4743 generic.go:334] "Generic (PLEG): container finished" podID="bac0d1da-40df-4390-a976-bd5e354f7e4e" containerID="ad0bf4ac445efd565ed0532b5792c2bca79af49103e62e00bca94395e6656f2a" exitCode=0 Nov 22 10:26:15 crc kubenswrapper[4743]: I1122 10:26:15.530421 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" event={"ID":"bac0d1da-40df-4390-a976-bd5e354f7e4e","Type":"ContainerDied","Data":"ad0bf4ac445efd565ed0532b5792c2bca79af49103e62e00bca94395e6656f2a"} Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.040524 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.148995 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ceph\") pod \"bac0d1da-40df-4390-a976-bd5e354f7e4e\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.149206 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ssh-key\") pod \"bac0d1da-40df-4390-a976-bd5e354f7e4e\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.149292 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfrl6\" (UniqueName: \"kubernetes.io/projected/bac0d1da-40df-4390-a976-bd5e354f7e4e-kube-api-access-xfrl6\") pod \"bac0d1da-40df-4390-a976-bd5e354f7e4e\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.149411 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-inventory\") pod \"bac0d1da-40df-4390-a976-bd5e354f7e4e\" (UID: \"bac0d1da-40df-4390-a976-bd5e354f7e4e\") " Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.160520 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bac0d1da-40df-4390-a976-bd5e354f7e4e-kube-api-access-xfrl6" (OuterVolumeSpecName: "kube-api-access-xfrl6") pod "bac0d1da-40df-4390-a976-bd5e354f7e4e" (UID: "bac0d1da-40df-4390-a976-bd5e354f7e4e"). InnerVolumeSpecName "kube-api-access-xfrl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.171261 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ceph" (OuterVolumeSpecName: "ceph") pod "bac0d1da-40df-4390-a976-bd5e354f7e4e" (UID: "bac0d1da-40df-4390-a976-bd5e354f7e4e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.178142 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-inventory" (OuterVolumeSpecName: "inventory") pod "bac0d1da-40df-4390-a976-bd5e354f7e4e" (UID: "bac0d1da-40df-4390-a976-bd5e354f7e4e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.189831 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bac0d1da-40df-4390-a976-bd5e354f7e4e" (UID: "bac0d1da-40df-4390-a976-bd5e354f7e4e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.251899 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.252172 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.252182 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bac0d1da-40df-4390-a976-bd5e354f7e4e-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.252192 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfrl6\" (UniqueName: \"kubernetes.io/projected/bac0d1da-40df-4390-a976-bd5e354f7e4e-kube-api-access-xfrl6\") on node \"crc\" DevicePath \"\"" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.556538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" event={"ID":"bac0d1da-40df-4390-a976-bd5e354f7e4e","Type":"ContainerDied","Data":"6308b616c1a82a22b01e6aba0ed5de93aa105606f8d1ffdd599153d2628ed22f"} Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.556587 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6308b616c1a82a22b01e6aba0ed5de93aa105606f8d1ffdd599153d2628ed22f" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.556603 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-vqx9f" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.661656 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-xdzxc"] Nov 22 10:26:17 crc kubenswrapper[4743]: E1122 10:26:17.662161 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerName="extract-content" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.662178 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerName="extract-content" Nov 22 10:26:17 crc kubenswrapper[4743]: E1122 10:26:17.662190 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bac0d1da-40df-4390-a976-bd5e354f7e4e" containerName="configure-network-openstack-openstack-cell1" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.662197 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="bac0d1da-40df-4390-a976-bd5e354f7e4e" containerName="configure-network-openstack-openstack-cell1" Nov 22 10:26:17 crc kubenswrapper[4743]: E1122 10:26:17.662216 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerName="extract-utilities" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.662224 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerName="extract-utilities" Nov 22 10:26:17 crc kubenswrapper[4743]: E1122 10:26:17.662237 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerName="registry-server" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.662242 4743 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerName="registry-server" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.662454 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="2386d073-b0b8-47bd-a2fe-765b42e6f76a" containerName="registry-server" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.662474 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="bac0d1da-40df-4390-a976-bd5e354f7e4e" containerName="configure-network-openstack-openstack-cell1" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.663217 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.665230 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.665253 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.665484 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.665615 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.669257 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-xdzxc"] Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.761718 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ssh-key\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.761873 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ceph\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.762164 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q58dz\" (UniqueName: \"kubernetes.io/projected/29a07311-f1b0-47cd-bf93-0f13bcd05354-kube-api-access-q58dz\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.762391 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-inventory\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.864450 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-inventory\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.864544 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ssh-key\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.864609 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ceph\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.864668 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q58dz\" (UniqueName: \"kubernetes.io/projected/29a07311-f1b0-47cd-bf93-0f13bcd05354-kube-api-access-q58dz\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.869018 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-inventory\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.869018 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ssh-key\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.869212 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ceph\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.889018 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q58dz\" (UniqueName: \"kubernetes.io/projected/29a07311-f1b0-47cd-bf93-0f13bcd05354-kube-api-access-q58dz\") pod \"validate-network-openstack-openstack-cell1-xdzxc\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:17 crc kubenswrapper[4743]: I1122 10:26:17.993315 4743 util.go:30] "No sandbox for pod can be found. 
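The "Observed pod startup duration" records in this log (for example the configure-network pod at 10:24:56 above) relate their fields by simple subtraction: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A sketch that reproduces the 10:24:56 numbers; the layout string and the pull-time exclusion are assumptions checked only against the values in that record:

```go
package main

import (
	"fmt"
	"time"
)

// Timestamps copied from the 10:24:56 "Observed pod startup duration"
// record for configure-network-openstack-openstack-cell1-vqx9f.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-11-22 10:24:54 +0000 UTC")
	firstPull := mustParse("2025-11-22 10:24:55.593946106 +0000 UTC")
	lastPull := mustParse("2025-11-22 10:24:56.094285859 +0000 UTC")
	watched := mustParse("2025-11-22 10:24:56.706531373 +0000 UTC")

	e2e := watched.Sub(created)     // podStartE2EDuration: 2.706531373s
	pull := lastPull.Sub(firstPull) // image-pull window: 500.339753ms
	slo := e2e - pull               // podStartSLOduration: 2.20619162s

	fmt.Println(e2e, pull, slo)
}
```

For this record the wall-clock and monotonic (m=+...) differences coincide; the tracker itself seems to subtract on the monotonic clock, which is why other records in this log can differ from the wall-clock arithmetic by a few nanoseconds.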
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:18 crc kubenswrapper[4743]: I1122 10:26:18.555974 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-xdzxc"] Nov 22 10:26:18 crc kubenswrapper[4743]: I1122 10:26:18.558490 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 10:26:18 crc kubenswrapper[4743]: I1122 10:26:18.569770 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" event={"ID":"29a07311-f1b0-47cd-bf93-0f13bcd05354","Type":"ContainerStarted","Data":"784ddeaf2a9493ca15e2279b69551687baf4b0181c9abb340f1ced9361340b14"} Nov 22 10:26:19 crc kubenswrapper[4743]: I1122 10:26:19.589466 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" event={"ID":"29a07311-f1b0-47cd-bf93-0f13bcd05354","Type":"ContainerStarted","Data":"b541df4c96a2b85514e49330267981aa3f6dcc07e4ebfa0e2356cebde6fd046c"} Nov 22 10:26:19 crc kubenswrapper[4743]: I1122 10:26:19.614136 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" podStartSLOduration=2.04330026 podStartE2EDuration="2.614115069s" podCreationTimestamp="2025-11-22 10:26:17 +0000 UTC" firstStartedPulling="2025-11-22 10:26:18.558208632 +0000 UTC m=+7452.264569674" lastFinishedPulling="2025-11-22 10:26:19.129023431 +0000 UTC m=+7452.835384483" observedRunningTime="2025-11-22 10:26:19.610457554 +0000 UTC m=+7453.316818606" watchObservedRunningTime="2025-11-22 10:26:19.614115069 +0000 UTC m=+7453.320476121" Nov 22 10:26:24 crc kubenswrapper[4743]: I1122 10:26:24.633070 4743 generic.go:334] "Generic (PLEG): container finished" podID="29a07311-f1b0-47cd-bf93-0f13bcd05354" containerID="b541df4c96a2b85514e49330267981aa3f6dcc07e4ebfa0e2356cebde6fd046c" exitCode=0 Nov 22 10:26:24 crc kubenswrapper[4743]: I1122 10:26:24.633150 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" event={"ID":"29a07311-f1b0-47cd-bf93-0f13bcd05354","Type":"ContainerDied","Data":"b541df4c96a2b85514e49330267981aa3f6dcc07e4ebfa0e2356cebde6fd046c"} Nov 22 10:26:25 crc kubenswrapper[4743]: I1122 10:26:25.152498 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:26:25 crc kubenswrapper[4743]: E1122 10:26:25.152872 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.117977 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.254752 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-inventory\") pod \"29a07311-f1b0-47cd-bf93-0f13bcd05354\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.255232 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ceph\") pod \"29a07311-f1b0-47cd-bf93-0f13bcd05354\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.256056 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q58dz\" (UniqueName: \"kubernetes.io/projected/29a07311-f1b0-47cd-bf93-0f13bcd05354-kube-api-access-q58dz\") pod \"29a07311-f1b0-47cd-bf93-0f13bcd05354\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.256115 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ssh-key\") pod \"29a07311-f1b0-47cd-bf93-0f13bcd05354\" (UID: \"29a07311-f1b0-47cd-bf93-0f13bcd05354\") " Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.262317 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ceph" (OuterVolumeSpecName: "ceph") pod "29a07311-f1b0-47cd-bf93-0f13bcd05354" (UID: "29a07311-f1b0-47cd-bf93-0f13bcd05354"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.267846 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a07311-f1b0-47cd-bf93-0f13bcd05354-kube-api-access-q58dz" (OuterVolumeSpecName: "kube-api-access-q58dz") pod "29a07311-f1b0-47cd-bf93-0f13bcd05354" (UID: "29a07311-f1b0-47cd-bf93-0f13bcd05354"). InnerVolumeSpecName "kube-api-access-q58dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.288123 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-inventory" (OuterVolumeSpecName: "inventory") pod "29a07311-f1b0-47cd-bf93-0f13bcd05354" (UID: "29a07311-f1b0-47cd-bf93-0f13bcd05354"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.293154 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "29a07311-f1b0-47cd-bf93-0f13bcd05354" (UID: "29a07311-f1b0-47cd-bf93-0f13bcd05354"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.358861 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.358904 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.358922 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q58dz\" (UniqueName: \"kubernetes.io/projected/29a07311-f1b0-47cd-bf93-0f13bcd05354-kube-api-access-q58dz\") on node \"crc\" DevicePath \"\"" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.358934 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/29a07311-f1b0-47cd-bf93-0f13bcd05354-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.658114 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" event={"ID":"29a07311-f1b0-47cd-bf93-0f13bcd05354","Type":"ContainerDied","Data":"784ddeaf2a9493ca15e2279b69551687baf4b0181c9abb340f1ced9361340b14"} Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.658165 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="784ddeaf2a9493ca15e2279b69551687baf4b0181c9abb340f1ced9361340b14" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.658220 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-xdzxc" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.755949 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6kt62"] Nov 22 10:26:26 crc kubenswrapper[4743]: E1122 10:26:26.756411 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a07311-f1b0-47cd-bf93-0f13bcd05354" containerName="validate-network-openstack-openstack-cell1" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.756426 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a07311-f1b0-47cd-bf93-0f13bcd05354" containerName="validate-network-openstack-openstack-cell1" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.756644 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a07311-f1b0-47cd-bf93-0f13bcd05354" containerName="validate-network-openstack-openstack-cell1" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.757354 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.760365 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.760674 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.761007 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.761942 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.780095 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6kt62"] Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.870035 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ceph\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.870121 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65wwk\" (UniqueName: \"kubernetes.io/projected/b46ca6e3-19cc-454d-85ce-c57f91d88e20-kube-api-access-65wwk\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.870186 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-inventory\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.870228 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ssh-key\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.972216 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ceph\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.972292 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65wwk\" (UniqueName: \"kubernetes.io/projected/b46ca6e3-19cc-454d-85ce-c57f91d88e20-kube-api-access-65wwk\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.972340 
4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-inventory\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.972359 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ssh-key\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.977357 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ceph\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.977929 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ssh-key\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.978537 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-inventory\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:26 crc kubenswrapper[4743]: I1122 10:26:26.990711 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65wwk\" (UniqueName: \"kubernetes.io/projected/b46ca6e3-19cc-454d-85ce-c57f91d88e20-kube-api-access-65wwk\") pod \"install-os-openstack-openstack-cell1-6kt62\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " pod="openstack/install-os-openstack-openstack-cell1-6kt62"
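The machine-config-daemon records above repeat the same pair every ten-odd seconds: a "RemoveContainer" retry followed by "Error syncing pod, skipping ... back-off 5m0s restarting failed container". That is CrashLoopBackOff at its cap: each failed restart lengthens the wait, and sync attempts inside the window are rejected until it expires, which the records that follow show happening at 10:26:39 when a new container finally starts. The 10s starting delay and per-restart doubling below are assumed kubelet defaults; the log itself only shows the 5m0s cap. A sketch of the doubling schedule:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed kubelet defaults: 10s initial delay, doubling per failed
	// restart, capped at the 5m0s quoted in the back-off errors above.
	backoff, maxDelay := 10*time.Second, 5*time.Minute
	for restart := 1; backoff < maxDelay; restart++ {
		fmt.Printf("restart %d: wait %v\n", restart, backoff)
		backoff *= 2
	}
	fmt.Println("later restarts: wait", maxDelay)
}
```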
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:26:27 crc kubenswrapper[4743]: I1122 10:26:27.630519 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-6kt62"] Nov 22 10:26:27 crc kubenswrapper[4743]: I1122 10:26:27.671493 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6kt62" event={"ID":"b46ca6e3-19cc-454d-85ce-c57f91d88e20","Type":"ContainerStarted","Data":"a901254def916ee84cc59fa4222cbef3b5c1caddf2dfe37ca3c41ce86f690206"} Nov 22 10:26:28 crc kubenswrapper[4743]: I1122 10:26:28.682482 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6kt62" event={"ID":"b46ca6e3-19cc-454d-85ce-c57f91d88e20","Type":"ContainerStarted","Data":"b6079900e3dac1090e28620e7af975ee5a1dbfc3b204062dfe68cb264f79f4a4"} Nov 22 10:26:28 crc kubenswrapper[4743]: I1122 10:26:28.706362 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-6kt62" podStartSLOduration=2.234249003 podStartE2EDuration="2.70634074s" podCreationTimestamp="2025-11-22 10:26:26 +0000 UTC" firstStartedPulling="2025-11-22 10:26:27.637933293 +0000 UTC m=+7461.344294355" lastFinishedPulling="2025-11-22 10:26:28.11002504 +0000 UTC m=+7461.816386092" observedRunningTime="2025-11-22 10:26:28.700907474 +0000 UTC m=+7462.407268546" watchObservedRunningTime="2025-11-22 10:26:28.70634074 +0000 UTC m=+7462.412701792" Nov 22 10:26:39 crc kubenswrapper[4743]: I1122 10:26:39.152205 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:26:39 crc kubenswrapper[4743]: I1122 10:26:39.855226 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"40328998db4cc7d85f34000888325dea9a42d2a672d786787ca0b5e402b918dd"} Nov 22 10:27:14 crc kubenswrapper[4743]: I1122 10:27:14.208131 4743 generic.go:334] "Generic (PLEG): container finished" podID="b46ca6e3-19cc-454d-85ce-c57f91d88e20" containerID="b6079900e3dac1090e28620e7af975ee5a1dbfc3b204062dfe68cb264f79f4a4" exitCode=0 Nov 22 10:27:14 crc kubenswrapper[4743]: I1122 10:27:14.208213 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6kt62" event={"ID":"b46ca6e3-19cc-454d-85ce-c57f91d88e20","Type":"ContainerDied","Data":"b6079900e3dac1090e28620e7af975ee5a1dbfc3b204062dfe68cb264f79f4a4"} Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.631544 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.729468 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-inventory\") pod \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.729690 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ssh-key\") pod \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.730287 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65wwk\" (UniqueName: \"kubernetes.io/projected/b46ca6e3-19cc-454d-85ce-c57f91d88e20-kube-api-access-65wwk\") pod \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.730571 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ceph\") pod \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\" (UID: \"b46ca6e3-19cc-454d-85ce-c57f91d88e20\") " Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.735516 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ceph" (OuterVolumeSpecName: "ceph") pod "b46ca6e3-19cc-454d-85ce-c57f91d88e20" (UID: "b46ca6e3-19cc-454d-85ce-c57f91d88e20"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.735782 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b46ca6e3-19cc-454d-85ce-c57f91d88e20-kube-api-access-65wwk" (OuterVolumeSpecName: "kube-api-access-65wwk") pod "b46ca6e3-19cc-454d-85ce-c57f91d88e20" (UID: "b46ca6e3-19cc-454d-85ce-c57f91d88e20"). InnerVolumeSpecName "kube-api-access-65wwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.758623 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-inventory" (OuterVolumeSpecName: "inventory") pod "b46ca6e3-19cc-454d-85ce-c57f91d88e20" (UID: "b46ca6e3-19cc-454d-85ce-c57f91d88e20"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.764237 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b46ca6e3-19cc-454d-85ce-c57f91d88e20" (UID: "b46ca6e3-19cc-454d-85ce-c57f91d88e20"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.833322 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.833429 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65wwk\" (UniqueName: \"kubernetes.io/projected/b46ca6e3-19cc-454d-85ce-c57f91d88e20-kube-api-access-65wwk\") on node \"crc\" DevicePath \"\"" Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.833487 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:27:15 crc kubenswrapper[4743]: I1122 10:27:15.833556 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b46ca6e3-19cc-454d-85ce-c57f91d88e20-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.228602 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-6kt62" event={"ID":"b46ca6e3-19cc-454d-85ce-c57f91d88e20","Type":"ContainerDied","Data":"a901254def916ee84cc59fa4222cbef3b5c1caddf2dfe37ca3c41ce86f690206"} Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.228647 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a901254def916ee84cc59fa4222cbef3b5c1caddf2dfe37ca3c41ce86f690206" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.228653 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-6kt62" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.327525 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-dtcqr"] Nov 22 10:27:16 crc kubenswrapper[4743]: E1122 10:27:16.328306 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b46ca6e3-19cc-454d-85ce-c57f91d88e20" containerName="install-os-openstack-openstack-cell1" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.328323 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b46ca6e3-19cc-454d-85ce-c57f91d88e20" containerName="install-os-openstack-openstack-cell1" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.328525 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b46ca6e3-19cc-454d-85ce-c57f91d88e20" containerName="install-os-openstack-openstack-cell1" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.329328 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.332103 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.332707 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.332968 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.333549 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.341160 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-dtcqr"] Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.446021 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ceph\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.446206 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ssh-key\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.446411 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r26fr\" (UniqueName: \"kubernetes.io/projected/94c102fa-b4f9-413a-92fb-533fccbe12c7-kube-api-access-r26fr\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.446462 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-inventory\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.550217 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ceph\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.550631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ssh-key\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.550836 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r26fr\" (UniqueName: \"kubernetes.io/projected/94c102fa-b4f9-413a-92fb-533fccbe12c7-kube-api-access-r26fr\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.550956 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-inventory\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.555374 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-inventory\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.557822 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ceph\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.562411 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ssh-key\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.566521 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r26fr\" (UniqueName: \"kubernetes.io/projected/94c102fa-b4f9-413a-92fb-533fccbe12c7-kube-api-access-r26fr\") pod \"configure-os-openstack-openstack-cell1-dtcqr\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:16 crc kubenswrapper[4743]: I1122 10:27:16.650788 4743 util.go:30] "No sandbox for pod can be found. 
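
Each of the four volumes (ceph, ssh-key, inventory, kube-api-access-*) is logged three times on the way up: VerifyControllerAttachedVolume, MountVolume started, and MountVolume.SetUp succeeded. A small checker that flags any mount which starts but is never confirmed, keyed on (pod, volume) because names like "ceph" recur across jobs; the patterns match the backslash-escaped quotes exactly as they appear in the journal, and "kubelet.log" is again a stand-in path:

```python
import re

# The volume name sits between escaped quotes (\"...\"); the owning pod
# is the plain-quoted pod="..." key at the end of the same entry.
START = re.compile(r'MountVolume started for volume \\"([^\\"]+)\\".*?pod="([^"]+)"')
DONE = re.compile(r'MountVolume\.SetUp succeeded for volume \\"([^\\"]+)\\".*?pod="([^"]+)"')

started, done = set(), set()
for line in open("kubelet.log"):                     # hypothetical path
    for pat, acc in ((START, started), (DONE, done)):
        for vol, pod in pat.findall(line):
            acc.add((pod, vol))

for pod, vol in sorted(started - done):
    print(f"SetUp never confirmed: {pod} / {vol}")
print(f"{len(done)} mounts confirmed, {len(started - done)} outstanding")
```
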
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:27:17 crc kubenswrapper[4743]: I1122 10:27:17.228337 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-dtcqr"] Nov 22 10:27:17 crc kubenswrapper[4743]: I1122 10:27:17.244128 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" event={"ID":"94c102fa-b4f9-413a-92fb-533fccbe12c7","Type":"ContainerStarted","Data":"ff5a98c6136d2c30d9c1774eb25feb72c6528e241d4c8de7eb557542595342e9"} Nov 22 10:27:18 crc kubenswrapper[4743]: I1122 10:27:18.256568 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" event={"ID":"94c102fa-b4f9-413a-92fb-533fccbe12c7","Type":"ContainerStarted","Data":"1582ef3314f22746637aaa6eaf290bb507122c0b2634360ef88826aa52c8be3f"} Nov 22 10:27:18 crc kubenswrapper[4743]: I1122 10:27:18.287366 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" podStartSLOduration=1.9138874989999999 podStartE2EDuration="2.287348095s" podCreationTimestamp="2025-11-22 10:27:16 +0000 UTC" firstStartedPulling="2025-11-22 10:27:17.221678597 +0000 UTC m=+7510.928039649" lastFinishedPulling="2025-11-22 10:27:17.595139183 +0000 UTC m=+7511.301500245" observedRunningTime="2025-11-22 10:27:18.274788655 +0000 UTC m=+7511.981149717" watchObservedRunningTime="2025-11-22 10:27:18.287348095 +0000 UTC m=+7511.993709157" Nov 22 10:28:01 crc kubenswrapper[4743]: I1122 10:28:01.704378 4743 generic.go:334] "Generic (PLEG): container finished" podID="94c102fa-b4f9-413a-92fb-533fccbe12c7" containerID="1582ef3314f22746637aaa6eaf290bb507122c0b2634360ef88826aa52c8be3f" exitCode=0 Nov 22 10:28:01 crc kubenswrapper[4743]: I1122 10:28:01.704479 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" event={"ID":"94c102fa-b4f9-413a-92fb-533fccbe12c7","Type":"ContainerDied","Data":"1582ef3314f22746637aaa6eaf290bb507122c0b2634360ef88826aa52c8be3f"} Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.193551 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.248701 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-inventory\") pod \"94c102fa-b4f9-413a-92fb-533fccbe12c7\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.248749 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ssh-key\") pod \"94c102fa-b4f9-413a-92fb-533fccbe12c7\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.248776 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ceph\") pod \"94c102fa-b4f9-413a-92fb-533fccbe12c7\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.248942 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r26fr\" (UniqueName: \"kubernetes.io/projected/94c102fa-b4f9-413a-92fb-533fccbe12c7-kube-api-access-r26fr\") pod \"94c102fa-b4f9-413a-92fb-533fccbe12c7\" (UID: \"94c102fa-b4f9-413a-92fb-533fccbe12c7\") " Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.258779 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c102fa-b4f9-413a-92fb-533fccbe12c7-kube-api-access-r26fr" (OuterVolumeSpecName: "kube-api-access-r26fr") pod "94c102fa-b4f9-413a-92fb-533fccbe12c7" (UID: "94c102fa-b4f9-413a-92fb-533fccbe12c7"). InnerVolumeSpecName "kube-api-access-r26fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.259749 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ceph" (OuterVolumeSpecName: "ceph") pod "94c102fa-b4f9-413a-92fb-533fccbe12c7" (UID: "94c102fa-b4f9-413a-92fb-533fccbe12c7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.279754 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "94c102fa-b4f9-413a-92fb-533fccbe12c7" (UID: "94c102fa-b4f9-413a-92fb-533fccbe12c7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.286002 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-inventory" (OuterVolumeSpecName: "inventory") pod "94c102fa-b4f9-413a-92fb-533fccbe12c7" (UID: "94c102fa-b4f9-413a-92fb-533fccbe12c7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.351964 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.352329 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.352344 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/94c102fa-b4f9-413a-92fb-533fccbe12c7-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.352361 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r26fr\" (UniqueName: \"kubernetes.io/projected/94c102fa-b4f9-413a-92fb-533fccbe12c7-kube-api-access-r26fr\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.723888 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" event={"ID":"94c102fa-b4f9-413a-92fb-533fccbe12c7","Type":"ContainerDied","Data":"ff5a98c6136d2c30d9c1774eb25feb72c6528e241d4c8de7eb557542595342e9"} Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.723927 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff5a98c6136d2c30d9c1774eb25feb72c6528e241d4c8de7eb557542595342e9" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.724392 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-dtcqr" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.822358 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-2wp9n"] Nov 22 10:28:03 crc kubenswrapper[4743]: E1122 10:28:03.823131 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c102fa-b4f9-413a-92fb-533fccbe12c7" containerName="configure-os-openstack-openstack-cell1" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.823151 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c102fa-b4f9-413a-92fb-533fccbe12c7" containerName="configure-os-openstack-openstack-cell1" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.823460 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c102fa-b4f9-413a-92fb-533fccbe12c7" containerName="configure-os-openstack-openstack-cell1" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.824555 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.828890 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.828938 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.829030 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.829262 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.853697 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-2wp9n"] Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.965340 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ceph\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.965974 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-inventory-0\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.966070 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:03 crc kubenswrapper[4743]: I1122 10:28:03.966279 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvm6h\" (UniqueName: \"kubernetes.io/projected/4aa15537-2948-44af-b30f-ff55c4d4b86d-kube-api-access-bvm6h\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:04 crc kubenswrapper[4743]: I1122 10:28:04.068152 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ceph\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:04 crc kubenswrapper[4743]: I1122 10:28:04.068188 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-inventory-0\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:04 crc kubenswrapper[4743]: I1122 10:28:04.068256 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:04 crc kubenswrapper[4743]: I1122 10:28:04.068303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvm6h\" (UniqueName: \"kubernetes.io/projected/4aa15537-2948-44af-b30f-ff55c4d4b86d-kube-api-access-bvm6h\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:04 crc kubenswrapper[4743]: I1122 10:28:04.073700 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-inventory-0\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:04 crc kubenswrapper[4743]: I1122 10:28:04.077397 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ceph\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:04 crc kubenswrapper[4743]: I1122 10:28:04.078083 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:04 crc kubenswrapper[4743]: I1122 10:28:04.090502 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvm6h\" (UniqueName: \"kubernetes.io/projected/4aa15537-2948-44af-b30f-ff55c4d4b86d-kube-api-access-bvm6h\") pod \"ssh-known-hosts-openstack-2wp9n\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:04 crc kubenswrapper[4743]: I1122 10:28:04.177448 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:04 crc kubenswrapper[4743]: I1122 10:28:04.721552 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-2wp9n"] Nov 22 10:28:05 crc kubenswrapper[4743]: I1122 10:28:05.747013 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-2wp9n" event={"ID":"4aa15537-2948-44af-b30f-ff55c4d4b86d","Type":"ContainerStarted","Data":"e5688d4fbdcac525c5740cac541b76c42b83c70559ba5255f547fc73c19d76f0"} Nov 22 10:28:05 crc kubenswrapper[4743]: I1122 10:28:05.748342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-2wp9n" event={"ID":"4aa15537-2948-44af-b30f-ff55c4d4b86d","Type":"ContainerStarted","Data":"90c3c8514e39fff90c177a001d9a8a4663b5b8479cce0398ca8f6b23ee6fbf35"} Nov 22 10:28:05 crc kubenswrapper[4743]: I1122 10:28:05.769765 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-2wp9n" podStartSLOduration=2.207341466 podStartE2EDuration="2.769739753s" podCreationTimestamp="2025-11-22 10:28:03 +0000 UTC" firstStartedPulling="2025-11-22 10:28:04.729842055 +0000 UTC m=+7558.436203107" lastFinishedPulling="2025-11-22 10:28:05.292240352 +0000 UTC m=+7558.998601394" observedRunningTime="2025-11-22 10:28:05.762773524 +0000 UTC m=+7559.469134576" watchObservedRunningTime="2025-11-22 10:28:05.769739753 +0000 UTC m=+7559.476100805" Nov 22 10:28:14 crc kubenswrapper[4743]: I1122 10:28:14.827980 4743 generic.go:334] "Generic (PLEG): container finished" podID="4aa15537-2948-44af-b30f-ff55c4d4b86d" containerID="e5688d4fbdcac525c5740cac541b76c42b83c70559ba5255f547fc73c19d76f0" exitCode=0 Nov 22 10:28:14 crc kubenswrapper[4743]: I1122 10:28:14.828057 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-2wp9n" event={"ID":"4aa15537-2948-44af-b30f-ff55c4d4b86d","Type":"ContainerDied","Data":"e5688d4fbdcac525c5740cac541b76c42b83c70559ba5255f547fc73c19d76f0"} Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.320450 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.440472 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ceph\") pod \"4aa15537-2948-44af-b30f-ff55c4d4b86d\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.440564 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ssh-key-openstack-cell1\") pod \"4aa15537-2948-44af-b30f-ff55c4d4b86d\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.440619 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-inventory-0\") pod \"4aa15537-2948-44af-b30f-ff55c4d4b86d\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.440706 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvm6h\" (UniqueName: \"kubernetes.io/projected/4aa15537-2948-44af-b30f-ff55c4d4b86d-kube-api-access-bvm6h\") pod \"4aa15537-2948-44af-b30f-ff55c4d4b86d\" (UID: \"4aa15537-2948-44af-b30f-ff55c4d4b86d\") " Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.448858 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ceph" (OuterVolumeSpecName: "ceph") pod "4aa15537-2948-44af-b30f-ff55c4d4b86d" (UID: "4aa15537-2948-44af-b30f-ff55c4d4b86d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.448933 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa15537-2948-44af-b30f-ff55c4d4b86d-kube-api-access-bvm6h" (OuterVolumeSpecName: "kube-api-access-bvm6h") pod "4aa15537-2948-44af-b30f-ff55c4d4b86d" (UID: "4aa15537-2948-44af-b30f-ff55c4d4b86d"). InnerVolumeSpecName "kube-api-access-bvm6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.475368 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "4aa15537-2948-44af-b30f-ff55c4d4b86d" (UID: "4aa15537-2948-44af-b30f-ff55c4d4b86d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.477395 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4aa15537-2948-44af-b30f-ff55c4d4b86d" (UID: "4aa15537-2948-44af-b30f-ff55c4d4b86d"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.545036 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.545081 4743 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-inventory-0\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.545093 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvm6h\" (UniqueName: \"kubernetes.io/projected/4aa15537-2948-44af-b30f-ff55c4d4b86d-kube-api-access-bvm6h\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.545106 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4aa15537-2948-44af-b30f-ff55c4d4b86d-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.849538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-2wp9n" event={"ID":"4aa15537-2948-44af-b30f-ff55c4d4b86d","Type":"ContainerDied","Data":"90c3c8514e39fff90c177a001d9a8a4663b5b8479cce0398ca8f6b23ee6fbf35"} Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.849829 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c3c8514e39fff90c177a001d9a8a4663b5b8479cce0398ca8f6b23ee6fbf35" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.849903 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-2wp9n" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.923074 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-94f4d"] Nov 22 10:28:16 crc kubenswrapper[4743]: E1122 10:28:16.923791 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa15537-2948-44af-b30f-ff55c4d4b86d" containerName="ssh-known-hosts-openstack" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.923823 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa15537-2948-44af-b30f-ff55c4d4b86d" containerName="ssh-known-hosts-openstack" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.924523 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa15537-2948-44af-b30f-ff55c4d4b86d" containerName="ssh-known-hosts-openstack" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.926186 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.928379 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.928500 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.928520 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.928785 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:28:16 crc kubenswrapper[4743]: I1122 10:28:16.937256 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-94f4d"] Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.054783 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gg2w\" (UniqueName: \"kubernetes.io/projected/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-kube-api-access-4gg2w\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.055173 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-inventory\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.055225 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ceph\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.055274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ssh-key\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.156990 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-inventory\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.157035 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ceph\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.157064 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ssh-key\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.157122 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gg2w\" (UniqueName: \"kubernetes.io/projected/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-kube-api-access-4gg2w\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.162842 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ssh-key\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.165504 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ceph\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.168194 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-inventory\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.174546 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gg2w\" (UniqueName: \"kubernetes.io/projected/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-kube-api-access-4gg2w\") pod \"run-os-openstack-openstack-cell1-94f4d\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.252398 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.819160 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-94f4d"] Nov 22 10:28:17 crc kubenswrapper[4743]: W1122 10:28:17.819452 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2ad6d66_1cc7_4e28_aae8_14d855606aeb.slice/crio-2f60e392d4237e29e1742b76a3ede0c5b0112a9061eaee8ea1044c56a214d63f WatchSource:0}: Error finding container 2f60e392d4237e29e1742b76a3ede0c5b0112a9061eaee8ea1044c56a214d63f: Status 404 returned error can't find the container with id 2f60e392d4237e29e1742b76a3ede0c5b0112a9061eaee8ea1044c56a214d63f Nov 22 10:28:17 crc kubenswrapper[4743]: I1122 10:28:17.862380 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-94f4d" event={"ID":"d2ad6d66-1cc7-4e28-aae8-14d855606aeb","Type":"ContainerStarted","Data":"2f60e392d4237e29e1742b76a3ede0c5b0112a9061eaee8ea1044c56a214d63f"} Nov 22 10:28:18 crc kubenswrapper[4743]: I1122 10:28:18.871043 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-94f4d" event={"ID":"d2ad6d66-1cc7-4e28-aae8-14d855606aeb","Type":"ContainerStarted","Data":"25e5eeeba55aaa426198bd505a733d04cc03928e71bb854030ed0fd98a1c164e"} Nov 22 10:28:18 crc kubenswrapper[4743]: I1122 10:28:18.892087 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-94f4d" podStartSLOduration=2.477997839 podStartE2EDuration="2.892072201s" podCreationTimestamp="2025-11-22 10:28:16 +0000 UTC" firstStartedPulling="2025-11-22 10:28:17.822745837 +0000 UTC m=+7571.529106899" lastFinishedPulling="2025-11-22 10:28:18.236820209 +0000 UTC m=+7571.943181261" observedRunningTime="2025-11-22 10:28:18.891473784 +0000 UTC m=+7572.597834846" watchObservedRunningTime="2025-11-22 10:28:18.892072201 +0000 UTC m=+7572.598433263" Nov 22 10:28:25 crc kubenswrapper[4743]: I1122 10:28:25.932841 4743 generic.go:334] "Generic (PLEG): container finished" podID="d2ad6d66-1cc7-4e28-aae8-14d855606aeb" containerID="25e5eeeba55aaa426198bd505a733d04cc03928e71bb854030ed0fd98a1c164e" exitCode=0 Nov 22 10:28:25 crc kubenswrapper[4743]: I1122 10:28:25.932934 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-94f4d" event={"ID":"d2ad6d66-1cc7-4e28-aae8-14d855606aeb","Type":"ContainerDied","Data":"25e5eeeba55aaa426198bd505a733d04cc03928e71bb854030ed0fd98a1c164e"} Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.511579 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.620635 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ssh-key\") pod \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.620844 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ceph\") pod \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.620916 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-inventory\") pod \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.621047 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gg2w\" (UniqueName: \"kubernetes.io/projected/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-kube-api-access-4gg2w\") pod \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\" (UID: \"d2ad6d66-1cc7-4e28-aae8-14d855606aeb\") " Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.626049 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-kube-api-access-4gg2w" (OuterVolumeSpecName: "kube-api-access-4gg2w") pod "d2ad6d66-1cc7-4e28-aae8-14d855606aeb" (UID: "d2ad6d66-1cc7-4e28-aae8-14d855606aeb"). InnerVolumeSpecName "kube-api-access-4gg2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.636074 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ceph" (OuterVolumeSpecName: "ceph") pod "d2ad6d66-1cc7-4e28-aae8-14d855606aeb" (UID: "d2ad6d66-1cc7-4e28-aae8-14d855606aeb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.654955 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-inventory" (OuterVolumeSpecName: "inventory") pod "d2ad6d66-1cc7-4e28-aae8-14d855606aeb" (UID: "d2ad6d66-1cc7-4e28-aae8-14d855606aeb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.660453 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d2ad6d66-1cc7-4e28-aae8-14d855606aeb" (UID: "d2ad6d66-1cc7-4e28-aae8-14d855606aeb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.722982 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gg2w\" (UniqueName: \"kubernetes.io/projected/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-kube-api-access-4gg2w\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.723017 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.723028 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.723037 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2ad6d66-1cc7-4e28-aae8-14d855606aeb-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.955463 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-94f4d" event={"ID":"d2ad6d66-1cc7-4e28-aae8-14d855606aeb","Type":"ContainerDied","Data":"2f60e392d4237e29e1742b76a3ede0c5b0112a9061eaee8ea1044c56a214d63f"} Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.955510 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f60e392d4237e29e1742b76a3ede0c5b0112a9061eaee8ea1044c56a214d63f" Nov 22 10:28:27 crc kubenswrapper[4743]: I1122 10:28:27.955566 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-94f4d" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.082828 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-w4vp7"] Nov 22 10:28:28 crc kubenswrapper[4743]: E1122 10:28:28.083283 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ad6d66-1cc7-4e28-aae8-14d855606aeb" containerName="run-os-openstack-openstack-cell1" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.083301 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ad6d66-1cc7-4e28-aae8-14d855606aeb" containerName="run-os-openstack-openstack-cell1" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.083509 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ad6d66-1cc7-4e28-aae8-14d855606aeb" containerName="run-os-openstack-openstack-cell1" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.084316 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.086837 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.089865 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.090125 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.090374 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.111020 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-w4vp7"] Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.272007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ceph\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.272095 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.272204 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppprv\" (UniqueName: \"kubernetes.io/projected/d6efb775-64a4-414b-a8e3-8169706ba3de-kube-api-access-ppprv\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.272242 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-inventory\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.373624 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppprv\" (UniqueName: \"kubernetes.io/projected/d6efb775-64a4-414b-a8e3-8169706ba3de-kube-api-access-ppprv\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.373693 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-inventory\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.373779 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ceph\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.373887 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.380310 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-inventory\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.380450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.381137 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ceph\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.391347 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppprv\" (UniqueName: \"kubernetes.io/projected/d6efb775-64a4-414b-a8e3-8169706ba3de-kube-api-access-ppprv\") pod \"reboot-os-openstack-openstack-cell1-w4vp7\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.408138 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.940527 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-w4vp7"] Nov 22 10:28:28 crc kubenswrapper[4743]: I1122 10:28:28.965237 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" event={"ID":"d6efb775-64a4-414b-a8e3-8169706ba3de","Type":"ContainerStarted","Data":"2353b721dd523210c97e33c511b0d2a5a315edb947c0f723e3d8f1eafb02b284"} Nov 22 10:28:29 crc kubenswrapper[4743]: I1122 10:28:29.985501 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" event={"ID":"d6efb775-64a4-414b-a8e3-8169706ba3de","Type":"ContainerStarted","Data":"e0fd2ff6bc44d8ca34631e3360345af833d76791b69239c0736093c79ed4f5b0"} Nov 22 10:28:45 crc kubenswrapper[4743]: I1122 10:28:45.139524 4743 generic.go:334] "Generic (PLEG): container finished" podID="d6efb775-64a4-414b-a8e3-8169706ba3de" containerID="e0fd2ff6bc44d8ca34631e3360345af833d76791b69239c0736093c79ed4f5b0" exitCode=0 Nov 22 10:28:45 crc kubenswrapper[4743]: I1122 10:28:45.139956 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" event={"ID":"d6efb775-64a4-414b-a8e3-8169706ba3de","Type":"ContainerDied","Data":"e0fd2ff6bc44d8ca34631e3360345af833d76791b69239c0736093c79ed4f5b0"} Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.597538 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.687529 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ssh-key\") pod \"d6efb775-64a4-414b-a8e3-8169706ba3de\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.687846 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ceph\") pod \"d6efb775-64a4-414b-a8e3-8169706ba3de\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.687974 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppprv\" (UniqueName: \"kubernetes.io/projected/d6efb775-64a4-414b-a8e3-8169706ba3de-kube-api-access-ppprv\") pod \"d6efb775-64a4-414b-a8e3-8169706ba3de\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.688139 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-inventory\") pod \"d6efb775-64a4-414b-a8e3-8169706ba3de\" (UID: \"d6efb775-64a4-414b-a8e3-8169706ba3de\") " Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.692768 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6efb775-64a4-414b-a8e3-8169706ba3de-kube-api-access-ppprv" (OuterVolumeSpecName: "kube-api-access-ppprv") pod "d6efb775-64a4-414b-a8e3-8169706ba3de" (UID: "d6efb775-64a4-414b-a8e3-8169706ba3de"). InnerVolumeSpecName "kube-api-access-ppprv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.697016 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ceph" (OuterVolumeSpecName: "ceph") pod "d6efb775-64a4-414b-a8e3-8169706ba3de" (UID: "d6efb775-64a4-414b-a8e3-8169706ba3de"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.715147 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d6efb775-64a4-414b-a8e3-8169706ba3de" (UID: "d6efb775-64a4-414b-a8e3-8169706ba3de"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.715538 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-inventory" (OuterVolumeSpecName: "inventory") pod "d6efb775-64a4-414b-a8e3-8169706ba3de" (UID: "d6efb775-64a4-414b-a8e3-8169706ba3de"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.790008 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.790049 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.790067 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppprv\" (UniqueName: \"kubernetes.io/projected/d6efb775-64a4-414b-a8e3-8169706ba3de-kube-api-access-ppprv\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:46 crc kubenswrapper[4743]: I1122 10:28:46.790083 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d6efb775-64a4-414b-a8e3-8169706ba3de-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.166048 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" event={"ID":"d6efb775-64a4-414b-a8e3-8169706ba3de","Type":"ContainerDied","Data":"2353b721dd523210c97e33c511b0d2a5a315edb947c0f723e3d8f1eafb02b284"} Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.166309 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2353b721dd523210c97e33c511b0d2a5a315edb947c0f723e3d8f1eafb02b284" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.166084 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-w4vp7" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.234705 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-77xtc"] Nov 22 10:28:47 crc kubenswrapper[4743]: E1122 10:28:47.235138 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6efb775-64a4-414b-a8e3-8169706ba3de" containerName="reboot-os-openstack-openstack-cell1" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.235155 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6efb775-64a4-414b-a8e3-8169706ba3de" containerName="reboot-os-openstack-openstack-cell1" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.235388 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6efb775-64a4-414b-a8e3-8169706ba3de" containerName="reboot-os-openstack-openstack-cell1" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.236121 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.240416 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.240690 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.241549 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.247059 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.250050 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-77xtc"] Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.299569 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ceph\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.299636 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.299845 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.300059 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l9vd\" 
(UniqueName: \"kubernetes.io/projected/381c96a8-6592-46e2-b5b7-2000e2577d5c-kube-api-access-2l9vd\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.300111 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.300132 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.300252 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ssh-key\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.300338 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.300399 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.300497 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-inventory\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.300530 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.300566 4743 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402122 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402166 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402201 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ceph\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402246 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402303 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402361 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l9vd\" (UniqueName: \"kubernetes.io/projected/381c96a8-6592-46e2-b5b7-2000e2577d5c-kube-api-access-2l9vd\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402385 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402405 4743 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ssh-key\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402471 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402499 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.402528 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-inventory\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.406658 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-inventory\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.406878 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.407331 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.407413 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.407537 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.407831 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ceph\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.408931 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.412269 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ssh-key\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.412548 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.413013 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.413126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.420261 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l9vd\" (UniqueName: \"kubernetes.io/projected/381c96a8-6592-46e2-b5b7-2000e2577d5c-kube-api-access-2l9vd\") pod 
\"install-certs-openstack-openstack-cell1-77xtc\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:47 crc kubenswrapper[4743]: I1122 10:28:47.558046 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:28:48 crc kubenswrapper[4743]: W1122 10:28:48.081047 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod381c96a8_6592_46e2_b5b7_2000e2577d5c.slice/crio-1365b092e34a50052a92e7f09b97f9851120dc0abf8047c56ede348809f16982 WatchSource:0}: Error finding container 1365b092e34a50052a92e7f09b97f9851120dc0abf8047c56ede348809f16982: Status 404 returned error can't find the container with id 1365b092e34a50052a92e7f09b97f9851120dc0abf8047c56ede348809f16982 Nov 22 10:28:48 crc kubenswrapper[4743]: I1122 10:28:48.082757 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-77xtc"] Nov 22 10:28:48 crc kubenswrapper[4743]: I1122 10:28:48.179342 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-77xtc" event={"ID":"381c96a8-6592-46e2-b5b7-2000e2577d5c","Type":"ContainerStarted","Data":"1365b092e34a50052a92e7f09b97f9851120dc0abf8047c56ede348809f16982"} Nov 22 10:28:49 crc kubenswrapper[4743]: I1122 10:28:49.189945 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-77xtc" event={"ID":"381c96a8-6592-46e2-b5b7-2000e2577d5c","Type":"ContainerStarted","Data":"850ee20a929398681beeff806dd6e3e0cd4a18bc1944bc38510b3391ba01117b"} Nov 22 10:28:49 crc kubenswrapper[4743]: I1122 10:28:49.213643 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-77xtc" podStartSLOduration=1.783047475 podStartE2EDuration="2.21362177s" podCreationTimestamp="2025-11-22 10:28:47 +0000 UTC" firstStartedPulling="2025-11-22 10:28:48.084668647 +0000 UTC m=+7601.791029699" lastFinishedPulling="2025-11-22 10:28:48.515242942 +0000 UTC m=+7602.221603994" observedRunningTime="2025-11-22 10:28:49.204932311 +0000 UTC m=+7602.911293383" watchObservedRunningTime="2025-11-22 10:28:49.21362177 +0000 UTC m=+7602.919982822" Nov 22 10:29:01 crc kubenswrapper[4743]: I1122 10:29:01.241163 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:29:01 crc kubenswrapper[4743]: I1122 10:29:01.241765 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:29:07 crc kubenswrapper[4743]: I1122 10:29:07.378024 4743 generic.go:334] "Generic (PLEG): container finished" podID="381c96a8-6592-46e2-b5b7-2000e2577d5c" containerID="850ee20a929398681beeff806dd6e3e0cd4a18bc1944bc38510b3391ba01117b" exitCode=0 Nov 22 10:29:07 crc kubenswrapper[4743]: I1122 10:29:07.378122 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-77xtc" event={"ID":"381c96a8-6592-46e2-b5b7-2000e2577d5c","Type":"ContainerDied","Data":"850ee20a929398681beeff806dd6e3e0cd4a18bc1944bc38510b3391ba01117b"} Nov 22 10:29:08 crc kubenswrapper[4743]: I1122 10:29:08.878933 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.069786 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-inventory\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.069838 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-telemetry-combined-ca-bundle\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.069919 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-sriov-combined-ca-bundle\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.069962 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-dhcp-combined-ca-bundle\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.070027 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ovn-combined-ca-bundle\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.071342 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l9vd\" (UniqueName: \"kubernetes.io/projected/381c96a8-6592-46e2-b5b7-2000e2577d5c-kube-api-access-2l9vd\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.071371 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ceph\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.071401 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-nova-combined-ca-bundle\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.071470 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-bootstrap-combined-ca-bundle\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.071498 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-libvirt-combined-ca-bundle\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.071535 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ssh-key\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.071554 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-metadata-combined-ca-bundle\") pod \"381c96a8-6592-46e2-b5b7-2000e2577d5c\" (UID: \"381c96a8-6592-46e2-b5b7-2000e2577d5c\") " Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.076923 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.077679 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.077952 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.078043 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.078125 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.078171 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.078362 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.089710 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/381c96a8-6592-46e2-b5b7-2000e2577d5c-kube-api-access-2l9vd" (OuterVolumeSpecName: "kube-api-access-2l9vd") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "kube-api-access-2l9vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.089734 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ceph" (OuterVolumeSpecName: "ceph") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.091655 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.113101 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-inventory" (OuterVolumeSpecName: "inventory") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.116951 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "381c96a8-6592-46e2-b5b7-2000e2577d5c" (UID: "381c96a8-6592-46e2-b5b7-2000e2577d5c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174333 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174378 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l9vd\" (UniqueName: \"kubernetes.io/projected/381c96a8-6592-46e2-b5b7-2000e2577d5c-kube-api-access-2l9vd\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174396 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174413 4743 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174432 4743 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174448 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174465 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174482 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174499 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174519 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174538 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 
10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.174554 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381c96a8-6592-46e2-b5b7-2000e2577d5c-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.404459 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-77xtc" event={"ID":"381c96a8-6592-46e2-b5b7-2000e2577d5c","Type":"ContainerDied","Data":"1365b092e34a50052a92e7f09b97f9851120dc0abf8047c56ede348809f16982"} Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.404505 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1365b092e34a50052a92e7f09b97f9851120dc0abf8047c56ede348809f16982" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.404605 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-77xtc" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.513245 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-vz8zn"] Nov 22 10:29:09 crc kubenswrapper[4743]: E1122 10:29:09.514094 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="381c96a8-6592-46e2-b5b7-2000e2577d5c" containerName="install-certs-openstack-openstack-cell1" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.514195 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="381c96a8-6592-46e2-b5b7-2000e2577d5c" containerName="install-certs-openstack-openstack-cell1" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.514532 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="381c96a8-6592-46e2-b5b7-2000e2577d5c" containerName="install-certs-openstack-openstack-cell1" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.515424 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.519987 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.520277 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.520674 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.520800 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.521759 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-vz8zn"] Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.682813 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.682872 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-inventory\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.683042 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc5zp\" (UniqueName: \"kubernetes.io/projected/f07941f2-7f6c-497e-ad3d-6719b60f0111-kube-api-access-kc5zp\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.683111 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ceph\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.784612 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.784973 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-inventory\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.785059 4743 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc5zp\" (UniqueName: \"kubernetes.io/projected/f07941f2-7f6c-497e-ad3d-6719b60f0111-kube-api-access-kc5zp\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.785106 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ceph\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.789605 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ceph\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.789971 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.793318 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-inventory\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.802329 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc5zp\" (UniqueName: \"kubernetes.io/projected/f07941f2-7f6c-497e-ad3d-6719b60f0111-kube-api-access-kc5zp\") pod \"ceph-client-openstack-openstack-cell1-vz8zn\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:09 crc kubenswrapper[4743]: I1122 10:29:09.848854 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:10 crc kubenswrapper[4743]: I1122 10:29:10.385413 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-vz8zn"] Nov 22 10:29:10 crc kubenswrapper[4743]: W1122 10:29:10.394408 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf07941f2_7f6c_497e_ad3d_6719b60f0111.slice/crio-25e9c522b12c68bedfcfe026ac08284c4507e80dd264a2273edcf7d3a6e47ca7 WatchSource:0}: Error finding container 25e9c522b12c68bedfcfe026ac08284c4507e80dd264a2273edcf7d3a6e47ca7: Status 404 returned error can't find the container with id 25e9c522b12c68bedfcfe026ac08284c4507e80dd264a2273edcf7d3a6e47ca7 Nov 22 10:29:10 crc kubenswrapper[4743]: I1122 10:29:10.415064 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" event={"ID":"f07941f2-7f6c-497e-ad3d-6719b60f0111","Type":"ContainerStarted","Data":"25e9c522b12c68bedfcfe026ac08284c4507e80dd264a2273edcf7d3a6e47ca7"} Nov 22 10:29:11 crc kubenswrapper[4743]: I1122 10:29:11.428171 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" event={"ID":"f07941f2-7f6c-497e-ad3d-6719b60f0111","Type":"ContainerStarted","Data":"5e6f9e3577b23333ac9599a21422786d7d85de81954b5eee35b2e32e971c4e2e"} Nov 22 10:29:11 crc kubenswrapper[4743]: I1122 10:29:11.457439 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" podStartSLOduration=1.696478886 podStartE2EDuration="2.457419099s" podCreationTimestamp="2025-11-22 10:29:09 +0000 UTC" firstStartedPulling="2025-11-22 10:29:10.398056003 +0000 UTC m=+7624.104417055" lastFinishedPulling="2025-11-22 10:29:11.158996216 +0000 UTC m=+7624.865357268" observedRunningTime="2025-11-22 10:29:11.456426961 +0000 UTC m=+7625.162788013" watchObservedRunningTime="2025-11-22 10:29:11.457419099 +0000 UTC m=+7625.163780151" Nov 22 10:29:16 crc kubenswrapper[4743]: I1122 10:29:16.490926 4743 generic.go:334] "Generic (PLEG): container finished" podID="f07941f2-7f6c-497e-ad3d-6719b60f0111" containerID="5e6f9e3577b23333ac9599a21422786d7d85de81954b5eee35b2e32e971c4e2e" exitCode=0 Nov 22 10:29:16 crc kubenswrapper[4743]: I1122 10:29:16.490986 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" event={"ID":"f07941f2-7f6c-497e-ad3d-6719b60f0111","Type":"ContainerDied","Data":"5e6f9e3577b23333ac9599a21422786d7d85de81954b5eee35b2e32e971c4e2e"} Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.100408 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.273671 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ssh-key\") pod \"f07941f2-7f6c-497e-ad3d-6719b60f0111\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.273791 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ceph\") pod \"f07941f2-7f6c-497e-ad3d-6719b60f0111\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.273871 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-inventory\") pod \"f07941f2-7f6c-497e-ad3d-6719b60f0111\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.274042 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc5zp\" (UniqueName: \"kubernetes.io/projected/f07941f2-7f6c-497e-ad3d-6719b60f0111-kube-api-access-kc5zp\") pod \"f07941f2-7f6c-497e-ad3d-6719b60f0111\" (UID: \"f07941f2-7f6c-497e-ad3d-6719b60f0111\") " Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.282481 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ceph" (OuterVolumeSpecName: "ceph") pod "f07941f2-7f6c-497e-ad3d-6719b60f0111" (UID: "f07941f2-7f6c-497e-ad3d-6719b60f0111"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.313672 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f07941f2-7f6c-497e-ad3d-6719b60f0111-kube-api-access-kc5zp" (OuterVolumeSpecName: "kube-api-access-kc5zp") pod "f07941f2-7f6c-497e-ad3d-6719b60f0111" (UID: "f07941f2-7f6c-497e-ad3d-6719b60f0111"). InnerVolumeSpecName "kube-api-access-kc5zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.328919 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-inventory" (OuterVolumeSpecName: "inventory") pod "f07941f2-7f6c-497e-ad3d-6719b60f0111" (UID: "f07941f2-7f6c-497e-ad3d-6719b60f0111"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.362837 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f07941f2-7f6c-497e-ad3d-6719b60f0111" (UID: "f07941f2-7f6c-497e-ad3d-6719b60f0111"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.376341 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.376377 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.376387 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f07941f2-7f6c-497e-ad3d-6719b60f0111-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.376399 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc5zp\" (UniqueName: \"kubernetes.io/projected/f07941f2-7f6c-497e-ad3d-6719b60f0111-kube-api-access-kc5zp\") on node \"crc\" DevicePath \"\"" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.513069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" event={"ID":"f07941f2-7f6c-497e-ad3d-6719b60f0111","Type":"ContainerDied","Data":"25e9c522b12c68bedfcfe026ac08284c4507e80dd264a2273edcf7d3a6e47ca7"} Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.513124 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25e9c522b12c68bedfcfe026ac08284c4507e80dd264a2273edcf7d3a6e47ca7" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.513210 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-vz8zn" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.589790 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-xwbpr"] Nov 22 10:29:18 crc kubenswrapper[4743]: E1122 10:29:18.590241 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f07941f2-7f6c-497e-ad3d-6719b60f0111" containerName="ceph-client-openstack-openstack-cell1" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.590260 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f07941f2-7f6c-497e-ad3d-6719b60f0111" containerName="ceph-client-openstack-openstack-cell1" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.590497 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f07941f2-7f6c-497e-ad3d-6719b60f0111" containerName="ceph-client-openstack-openstack-cell1" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.591372 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.598847 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.599294 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.599525 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.599736 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.599968 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.611102 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-xwbpr"] Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.683663 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ceph\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.683715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2zv4\" (UniqueName: \"kubernetes.io/projected/3e80ff58-1768-4d4a-b759-d9d882510ff8-kube-api-access-j2zv4\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.683776 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-inventory\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.683861 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.683988 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.684021 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ssh-key\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: 
\"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.785677 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-inventory\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.786019 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.786244 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.786376 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ssh-key\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.786533 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2zv4\" (UniqueName: \"kubernetes.io/projected/3e80ff58-1768-4d4a-b759-d9d882510ff8-kube-api-access-j2zv4\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.786646 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ceph\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.787621 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.790769 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ssh-key\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.790991 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ceph\") pod 
\"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.794155 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-inventory\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.794609 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.803731 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2zv4\" (UniqueName: \"kubernetes.io/projected/3e80ff58-1768-4d4a-b759-d9d882510ff8-kube-api-access-j2zv4\") pod \"ovn-openstack-openstack-cell1-xwbpr\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:18 crc kubenswrapper[4743]: I1122 10:29:18.919950 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:29:19 crc kubenswrapper[4743]: I1122 10:29:19.505411 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-xwbpr"] Nov 22 10:29:19 crc kubenswrapper[4743]: I1122 10:29:19.522257 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-xwbpr" event={"ID":"3e80ff58-1768-4d4a-b759-d9d882510ff8","Type":"ContainerStarted","Data":"58b65e78c10839764be68c66246d8d207df455beb0ca02425e6ee5b185f011f5"} Nov 22 10:29:20 crc kubenswrapper[4743]: I1122 10:29:20.532887 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-xwbpr" event={"ID":"3e80ff58-1768-4d4a-b759-d9d882510ff8","Type":"ContainerStarted","Data":"9f5c3d0e9fea66f465658d78d39161c62cd2d6800241df929e4dfa65d3aa8955"} Nov 22 10:29:20 crc kubenswrapper[4743]: I1122 10:29:20.550378 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-xwbpr" podStartSLOduration=2.162066379 podStartE2EDuration="2.550360741s" podCreationTimestamp="2025-11-22 10:29:18 +0000 UTC" firstStartedPulling="2025-11-22 10:29:19.509066282 +0000 UTC m=+7633.215427334" lastFinishedPulling="2025-11-22 10:29:19.897360634 +0000 UTC m=+7633.603721696" observedRunningTime="2025-11-22 10:29:20.547238662 +0000 UTC m=+7634.253599714" watchObservedRunningTime="2025-11-22 10:29:20.550360741 +0000 UTC m=+7634.256721783" Nov 22 10:29:31 crc kubenswrapper[4743]: I1122 10:29:31.241106 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:29:31 crc kubenswrapper[4743]: I1122 10:29:31.241972 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" 
podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.149712 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8"] Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.155943 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.158681 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.160590 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.164376 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8"] Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.327715 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f6dz\" (UniqueName: \"kubernetes.io/projected/f170712f-3df4-4ee8-99dd-308c78dce5f5-kube-api-access-8f6dz\") pod \"collect-profiles-29396790-cbtl8\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.327768 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f170712f-3df4-4ee8-99dd-308c78dce5f5-secret-volume\") pod \"collect-profiles-29396790-cbtl8\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.327799 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f170712f-3df4-4ee8-99dd-308c78dce5f5-config-volume\") pod \"collect-profiles-29396790-cbtl8\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.429913 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8f6dz\" (UniqueName: \"kubernetes.io/projected/f170712f-3df4-4ee8-99dd-308c78dce5f5-kube-api-access-8f6dz\") pod \"collect-profiles-29396790-cbtl8\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.429959 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f170712f-3df4-4ee8-99dd-308c78dce5f5-secret-volume\") pod \"collect-profiles-29396790-cbtl8\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.429987 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/f170712f-3df4-4ee8-99dd-308c78dce5f5-config-volume\") pod \"collect-profiles-29396790-cbtl8\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.430944 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f170712f-3df4-4ee8-99dd-308c78dce5f5-config-volume\") pod \"collect-profiles-29396790-cbtl8\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.435221 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f170712f-3df4-4ee8-99dd-308c78dce5f5-secret-volume\") pod \"collect-profiles-29396790-cbtl8\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.448994 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f6dz\" (UniqueName: \"kubernetes.io/projected/f170712f-3df4-4ee8-99dd-308c78dce5f5-kube-api-access-8f6dz\") pod \"collect-profiles-29396790-cbtl8\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.489699 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.954483 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8"] Nov 22 10:30:00 crc kubenswrapper[4743]: I1122 10:30:00.968904 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" event={"ID":"f170712f-3df4-4ee8-99dd-308c78dce5f5","Type":"ContainerStarted","Data":"3fa1f07c0fca679c11cfe0076be2e39d243f3a85dcb3b882f43f11e8f2939c5d"} Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.241074 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.241506 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.241552 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.242440 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"40328998db4cc7d85f34000888325dea9a42d2a672d786787ca0b5e402b918dd"} 
pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.242502 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://40328998db4cc7d85f34000888325dea9a42d2a672d786787ca0b5e402b918dd" gracePeriod=600 Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.982713 4743 generic.go:334] "Generic (PLEG): container finished" podID="f170712f-3df4-4ee8-99dd-308c78dce5f5" containerID="2caaa734dbe8b17f8ca0a964de030ed4f89482693a31c43feb9d362afb59a86e" exitCode=0 Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.982829 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" event={"ID":"f170712f-3df4-4ee8-99dd-308c78dce5f5","Type":"ContainerDied","Data":"2caaa734dbe8b17f8ca0a964de030ed4f89482693a31c43feb9d362afb59a86e"} Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.988953 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="40328998db4cc7d85f34000888325dea9a42d2a672d786787ca0b5e402b918dd" exitCode=0 Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.988996 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"40328998db4cc7d85f34000888325dea9a42d2a672d786787ca0b5e402b918dd"} Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.989021 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a"} Nov 22 10:30:01 crc kubenswrapper[4743]: I1122 10:30:01.989041 4743 scope.go:117] "RemoveContainer" containerID="776dce2bd03e455984ca8d926febac0c1ad5e0f357b730a9914a6b72e760f307" Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.378245 4743 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.378245 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.495568 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8f6dz\" (UniqueName: \"kubernetes.io/projected/f170712f-3df4-4ee8-99dd-308c78dce5f5-kube-api-access-8f6dz\") pod \"f170712f-3df4-4ee8-99dd-308c78dce5f5\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.496069 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f170712f-3df4-4ee8-99dd-308c78dce5f5-secret-volume\") pod \"f170712f-3df4-4ee8-99dd-308c78dce5f5\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.496176 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f170712f-3df4-4ee8-99dd-308c78dce5f5-config-volume\") pod \"f170712f-3df4-4ee8-99dd-308c78dce5f5\" (UID: \"f170712f-3df4-4ee8-99dd-308c78dce5f5\") " Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.497312 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f170712f-3df4-4ee8-99dd-308c78dce5f5-config-volume" (OuterVolumeSpecName: "config-volume") pod "f170712f-3df4-4ee8-99dd-308c78dce5f5" (UID: "f170712f-3df4-4ee8-99dd-308c78dce5f5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.502175 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f170712f-3df4-4ee8-99dd-308c78dce5f5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f170712f-3df4-4ee8-99dd-308c78dce5f5" (UID: "f170712f-3df4-4ee8-99dd-308c78dce5f5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.506004 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f170712f-3df4-4ee8-99dd-308c78dce5f5-kube-api-access-8f6dz" (OuterVolumeSpecName: "kube-api-access-8f6dz") pod "f170712f-3df4-4ee8-99dd-308c78dce5f5" (UID: "f170712f-3df4-4ee8-99dd-308c78dce5f5"). InnerVolumeSpecName "kube-api-access-8f6dz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.598732 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8f6dz\" (UniqueName: \"kubernetes.io/projected/f170712f-3df4-4ee8-99dd-308c78dce5f5-kube-api-access-8f6dz\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.598778 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f170712f-3df4-4ee8-99dd-308c78dce5f5-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:03 crc kubenswrapper[4743]: I1122 10:30:03.598792 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f170712f-3df4-4ee8-99dd-308c78dce5f5-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:04 crc kubenswrapper[4743]: I1122 10:30:04.020551 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" event={"ID":"f170712f-3df4-4ee8-99dd-308c78dce5f5","Type":"ContainerDied","Data":"3fa1f07c0fca679c11cfe0076be2e39d243f3a85dcb3b882f43f11e8f2939c5d"} Nov 22 10:30:04 crc kubenswrapper[4743]: I1122 10:30:04.021031 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fa1f07c0fca679c11cfe0076be2e39d243f3a85dcb3b882f43f11e8f2939c5d" Nov 22 10:30:04 crc kubenswrapper[4743]: I1122 10:30:04.020626 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8" Nov 22 10:30:04 crc kubenswrapper[4743]: I1122 10:30:04.451490 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7"] Nov 22 10:30:04 crc kubenswrapper[4743]: I1122 10:30:04.459276 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396745-t9fk7"] Nov 22 10:30:05 crc kubenswrapper[4743]: I1122 10:30:05.179510 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9b2454e-4ff4-42b3-aed8-fe654256639a" path="/var/lib/kubelet/pods/d9b2454e-4ff4-42b3-aed8-fe654256639a/volumes" Nov 22 10:30:28 crc kubenswrapper[4743]: I1122 10:30:28.348159 4743 generic.go:334] "Generic (PLEG): container finished" podID="3e80ff58-1768-4d4a-b759-d9d882510ff8" containerID="9f5c3d0e9fea66f465658d78d39161c62cd2d6800241df929e4dfa65d3aa8955" exitCode=0 Nov 22 10:30:28 crc kubenswrapper[4743]: I1122 10:30:28.349518 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-xwbpr" event={"ID":"3e80ff58-1768-4d4a-b759-d9d882510ff8","Type":"ContainerDied","Data":"9f5c3d0e9fea66f465658d78d39161c62cd2d6800241df929e4dfa65d3aa8955"} Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.810075 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.914705 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ssh-key\") pod \"3e80ff58-1768-4d4a-b759-d9d882510ff8\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.914864 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ceph\") pod \"3e80ff58-1768-4d4a-b759-d9d882510ff8\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.915025 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-inventory\") pod \"3e80ff58-1768-4d4a-b759-d9d882510ff8\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.915656 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2zv4\" (UniqueName: \"kubernetes.io/projected/3e80ff58-1768-4d4a-b759-d9d882510ff8-kube-api-access-j2zv4\") pod \"3e80ff58-1768-4d4a-b759-d9d882510ff8\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.915693 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovn-combined-ca-bundle\") pod \"3e80ff58-1768-4d4a-b759-d9d882510ff8\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.915727 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovncontroller-config-0\") pod \"3e80ff58-1768-4d4a-b759-d9d882510ff8\" (UID: \"3e80ff58-1768-4d4a-b759-d9d882510ff8\") " Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.920675 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e80ff58-1768-4d4a-b759-d9d882510ff8-kube-api-access-j2zv4" (OuterVolumeSpecName: "kube-api-access-j2zv4") pod "3e80ff58-1768-4d4a-b759-d9d882510ff8" (UID: "3e80ff58-1768-4d4a-b759-d9d882510ff8"). InnerVolumeSpecName "kube-api-access-j2zv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.920950 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3e80ff58-1768-4d4a-b759-d9d882510ff8" (UID: "3e80ff58-1768-4d4a-b759-d9d882510ff8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.921055 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ceph" (OuterVolumeSpecName: "ceph") pod "3e80ff58-1768-4d4a-b759-d9d882510ff8" (UID: "3e80ff58-1768-4d4a-b759-d9d882510ff8"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.944343 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-inventory" (OuterVolumeSpecName: "inventory") pod "3e80ff58-1768-4d4a-b759-d9d882510ff8" (UID: "3e80ff58-1768-4d4a-b759-d9d882510ff8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.951361 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3e80ff58-1768-4d4a-b759-d9d882510ff8" (UID: "3e80ff58-1768-4d4a-b759-d9d882510ff8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:30:29 crc kubenswrapper[4743]: I1122 10:30:29.960749 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3e80ff58-1768-4d4a-b759-d9d882510ff8" (UID: "3e80ff58-1768-4d4a-b759-d9d882510ff8"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.018553 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.018605 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.018616 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2zv4\" (UniqueName: \"kubernetes.io/projected/3e80ff58-1768-4d4a-b759-d9d882510ff8-kube-api-access-j2zv4\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.018626 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.018636 4743 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3e80ff58-1768-4d4a-b759-d9d882510ff8-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.018645 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3e80ff58-1768-4d4a-b759-d9d882510ff8-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.378410 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-xwbpr" event={"ID":"3e80ff58-1768-4d4a-b759-d9d882510ff8","Type":"ContainerDied","Data":"58b65e78c10839764be68c66246d8d207df455beb0ca02425e6ee5b185f011f5"} Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.378471 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-xwbpr" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.378480 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58b65e78c10839764be68c66246d8d207df455beb0ca02425e6ee5b185f011f5" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.488475 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-q9bgn"] Nov 22 10:30:30 crc kubenswrapper[4743]: E1122 10:30:30.489007 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f170712f-3df4-4ee8-99dd-308c78dce5f5" containerName="collect-profiles" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.489031 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f170712f-3df4-4ee8-99dd-308c78dce5f5" containerName="collect-profiles" Nov 22 10:30:30 crc kubenswrapper[4743]: E1122 10:30:30.489105 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e80ff58-1768-4d4a-b759-d9d882510ff8" containerName="ovn-openstack-openstack-cell1" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.489116 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e80ff58-1768-4d4a-b759-d9d882510ff8" containerName="ovn-openstack-openstack-cell1" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.489394 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e80ff58-1768-4d4a-b759-d9d882510ff8" containerName="ovn-openstack-openstack-cell1" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.489436 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f170712f-3df4-4ee8-99dd-308c78dce5f5" containerName="collect-profiles" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.490438 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.494487 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.494699 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.494737 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.494818 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.494952 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.498999 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.503832 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-q9bgn"] Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.633096 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.633496 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.633531 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.633556 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.633661 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: 
\"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.633751 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.633855 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjqx6\" (UniqueName: \"kubernetes.io/projected/28484c70-513c-41a2-b0f7-5922002be895-kube-api-access-bjqx6\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.736131 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.736226 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.736291 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjqx6\" (UniqueName: \"kubernetes.io/projected/28484c70-513c-41a2-b0f7-5922002be895-kube-api-access-bjqx6\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.736354 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.736380 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.736407 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ceph\") pod 
\"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.736431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.742052 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.742214 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.742422 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.743027 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.743429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.754955 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjqx6\" (UniqueName: \"kubernetes.io/projected/28484c70-513c-41a2-b0f7-5922002be895-kube-api-access-bjqx6\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.759812 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-q9bgn\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:30 crc kubenswrapper[4743]: I1122 10:30:30.817876 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:30:31 crc kubenswrapper[4743]: I1122 10:30:31.392318 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-q9bgn"] Nov 22 10:30:32 crc kubenswrapper[4743]: I1122 10:30:32.404592 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" event={"ID":"28484c70-513c-41a2-b0f7-5922002be895","Type":"ContainerStarted","Data":"accab3d4d99d1e0c7d704b27e80599cf4e41e83db2e316c620a93e7b3c970691"} Nov 22 10:30:32 crc kubenswrapper[4743]: I1122 10:30:32.404971 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" event={"ID":"28484c70-513c-41a2-b0f7-5922002be895","Type":"ContainerStarted","Data":"50d48a6c9a0eab99a6a0228ba98c5fbb23e1eb41bee0000c0bc624467b353913"} Nov 22 10:30:32 crc kubenswrapper[4743]: I1122 10:30:32.430078 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" podStartSLOduration=1.8461774549999999 podStartE2EDuration="2.430052639s" podCreationTimestamp="2025-11-22 10:30:30 +0000 UTC" firstStartedPulling="2025-11-22 10:30:31.405606614 +0000 UTC m=+7705.111967666" lastFinishedPulling="2025-11-22 10:30:31.989481798 +0000 UTC m=+7705.695842850" observedRunningTime="2025-11-22 10:30:32.417956712 +0000 UTC m=+7706.124317784" watchObservedRunningTime="2025-11-22 10:30:32.430052639 +0000 UTC m=+7706.136413701" Nov 22 10:30:38 crc kubenswrapper[4743]: I1122 10:30:38.068902 4743 scope.go:117] "RemoveContainer" containerID="fc19a6ab174358a3de2825506376a50ca1ce69796262542d83926eef3cbe9193" Nov 22 10:31:26 crc kubenswrapper[4743]: I1122 10:31:26.974060 4743 generic.go:334] "Generic (PLEG): container finished" podID="28484c70-513c-41a2-b0f7-5922002be895" containerID="accab3d4d99d1e0c7d704b27e80599cf4e41e83db2e316c620a93e7b3c970691" exitCode=0 Nov 22 10:31:26 crc kubenswrapper[4743]: I1122 10:31:26.974158 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" event={"ID":"28484c70-513c-41a2-b0f7-5922002be895","Type":"ContainerDied","Data":"accab3d4d99d1e0c7d704b27e80599cf4e41e83db2e316c620a93e7b3c970691"} Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.459074 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.569993 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-metadata-combined-ca-bundle\") pod \"28484c70-513c-41a2-b0f7-5922002be895\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.570109 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ssh-key\") pod \"28484c70-513c-41a2-b0f7-5922002be895\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.570173 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ceph\") pod \"28484c70-513c-41a2-b0f7-5922002be895\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.570263 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-ovn-metadata-agent-neutron-config-0\") pod \"28484c70-513c-41a2-b0f7-5922002be895\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.570387 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-nova-metadata-neutron-config-0\") pod \"28484c70-513c-41a2-b0f7-5922002be895\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.570488 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjqx6\" (UniqueName: \"kubernetes.io/projected/28484c70-513c-41a2-b0f7-5922002be895-kube-api-access-bjqx6\") pod \"28484c70-513c-41a2-b0f7-5922002be895\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.570511 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-inventory\") pod \"28484c70-513c-41a2-b0f7-5922002be895\" (UID: \"28484c70-513c-41a2-b0f7-5922002be895\") " Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.575128 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ceph" (OuterVolumeSpecName: "ceph") pod "28484c70-513c-41a2-b0f7-5922002be895" (UID: "28484c70-513c-41a2-b0f7-5922002be895"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.576111 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28484c70-513c-41a2-b0f7-5922002be895-kube-api-access-bjqx6" (OuterVolumeSpecName: "kube-api-access-bjqx6") pod "28484c70-513c-41a2-b0f7-5922002be895" (UID: "28484c70-513c-41a2-b0f7-5922002be895"). InnerVolumeSpecName "kube-api-access-bjqx6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.583925 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "28484c70-513c-41a2-b0f7-5922002be895" (UID: "28484c70-513c-41a2-b0f7-5922002be895"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.602938 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "28484c70-513c-41a2-b0f7-5922002be895" (UID: "28484c70-513c-41a2-b0f7-5922002be895"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.611443 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "28484c70-513c-41a2-b0f7-5922002be895" (UID: "28484c70-513c-41a2-b0f7-5922002be895"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.611412 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "28484c70-513c-41a2-b0f7-5922002be895" (UID: "28484c70-513c-41a2-b0f7-5922002be895"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.635874 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-inventory" (OuterVolumeSpecName: "inventory") pod "28484c70-513c-41a2-b0f7-5922002be895" (UID: "28484c70-513c-41a2-b0f7-5922002be895"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.677246 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.677281 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.677294 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.677312 4743 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.677325 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjqx6\" (UniqueName: \"kubernetes.io/projected/28484c70-513c-41a2-b0f7-5922002be895-kube-api-access-bjqx6\") on node \"crc\" DevicePath \"\"" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.677333 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.677343 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28484c70-513c-41a2-b0f7-5922002be895-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.997382 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" event={"ID":"28484c70-513c-41a2-b0f7-5922002be895","Type":"ContainerDied","Data":"50d48a6c9a0eab99a6a0228ba98c5fbb23e1eb41bee0000c0bc624467b353913"} Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.997732 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d48a6c9a0eab99a6a0228ba98c5fbb23e1eb41bee0000c0bc624467b353913" Nov 22 10:31:28 crc kubenswrapper[4743]: I1122 10:31:28.997441 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-q9bgn" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.093641 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-vt6f7"] Nov 22 10:31:29 crc kubenswrapper[4743]: E1122 10:31:29.094173 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28484c70-513c-41a2-b0f7-5922002be895" containerName="neutron-metadata-openstack-openstack-cell1" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.094245 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="28484c70-513c-41a2-b0f7-5922002be895" containerName="neutron-metadata-openstack-openstack-cell1" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.094652 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="28484c70-513c-41a2-b0f7-5922002be895" containerName="neutron-metadata-openstack-openstack-cell1" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.098741 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.103496 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.103857 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.104666 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.104943 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.105004 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.127222 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-vt6f7"] Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.288120 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ssh-key\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.288236 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.288283 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xdv7\" (UniqueName: \"kubernetes.io/projected/65ef4bef-62b0-4592-94a2-d93d8679ce08-kube-api-access-7xdv7\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.288327 4743 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ceph\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.288412 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.288432 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-inventory\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.389929 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xdv7\" (UniqueName: \"kubernetes.io/projected/65ef4bef-62b0-4592-94a2-d93d8679ce08-kube-api-access-7xdv7\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.389993 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ceph\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.390028 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.390046 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-inventory\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.390161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ssh-key\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.390233 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: 
\"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.393878 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ssh-key\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.394510 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.395507 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ceph\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.396022 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.396363 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-inventory\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.420925 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xdv7\" (UniqueName: \"kubernetes.io/projected/65ef4bef-62b0-4592-94a2-d93d8679ce08-kube-api-access-7xdv7\") pod \"libvirt-openstack-openstack-cell1-vt6f7\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.423007 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:31:29 crc kubenswrapper[4743]: W1122 10:31:29.975710 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65ef4bef_62b0_4592_94a2_d93d8679ce08.slice/crio-ffb19095da1b233c7c96635427f298997386753a17df776c9bce17c834a61035 WatchSource:0}: Error finding container ffb19095da1b233c7c96635427f298997386753a17df776c9bce17c834a61035: Status 404 returned error can't find the container with id ffb19095da1b233c7c96635427f298997386753a17df776c9bce17c834a61035 Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.980325 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 10:31:29 crc kubenswrapper[4743]: I1122 10:31:29.980545 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-vt6f7"] Nov 22 10:31:30 crc kubenswrapper[4743]: I1122 10:31:30.021011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" event={"ID":"65ef4bef-62b0-4592-94a2-d93d8679ce08","Type":"ContainerStarted","Data":"ffb19095da1b233c7c96635427f298997386753a17df776c9bce17c834a61035"} Nov 22 10:31:31 crc kubenswrapper[4743]: I1122 10:31:31.044292 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" event={"ID":"65ef4bef-62b0-4592-94a2-d93d8679ce08","Type":"ContainerStarted","Data":"4388a79750fc48982c6e1bbf26b496d4adacc242ef7cef0d937b68388c675378"} Nov 22 10:31:31 crc kubenswrapper[4743]: I1122 10:31:31.064630 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" podStartSLOduration=1.664801702 podStartE2EDuration="2.064611484s" podCreationTimestamp="2025-11-22 10:31:29 +0000 UTC" firstStartedPulling="2025-11-22 10:31:29.980036944 +0000 UTC m=+7763.686397996" lastFinishedPulling="2025-11-22 10:31:30.379846726 +0000 UTC m=+7764.086207778" observedRunningTime="2025-11-22 10:31:31.064241533 +0000 UTC m=+7764.770602605" watchObservedRunningTime="2025-11-22 10:31:31.064611484 +0000 UTC m=+7764.770972536" Nov 22 10:32:01 crc kubenswrapper[4743]: I1122 10:32:01.241610 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:32:01 crc kubenswrapper[4743]: I1122 10:32:01.242280 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:32:31 crc kubenswrapper[4743]: I1122 10:32:31.240904 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:32:31 crc kubenswrapper[4743]: I1122 10:32:31.243098 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" 
podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:32:57 crc kubenswrapper[4743]: I1122 10:32:57.850356 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f2tgw"] Nov 22 10:32:57 crc kubenswrapper[4743]: I1122 10:32:57.853754 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:32:57 crc kubenswrapper[4743]: I1122 10:32:57.868551 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2tgw"] Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.003422 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-utilities\") pod \"redhat-marketplace-f2tgw\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.003837 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtqqz\" (UniqueName: \"kubernetes.io/projected/d2fe984c-6618-4398-a8c0-6d4cc515c580-kube-api-access-gtqqz\") pod \"redhat-marketplace-f2tgw\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.003977 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-catalog-content\") pod \"redhat-marketplace-f2tgw\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.106043 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtqqz\" (UniqueName: \"kubernetes.io/projected/d2fe984c-6618-4398-a8c0-6d4cc515c580-kube-api-access-gtqqz\") pod \"redhat-marketplace-f2tgw\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.106327 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-catalog-content\") pod \"redhat-marketplace-f2tgw\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.106390 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-utilities\") pod \"redhat-marketplace-f2tgw\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.106946 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-catalog-content\") pod \"redhat-marketplace-f2tgw\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " pod="openshift-marketplace/redhat-marketplace-f2tgw" 
Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.106962 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-utilities\") pod \"redhat-marketplace-f2tgw\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.128981 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtqqz\" (UniqueName: \"kubernetes.io/projected/d2fe984c-6618-4398-a8c0-6d4cc515c580-kube-api-access-gtqqz\") pod \"redhat-marketplace-f2tgw\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.196337 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.738094 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2tgw"] Nov 22 10:32:58 crc kubenswrapper[4743]: I1122 10:32:58.922962 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2tgw" event={"ID":"d2fe984c-6618-4398-a8c0-6d4cc515c580","Type":"ContainerStarted","Data":"3bb750f7ab68f0ec1155f183952f770fd1cf85f2543f201005a1ad2b8adaa876"} Nov 22 10:32:59 crc kubenswrapper[4743]: I1122 10:32:59.938341 4743 generic.go:334] "Generic (PLEG): container finished" podID="d2fe984c-6618-4398-a8c0-6d4cc515c580" containerID="95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f" exitCode=0 Nov 22 10:32:59 crc kubenswrapper[4743]: I1122 10:32:59.938666 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2tgw" event={"ID":"d2fe984c-6618-4398-a8c0-6d4cc515c580","Type":"ContainerDied","Data":"95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f"} Nov 22 10:33:00 crc kubenswrapper[4743]: I1122 10:33:00.950299 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2tgw" event={"ID":"d2fe984c-6618-4398-a8c0-6d4cc515c580","Type":"ContainerStarted","Data":"6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481"} Nov 22 10:33:01 crc kubenswrapper[4743]: I1122 10:33:01.241462 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:33:01 crc kubenswrapper[4743]: I1122 10:33:01.241530 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:33:01 crc kubenswrapper[4743]: I1122 10:33:01.241624 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 10:33:01 crc kubenswrapper[4743]: I1122 10:33:01.960071 4743 generic.go:334] "Generic (PLEG): container finished" podID="d2fe984c-6618-4398-a8c0-6d4cc515c580" 
containerID="6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481" exitCode=0 Nov 22 10:33:01 crc kubenswrapper[4743]: I1122 10:33:01.960631 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2tgw" event={"ID":"d2fe984c-6618-4398-a8c0-6d4cc515c580","Type":"ContainerDied","Data":"6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481"} Nov 22 10:33:01 crc kubenswrapper[4743]: I1122 10:33:01.960985 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:33:01 crc kubenswrapper[4743]: I1122 10:33:01.961060 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" gracePeriod=600 Nov 22 10:33:02 crc kubenswrapper[4743]: E1122 10:33:02.095710 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:33:02 crc kubenswrapper[4743]: I1122 10:33:02.977064 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" exitCode=0 Nov 22 10:33:02 crc kubenswrapper[4743]: I1122 10:33:02.977153 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a"} Nov 22 10:33:02 crc kubenswrapper[4743]: I1122 10:33:02.977779 4743 scope.go:117] "RemoveContainer" containerID="40328998db4cc7d85f34000888325dea9a42d2a672d786787ca0b5e402b918dd" Nov 22 10:33:02 crc kubenswrapper[4743]: I1122 10:33:02.978616 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:33:02 crc kubenswrapper[4743]: E1122 10:33:02.979138 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:33:02 crc kubenswrapper[4743]: I1122 10:33:02.984673 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2tgw" event={"ID":"d2fe984c-6618-4398-a8c0-6d4cc515c580","Type":"ContainerStarted","Data":"af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109"} Nov 22 10:33:03 crc kubenswrapper[4743]: I1122 
10:33:03.028539 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f2tgw" podStartSLOduration=3.612097219 podStartE2EDuration="6.028517485s" podCreationTimestamp="2025-11-22 10:32:57 +0000 UTC" firstStartedPulling="2025-11-22 10:32:59.946793739 +0000 UTC m=+7853.653154791" lastFinishedPulling="2025-11-22 10:33:02.363214005 +0000 UTC m=+7856.069575057" observedRunningTime="2025-11-22 10:33:03.02067905 +0000 UTC m=+7856.727040102" watchObservedRunningTime="2025-11-22 10:33:03.028517485 +0000 UTC m=+7856.734878537" Nov 22 10:33:08 crc kubenswrapper[4743]: I1122 10:33:08.197003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:33:08 crc kubenswrapper[4743]: I1122 10:33:08.198173 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:33:08 crc kubenswrapper[4743]: I1122 10:33:08.280215 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:33:09 crc kubenswrapper[4743]: I1122 10:33:09.149145 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:33:09 crc kubenswrapper[4743]: I1122 10:33:09.205559 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2tgw"] Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.085039 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-f2tgw" podUID="d2fe984c-6618-4398-a8c0-6d4cc515c580" containerName="registry-server" containerID="cri-o://af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109" gracePeriod=2 Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.609915 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.729458 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-catalog-content\") pod \"d2fe984c-6618-4398-a8c0-6d4cc515c580\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.729569 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-utilities\") pod \"d2fe984c-6618-4398-a8c0-6d4cc515c580\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.729645 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtqqz\" (UniqueName: \"kubernetes.io/projected/d2fe984c-6618-4398-a8c0-6d4cc515c580-kube-api-access-gtqqz\") pod \"d2fe984c-6618-4398-a8c0-6d4cc515c580\" (UID: \"d2fe984c-6618-4398-a8c0-6d4cc515c580\") " Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.742873 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-utilities" (OuterVolumeSpecName: "utilities") pod "d2fe984c-6618-4398-a8c0-6d4cc515c580" (UID: "d2fe984c-6618-4398-a8c0-6d4cc515c580"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.743014 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2fe984c-6618-4398-a8c0-6d4cc515c580-kube-api-access-gtqqz" (OuterVolumeSpecName: "kube-api-access-gtqqz") pod "d2fe984c-6618-4398-a8c0-6d4cc515c580" (UID: "d2fe984c-6618-4398-a8c0-6d4cc515c580"). InnerVolumeSpecName "kube-api-access-gtqqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.753191 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2fe984c-6618-4398-a8c0-6d4cc515c580" (UID: "d2fe984c-6618-4398-a8c0-6d4cc515c580"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.831397 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.831437 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fe984c-6618-4398-a8c0-6d4cc515c580-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:33:11 crc kubenswrapper[4743]: I1122 10:33:11.831447 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtqqz\" (UniqueName: \"kubernetes.io/projected/d2fe984c-6618-4398-a8c0-6d4cc515c580-kube-api-access-gtqqz\") on node \"crc\" DevicePath \"\"" Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.097987 4743 generic.go:334] "Generic (PLEG): container finished" podID="d2fe984c-6618-4398-a8c0-6d4cc515c580" containerID="af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109" exitCode=0 Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.098028 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2tgw" event={"ID":"d2fe984c-6618-4398-a8c0-6d4cc515c580","Type":"ContainerDied","Data":"af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109"} Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.098054 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2tgw" event={"ID":"d2fe984c-6618-4398-a8c0-6d4cc515c580","Type":"ContainerDied","Data":"3bb750f7ab68f0ec1155f183952f770fd1cf85f2543f201005a1ad2b8adaa876"} Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.098072 4743 scope.go:117] "RemoveContainer" containerID="af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109" Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.098189 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2tgw" Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.128365 4743 scope.go:117] "RemoveContainer" containerID="6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481" Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.128493 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2tgw"] Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.139824 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2tgw"] Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.162355 4743 scope.go:117] "RemoveContainer" containerID="95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f" Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.197973 4743 scope.go:117] "RemoveContainer" containerID="af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109" Nov 22 10:33:12 crc kubenswrapper[4743]: E1122 10:33:12.198530 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109\": container with ID starting with af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109 not found: ID does not exist" containerID="af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109" Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.198600 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109"} err="failed to get container status \"af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109\": rpc error: code = NotFound desc = could not find container \"af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109\": container with ID starting with af16b459b993e37dd57f3dc930726d2ee744140142a678d9a66af5568c6f2109 not found: ID does not exist" Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.198629 4743 scope.go:117] "RemoveContainer" containerID="6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481" Nov 22 10:33:12 crc kubenswrapper[4743]: E1122 10:33:12.199047 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481\": container with ID starting with 6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481 not found: ID does not exist" containerID="6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481" Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.199067 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481"} err="failed to get container status \"6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481\": rpc error: code = NotFound desc = could not find container \"6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481\": container with ID starting with 6ee442410604f3d8047f48d6fa88e75b172a238ccd292fb874244bea361e9481 not found: ID does not exist" Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.199081 4743 scope.go:117] "RemoveContainer" containerID="95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f" Nov 22 10:33:12 crc kubenswrapper[4743]: E1122 10:33:12.199552 4743 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f\": container with ID starting with 95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f not found: ID does not exist" containerID="95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f" Nov 22 10:33:12 crc kubenswrapper[4743]: I1122 10:33:12.199618 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f"} err="failed to get container status \"95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f\": rpc error: code = NotFound desc = could not find container \"95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f\": container with ID starting with 95272ca8806e6ce575f42292356383d50415ff64422201c44ea6db58a492bc4f not found: ID does not exist" Nov 22 10:33:13 crc kubenswrapper[4743]: I1122 10:33:13.163237 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2fe984c-6618-4398-a8c0-6d4cc515c580" path="/var/lib/kubelet/pods/d2fe984c-6618-4398-a8c0-6d4cc515c580/volumes" Nov 22 10:33:16 crc kubenswrapper[4743]: I1122 10:33:16.152324 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:33:16 crc kubenswrapper[4743]: E1122 10:33:16.152963 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:33:30 crc kubenswrapper[4743]: I1122 10:33:30.152005 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:33:30 crc kubenswrapper[4743]: E1122 10:33:30.154910 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:33:42 crc kubenswrapper[4743]: I1122 10:33:42.152169 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:33:42 crc kubenswrapper[4743]: E1122 10:33:42.152997 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:33:54 crc kubenswrapper[4743]: I1122 10:33:54.151743 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:33:54 crc kubenswrapper[4743]: E1122 10:33:54.152634 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.135739 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qfck4"] Nov 22 10:33:57 crc kubenswrapper[4743]: E1122 10:33:57.138819 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fe984c-6618-4398-a8c0-6d4cc515c580" containerName="registry-server" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.138884 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fe984c-6618-4398-a8c0-6d4cc515c580" containerName="registry-server" Nov 22 10:33:57 crc kubenswrapper[4743]: E1122 10:33:57.138899 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fe984c-6618-4398-a8c0-6d4cc515c580" containerName="extract-content" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.138906 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fe984c-6618-4398-a8c0-6d4cc515c580" containerName="extract-content" Nov 22 10:33:57 crc kubenswrapper[4743]: E1122 10:33:57.138924 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fe984c-6618-4398-a8c0-6d4cc515c580" containerName="extract-utilities" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.138930 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fe984c-6618-4398-a8c0-6d4cc515c580" containerName="extract-utilities" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.139158 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2fe984c-6618-4398-a8c0-6d4cc515c580" containerName="registry-server" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.141139 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.197560 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qfck4"] Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.314630 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-utilities\") pod \"certified-operators-qfck4\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.314729 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8dps\" (UniqueName: \"kubernetes.io/projected/4abf270f-d138-492a-8979-fd1ff1b77217-kube-api-access-n8dps\") pod \"certified-operators-qfck4\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.314799 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-catalog-content\") pod \"certified-operators-qfck4\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.417016 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-utilities\") pod \"certified-operators-qfck4\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.417091 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8dps\" (UniqueName: \"kubernetes.io/projected/4abf270f-d138-492a-8979-fd1ff1b77217-kube-api-access-n8dps\") pod \"certified-operators-qfck4\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.417133 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-catalog-content\") pod \"certified-operators-qfck4\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.417553 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-utilities\") pod \"certified-operators-qfck4\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.417730 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-catalog-content\") pod \"certified-operators-qfck4\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.440361 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-n8dps\" (UniqueName: \"kubernetes.io/projected/4abf270f-d138-492a-8979-fd1ff1b77217-kube-api-access-n8dps\") pod \"certified-operators-qfck4\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:57 crc kubenswrapper[4743]: I1122 10:33:57.482901 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:33:58 crc kubenswrapper[4743]: I1122 10:33:58.005113 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qfck4"] Nov 22 10:33:58 crc kubenswrapper[4743]: I1122 10:33:58.607401 4743 generic.go:334] "Generic (PLEG): container finished" podID="4abf270f-d138-492a-8979-fd1ff1b77217" containerID="c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390" exitCode=0 Nov 22 10:33:58 crc kubenswrapper[4743]: I1122 10:33:58.607529 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfck4" event={"ID":"4abf270f-d138-492a-8979-fd1ff1b77217","Type":"ContainerDied","Data":"c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390"} Nov 22 10:33:58 crc kubenswrapper[4743]: I1122 10:33:58.607792 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfck4" event={"ID":"4abf270f-d138-492a-8979-fd1ff1b77217","Type":"ContainerStarted","Data":"3f70efd4c23cd216048ce41e8e033e2380911c0d601518785499cfd5b8836cae"} Nov 22 10:33:59 crc kubenswrapper[4743]: I1122 10:33:59.619185 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfck4" event={"ID":"4abf270f-d138-492a-8979-fd1ff1b77217","Type":"ContainerStarted","Data":"a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba"} Nov 22 10:34:00 crc kubenswrapper[4743]: I1122 10:34:00.629594 4743 generic.go:334] "Generic (PLEG): container finished" podID="4abf270f-d138-492a-8979-fd1ff1b77217" containerID="a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba" exitCode=0 Nov 22 10:34:00 crc kubenswrapper[4743]: I1122 10:34:00.629724 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfck4" event={"ID":"4abf270f-d138-492a-8979-fd1ff1b77217","Type":"ContainerDied","Data":"a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba"} Nov 22 10:34:01 crc kubenswrapper[4743]: I1122 10:34:01.641494 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfck4" event={"ID":"4abf270f-d138-492a-8979-fd1ff1b77217","Type":"ContainerStarted","Data":"2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525"} Nov 22 10:34:01 crc kubenswrapper[4743]: I1122 10:34:01.663002 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qfck4" podStartSLOduration=2.268462971 podStartE2EDuration="4.662981189s" podCreationTimestamp="2025-11-22 10:33:57 +0000 UTC" firstStartedPulling="2025-11-22 10:33:58.610480872 +0000 UTC m=+7912.316841924" lastFinishedPulling="2025-11-22 10:34:01.00499909 +0000 UTC m=+7914.711360142" observedRunningTime="2025-11-22 10:34:01.657971285 +0000 UTC m=+7915.364332337" watchObservedRunningTime="2025-11-22 10:34:01.662981189 +0000 UTC m=+7915.369342251" Nov 22 10:34:07 crc kubenswrapper[4743]: I1122 10:34:07.483509 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:34:07 crc kubenswrapper[4743]: I1122 10:34:07.484152 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:34:07 crc kubenswrapper[4743]: I1122 10:34:07.564949 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:34:07 crc kubenswrapper[4743]: I1122 10:34:07.741855 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:34:07 crc kubenswrapper[4743]: I1122 10:34:07.801063 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qfck4"] Nov 22 10:34:08 crc kubenswrapper[4743]: I1122 10:34:08.151750 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:34:08 crc kubenswrapper[4743]: E1122 10:34:08.152450 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:34:09 crc kubenswrapper[4743]: I1122 10:34:09.714769 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qfck4" podUID="4abf270f-d138-492a-8979-fd1ff1b77217" containerName="registry-server" containerID="cri-o://2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525" gracePeriod=2 Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.193811 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.317112 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8dps\" (UniqueName: \"kubernetes.io/projected/4abf270f-d138-492a-8979-fd1ff1b77217-kube-api-access-n8dps\") pod \"4abf270f-d138-492a-8979-fd1ff1b77217\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.317173 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-catalog-content\") pod \"4abf270f-d138-492a-8979-fd1ff1b77217\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.317329 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-utilities\") pod \"4abf270f-d138-492a-8979-fd1ff1b77217\" (UID: \"4abf270f-d138-492a-8979-fd1ff1b77217\") " Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.318257 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-utilities" (OuterVolumeSpecName: "utilities") pod "4abf270f-d138-492a-8979-fd1ff1b77217" (UID: "4abf270f-d138-492a-8979-fd1ff1b77217"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.318971 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.324901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abf270f-d138-492a-8979-fd1ff1b77217-kube-api-access-n8dps" (OuterVolumeSpecName: "kube-api-access-n8dps") pod "4abf270f-d138-492a-8979-fd1ff1b77217" (UID: "4abf270f-d138-492a-8979-fd1ff1b77217"). InnerVolumeSpecName "kube-api-access-n8dps". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.394998 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4abf270f-d138-492a-8979-fd1ff1b77217" (UID: "4abf270f-d138-492a-8979-fd1ff1b77217"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.421250 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8dps\" (UniqueName: \"kubernetes.io/projected/4abf270f-d138-492a-8979-fd1ff1b77217-kube-api-access-n8dps\") on node \"crc\" DevicePath \"\"" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.421287 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4abf270f-d138-492a-8979-fd1ff1b77217-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.725693 4743 generic.go:334] "Generic (PLEG): container finished" podID="4abf270f-d138-492a-8979-fd1ff1b77217" containerID="2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525" exitCode=0 Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.725960 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfck4" event={"ID":"4abf270f-d138-492a-8979-fd1ff1b77217","Type":"ContainerDied","Data":"2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525"} Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.725987 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qfck4" event={"ID":"4abf270f-d138-492a-8979-fd1ff1b77217","Type":"ContainerDied","Data":"3f70efd4c23cd216048ce41e8e033e2380911c0d601518785499cfd5b8836cae"} Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.726004 4743 scope.go:117] "RemoveContainer" containerID="2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.726121 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qfck4" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.759886 4743 scope.go:117] "RemoveContainer" containerID="a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.778412 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qfck4"] Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.785011 4743 scope.go:117] "RemoveContainer" containerID="c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.786284 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qfck4"] Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.831765 4743 scope.go:117] "RemoveContainer" containerID="2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525" Nov 22 10:34:10 crc kubenswrapper[4743]: E1122 10:34:10.832363 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525\": container with ID starting with 2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525 not found: ID does not exist" containerID="2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.832406 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525"} err="failed to get container status \"2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525\": rpc error: code = NotFound desc = could not find container \"2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525\": container with ID starting with 2149cbd0dd092cf80b1589fe0726ba1ce21d1130158ee1c62642958cad1d0525 not found: ID does not exist" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.832433 4743 scope.go:117] "RemoveContainer" containerID="a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba" Nov 22 10:34:10 crc kubenswrapper[4743]: E1122 10:34:10.832876 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba\": container with ID starting with a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba not found: ID does not exist" containerID="a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.832902 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba"} err="failed to get container status \"a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba\": rpc error: code = NotFound desc = could not find container \"a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba\": container with ID starting with a401000320e3cc760eba0139c2a681ea25372980e5131685e0ab910eb3ef79ba not found: ID does not exist" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.832915 4743 scope.go:117] "RemoveContainer" containerID="c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390" Nov 22 10:34:10 crc kubenswrapper[4743]: E1122 10:34:10.833217 4743 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390\": container with ID starting with c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390 not found: ID does not exist" containerID="c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390" Nov 22 10:34:10 crc kubenswrapper[4743]: I1122 10:34:10.833256 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390"} err="failed to get container status \"c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390\": rpc error: code = NotFound desc = could not find container \"c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390\": container with ID starting with c5a38962f646e3407ffe94f204c9247096eb53e84fdd0c4359c12a8a6b82d390 not found: ID does not exist" Nov 22 10:34:11 crc kubenswrapper[4743]: I1122 10:34:11.165167 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4abf270f-d138-492a-8979-fd1ff1b77217" path="/var/lib/kubelet/pods/4abf270f-d138-492a-8979-fd1ff1b77217/volumes" Nov 22 10:34:23 crc kubenswrapper[4743]: I1122 10:34:23.151657 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:34:23 crc kubenswrapper[4743]: E1122 10:34:23.152445 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:34:35 crc kubenswrapper[4743]: I1122 10:34:35.152222 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:34:35 crc kubenswrapper[4743]: E1122 10:34:35.153183 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:34:46 crc kubenswrapper[4743]: I1122 10:34:46.152310 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:34:46 crc kubenswrapper[4743]: E1122 10:34:46.153242 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:34:58 crc kubenswrapper[4743]: I1122 10:34:58.151985 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:34:58 crc kubenswrapper[4743]: E1122 10:34:58.152983 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:35:13 crc kubenswrapper[4743]: I1122 10:35:13.151851 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:35:13 crc kubenswrapper[4743]: E1122 10:35:13.152628 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:35:25 crc kubenswrapper[4743]: I1122 10:35:25.152063 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:35:25 crc kubenswrapper[4743]: E1122 10:35:25.154624 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:35:37 crc kubenswrapper[4743]: I1122 10:35:37.158413 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:35:37 crc kubenswrapper[4743]: E1122 10:35:37.159500 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:35:50 crc kubenswrapper[4743]: I1122 10:35:50.151944 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:35:50 crc kubenswrapper[4743]: E1122 10:35:50.153453 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:36:05 crc kubenswrapper[4743]: I1122 10:36:05.151913 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:36:05 crc kubenswrapper[4743]: E1122 10:36:05.152753 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:36:09 crc kubenswrapper[4743]: I1122 10:36:09.997781 4743 generic.go:334] "Generic (PLEG): container finished" podID="65ef4bef-62b0-4592-94a2-d93d8679ce08" containerID="4388a79750fc48982c6e1bbf26b496d4adacc242ef7cef0d937b68388c675378" exitCode=0 Nov 22 10:36:09 crc kubenswrapper[4743]: I1122 10:36:09.997830 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" event={"ID":"65ef4bef-62b0-4592-94a2-d93d8679ce08","Type":"ContainerDied","Data":"4388a79750fc48982c6e1bbf26b496d4adacc242ef7cef0d937b68388c675378"} Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.578918 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.729988 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ssh-key\") pod \"65ef4bef-62b0-4592-94a2-d93d8679ce08\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.730089 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ceph\") pod \"65ef4bef-62b0-4592-94a2-d93d8679ce08\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.730229 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-combined-ca-bundle\") pod \"65ef4bef-62b0-4592-94a2-d93d8679ce08\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.731017 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-secret-0\") pod \"65ef4bef-62b0-4592-94a2-d93d8679ce08\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.731091 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xdv7\" (UniqueName: \"kubernetes.io/projected/65ef4bef-62b0-4592-94a2-d93d8679ce08-kube-api-access-7xdv7\") pod \"65ef4bef-62b0-4592-94a2-d93d8679ce08\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.731174 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-inventory\") pod \"65ef4bef-62b0-4592-94a2-d93d8679ce08\" (UID: \"65ef4bef-62b0-4592-94a2-d93d8679ce08\") " Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.735466 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ceph" (OuterVolumeSpecName: "ceph") pod "65ef4bef-62b0-4592-94a2-d93d8679ce08" (UID: "65ef4bef-62b0-4592-94a2-d93d8679ce08"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.735708 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ef4bef-62b0-4592-94a2-d93d8679ce08-kube-api-access-7xdv7" (OuterVolumeSpecName: "kube-api-access-7xdv7") pod "65ef4bef-62b0-4592-94a2-d93d8679ce08" (UID: "65ef4bef-62b0-4592-94a2-d93d8679ce08"). InnerVolumeSpecName "kube-api-access-7xdv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.735739 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "65ef4bef-62b0-4592-94a2-d93d8679ce08" (UID: "65ef4bef-62b0-4592-94a2-d93d8679ce08"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.760736 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "65ef4bef-62b0-4592-94a2-d93d8679ce08" (UID: "65ef4bef-62b0-4592-94a2-d93d8679ce08"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.761641 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-inventory" (OuterVolumeSpecName: "inventory") pod "65ef4bef-62b0-4592-94a2-d93d8679ce08" (UID: "65ef4bef-62b0-4592-94a2-d93d8679ce08"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.773822 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "65ef4bef-62b0-4592-94a2-d93d8679ce08" (UID: "65ef4bef-62b0-4592-94a2-d93d8679ce08"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.834172 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.834200 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xdv7\" (UniqueName: \"kubernetes.io/projected/65ef4bef-62b0-4592-94a2-d93d8679ce08-kube-api-access-7xdv7\") on node \"crc\" DevicePath \"\"" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.834215 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.834224 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.834233 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:36:11 crc kubenswrapper[4743]: I1122 10:36:11.834242 4743 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65ef4bef-62b0-4592-94a2-d93d8679ce08-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.015208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" event={"ID":"65ef4bef-62b0-4592-94a2-d93d8679ce08","Type":"ContainerDied","Data":"ffb19095da1b233c7c96635427f298997386753a17df776c9bce17c834a61035"} Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.015245 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffb19095da1b233c7c96635427f298997386753a17df776c9bce17c834a61035" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.015283 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-vt6f7" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.106094 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-94fht"] Nov 22 10:36:12 crc kubenswrapper[4743]: E1122 10:36:12.106599 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65ef4bef-62b0-4592-94a2-d93d8679ce08" containerName="libvirt-openstack-openstack-cell1" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.106622 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ef4bef-62b0-4592-94a2-d93d8679ce08" containerName="libvirt-openstack-openstack-cell1" Nov 22 10:36:12 crc kubenswrapper[4743]: E1122 10:36:12.106656 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abf270f-d138-492a-8979-fd1ff1b77217" containerName="registry-server" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.106665 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abf270f-d138-492a-8979-fd1ff1b77217" containerName="registry-server" Nov 22 10:36:12 crc kubenswrapper[4743]: E1122 10:36:12.106679 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abf270f-d138-492a-8979-fd1ff1b77217" containerName="extract-utilities" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.106688 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abf270f-d138-492a-8979-fd1ff1b77217" containerName="extract-utilities" Nov 22 10:36:12 crc kubenswrapper[4743]: E1122 10:36:12.106717 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abf270f-d138-492a-8979-fd1ff1b77217" containerName="extract-content" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.106726 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abf270f-d138-492a-8979-fd1ff1b77217" containerName="extract-content" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.106976 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abf270f-d138-492a-8979-fd1ff1b77217" containerName="registry-server" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.106996 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ef4bef-62b0-4592-94a2-d93d8679ce08" containerName="libvirt-openstack-openstack-cell1" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.144303 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.147647 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.147720 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.147948 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.148072 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.148415 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.148443 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.150677 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.164102 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-94fht"] Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.347608 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.347962 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.348042 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.348071 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.348130 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: 
\"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.348188 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.348290 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.348418 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.348648 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.349566 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pssn\" (UniqueName: \"kubernetes.io/projected/c7eed5f0-702c-4714-ab82-9d23577c2a5f-kube-api-access-7pssn\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.349811 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451412 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451457 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-1\") pod 
\"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451485 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451505 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451527 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451570 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451637 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451666 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pssn\" (UniqueName: \"kubernetes.io/projected/c7eed5f0-702c-4714-ab82-9d23577c2a5f-kube-api-access-7pssn\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451687 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451726 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: 
\"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.451767 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.453019 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.454973 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.455256 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.455356 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.455647 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.456491 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.456501 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.456857 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.459484 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.460126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.468975 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pssn\" (UniqueName: \"kubernetes.io/projected/c7eed5f0-702c-4714-ab82-9d23577c2a5f-kube-api-access-7pssn\") pod \"nova-cell1-openstack-openstack-cell1-94fht\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.472011 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:36:12 crc kubenswrapper[4743]: I1122 10:36:12.986109 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-94fht"] Nov 22 10:36:13 crc kubenswrapper[4743]: I1122 10:36:13.032611 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" event={"ID":"c7eed5f0-702c-4714-ab82-9d23577c2a5f","Type":"ContainerStarted","Data":"01cf38ce6073996203cd56f3811ee12fc8fecd79de05ada9f0525609167eb085"} Nov 22 10:36:14 crc kubenswrapper[4743]: I1122 10:36:14.043511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" event={"ID":"c7eed5f0-702c-4714-ab82-9d23577c2a5f","Type":"ContainerStarted","Data":"e310b86b31915faae900264866d1a8ded342f6e03bf16fb8a2dc9d2be1e89789"} Nov 22 10:36:14 crc kubenswrapper[4743]: I1122 10:36:14.063870 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" podStartSLOduration=1.582548979 podStartE2EDuration="2.063839308s" podCreationTimestamp="2025-11-22 10:36:12 +0000 UTC" firstStartedPulling="2025-11-22 10:36:12.99255929 +0000 UTC m=+8046.698920342" lastFinishedPulling="2025-11-22 10:36:13.473849619 +0000 UTC m=+8047.180210671" observedRunningTime="2025-11-22 10:36:14.060650277 +0000 UTC m=+8047.767011369" watchObservedRunningTime="2025-11-22 10:36:14.063839308 +0000 UTC m=+8047.770200370" Nov 22 10:36:18 crc kubenswrapper[4743]: I1122 10:36:18.152514 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:36:18 crc kubenswrapper[4743]: E1122 10:36:18.154129 4743 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:36:31 crc kubenswrapper[4743]: I1122 10:36:31.151624 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:36:31 crc kubenswrapper[4743]: E1122 10:36:31.152485 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:36:45 crc kubenswrapper[4743]: I1122 10:36:45.152234 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:36:45 crc kubenswrapper[4743]: E1122 10:36:45.153065 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:36:57 crc kubenswrapper[4743]: I1122 10:36:57.162924 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:36:57 crc kubenswrapper[4743]: E1122 10:36:57.164008 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:37:09 crc kubenswrapper[4743]: I1122 10:37:09.152289 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:37:09 crc kubenswrapper[4743]: E1122 10:37:09.153044 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:37:20 crc kubenswrapper[4743]: I1122 10:37:20.151833 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:37:20 crc kubenswrapper[4743]: E1122 10:37:20.153043 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:37:32 crc kubenswrapper[4743]: I1122 10:37:32.151873 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:37:32 crc kubenswrapper[4743]: E1122 10:37:32.152693 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.234637 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bj567"] Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.240883 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.248321 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bj567"] Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.294319 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-catalog-content\") pod \"redhat-operators-bj567\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.294647 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-utilities\") pod \"redhat-operators-bj567\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.294731 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfl55\" (UniqueName: \"kubernetes.io/projected/84b5a514-d018-4402-9da0-a4e52ecaa2b8-kube-api-access-hfl55\") pod \"redhat-operators-bj567\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.396861 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-catalog-content\") pod \"redhat-operators-bj567\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.397012 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-utilities\") pod \"redhat-operators-bj567\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.397066 4743 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-hfl55\" (UniqueName: \"kubernetes.io/projected/84b5a514-d018-4402-9da0-a4e52ecaa2b8-kube-api-access-hfl55\") pod \"redhat-operators-bj567\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.397305 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-catalog-content\") pod \"redhat-operators-bj567\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.397448 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-utilities\") pod \"redhat-operators-bj567\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.434879 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfl55\" (UniqueName: \"kubernetes.io/projected/84b5a514-d018-4402-9da0-a4e52ecaa2b8-kube-api-access-hfl55\") pod \"redhat-operators-bj567\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.445200 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s9zv9"] Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.447479 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.454417 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s9zv9"] Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.499458 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfj49\" (UniqueName: \"kubernetes.io/projected/f805b630-bf28-40e6-8ee6-271eed0f1974-kube-api-access-bfj49\") pod \"community-operators-s9zv9\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.499611 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-catalog-content\") pod \"community-operators-s9zv9\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.499719 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-utilities\") pod \"community-operators-s9zv9\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.579876 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.602311 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-utilities\") pod \"community-operators-s9zv9\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.602405 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfj49\" (UniqueName: \"kubernetes.io/projected/f805b630-bf28-40e6-8ee6-271eed0f1974-kube-api-access-bfj49\") pod \"community-operators-s9zv9\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.602491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-catalog-content\") pod \"community-operators-s9zv9\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.603008 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-catalog-content\") pod \"community-operators-s9zv9\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.603226 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-utilities\") pod \"community-operators-s9zv9\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.623526 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfj49\" (UniqueName: \"kubernetes.io/projected/f805b630-bf28-40e6-8ee6-271eed0f1974-kube-api-access-bfj49\") pod \"community-operators-s9zv9\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:34 crc kubenswrapper[4743]: I1122 10:37:34.804911 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:35 crc kubenswrapper[4743]: I1122 10:37:35.102027 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bj567"] Nov 22 10:37:35 crc kubenswrapper[4743]: I1122 10:37:35.438335 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s9zv9"] Nov 22 10:37:35 crc kubenswrapper[4743]: W1122 10:37:35.454673 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf805b630_bf28_40e6_8ee6_271eed0f1974.slice/crio-78e044531da012c6b5c511da8a4641d8e66300d64e6a43157345f67fda5fb160 WatchSource:0}: Error finding container 78e044531da012c6b5c511da8a4641d8e66300d64e6a43157345f67fda5fb160: Status 404 returned error can't find the container with id 78e044531da012c6b5c511da8a4641d8e66300d64e6a43157345f67fda5fb160 Nov 22 10:37:35 crc kubenswrapper[4743]: I1122 10:37:35.867848 4743 generic.go:334] "Generic (PLEG): container finished" podID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerID="3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417" exitCode=0 Nov 22 10:37:35 crc kubenswrapper[4743]: I1122 10:37:35.867910 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zv9" event={"ID":"f805b630-bf28-40e6-8ee6-271eed0f1974","Type":"ContainerDied","Data":"3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417"} Nov 22 10:37:35 crc kubenswrapper[4743]: I1122 10:37:35.868172 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zv9" event={"ID":"f805b630-bf28-40e6-8ee6-271eed0f1974","Type":"ContainerStarted","Data":"78e044531da012c6b5c511da8a4641d8e66300d64e6a43157345f67fda5fb160"} Nov 22 10:37:35 crc kubenswrapper[4743]: I1122 10:37:35.869804 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 10:37:35 crc kubenswrapper[4743]: I1122 10:37:35.870043 4743 generic.go:334] "Generic (PLEG): container finished" podID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerID="b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483" exitCode=0 Nov 22 10:37:35 crc kubenswrapper[4743]: I1122 10:37:35.870090 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj567" event={"ID":"84b5a514-d018-4402-9da0-a4e52ecaa2b8","Type":"ContainerDied","Data":"b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483"} Nov 22 10:37:35 crc kubenswrapper[4743]: I1122 10:37:35.870125 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj567" event={"ID":"84b5a514-d018-4402-9da0-a4e52ecaa2b8","Type":"ContainerStarted","Data":"5a5f4a7a157c6a6df5d9589501672baf81aaa6ecd58bf8999eb6c61c3ea2035d"} Nov 22 10:37:36 crc kubenswrapper[4743]: I1122 10:37:36.882330 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj567" event={"ID":"84b5a514-d018-4402-9da0-a4e52ecaa2b8","Type":"ContainerStarted","Data":"5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c"} Nov 22 10:37:36 crc kubenswrapper[4743]: I1122 10:37:36.889636 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zv9" 
event={"ID":"f805b630-bf28-40e6-8ee6-271eed0f1974","Type":"ContainerStarted","Data":"e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f"} Nov 22 10:37:38 crc kubenswrapper[4743]: I1122 10:37:38.913938 4743 generic.go:334] "Generic (PLEG): container finished" podID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerID="e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f" exitCode=0 Nov 22 10:37:38 crc kubenswrapper[4743]: I1122 10:37:38.913991 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zv9" event={"ID":"f805b630-bf28-40e6-8ee6-271eed0f1974","Type":"ContainerDied","Data":"e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f"} Nov 22 10:37:39 crc kubenswrapper[4743]: I1122 10:37:39.928143 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zv9" event={"ID":"f805b630-bf28-40e6-8ee6-271eed0f1974","Type":"ContainerStarted","Data":"66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b"} Nov 22 10:37:39 crc kubenswrapper[4743]: I1122 10:37:39.954015 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s9zv9" podStartSLOduration=2.487340584 podStartE2EDuration="5.953352938s" podCreationTimestamp="2025-11-22 10:37:34 +0000 UTC" firstStartedPulling="2025-11-22 10:37:35.869596 +0000 UTC m=+8129.575957052" lastFinishedPulling="2025-11-22 10:37:39.335608354 +0000 UTC m=+8133.041969406" observedRunningTime="2025-11-22 10:37:39.946835871 +0000 UTC m=+8133.653196913" watchObservedRunningTime="2025-11-22 10:37:39.953352938 +0000 UTC m=+8133.659713990" Nov 22 10:37:41 crc kubenswrapper[4743]: I1122 10:37:41.955100 4743 generic.go:334] "Generic (PLEG): container finished" podID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerID="5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c" exitCode=0 Nov 22 10:37:41 crc kubenswrapper[4743]: I1122 10:37:41.955287 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj567" event={"ID":"84b5a514-d018-4402-9da0-a4e52ecaa2b8","Type":"ContainerDied","Data":"5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c"} Nov 22 10:37:43 crc kubenswrapper[4743]: I1122 10:37:43.004493 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj567" event={"ID":"84b5a514-d018-4402-9da0-a4e52ecaa2b8","Type":"ContainerStarted","Data":"7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add"} Nov 22 10:37:43 crc kubenswrapper[4743]: I1122 10:37:43.030052 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bj567" podStartSLOduration=2.507883284 podStartE2EDuration="9.03003311s" podCreationTimestamp="2025-11-22 10:37:34 +0000 UTC" firstStartedPulling="2025-11-22 10:37:35.871534855 +0000 UTC m=+8129.577895907" lastFinishedPulling="2025-11-22 10:37:42.393684691 +0000 UTC m=+8136.100045733" observedRunningTime="2025-11-22 10:37:43.02517012 +0000 UTC m=+8136.731531242" watchObservedRunningTime="2025-11-22 10:37:43.03003311 +0000 UTC m=+8136.736394172" Nov 22 10:37:44 crc kubenswrapper[4743]: I1122 10:37:44.580050 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:44 crc kubenswrapper[4743]: I1122 10:37:44.580439 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:44 crc kubenswrapper[4743]: I1122 10:37:44.805630 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:44 crc kubenswrapper[4743]: I1122 10:37:44.805689 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:44 crc kubenswrapper[4743]: I1122 10:37:44.854426 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:45 crc kubenswrapper[4743]: I1122 10:37:45.067838 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:45 crc kubenswrapper[4743]: I1122 10:37:45.635502 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bj567" podUID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerName="registry-server" probeResult="failure" output=< Nov 22 10:37:45 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 10:37:45 crc kubenswrapper[4743]: > Nov 22 10:37:46 crc kubenswrapper[4743]: I1122 10:37:46.036785 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s9zv9"] Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.040745 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s9zv9" podUID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerName="registry-server" containerID="cri-o://66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b" gracePeriod=2 Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.163396 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:37:47 crc kubenswrapper[4743]: E1122 10:37:47.163945 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.524336 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.630785 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfj49\" (UniqueName: \"kubernetes.io/projected/f805b630-bf28-40e6-8ee6-271eed0f1974-kube-api-access-bfj49\") pod \"f805b630-bf28-40e6-8ee6-271eed0f1974\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.631128 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-utilities\") pod \"f805b630-bf28-40e6-8ee6-271eed0f1974\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.631183 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-catalog-content\") pod \"f805b630-bf28-40e6-8ee6-271eed0f1974\" (UID: \"f805b630-bf28-40e6-8ee6-271eed0f1974\") " Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.631935 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-utilities" (OuterVolumeSpecName: "utilities") pod "f805b630-bf28-40e6-8ee6-271eed0f1974" (UID: "f805b630-bf28-40e6-8ee6-271eed0f1974"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.638872 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f805b630-bf28-40e6-8ee6-271eed0f1974-kube-api-access-bfj49" (OuterVolumeSpecName: "kube-api-access-bfj49") pod "f805b630-bf28-40e6-8ee6-271eed0f1974" (UID: "f805b630-bf28-40e6-8ee6-271eed0f1974"). InnerVolumeSpecName "kube-api-access-bfj49". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.687702 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f805b630-bf28-40e6-8ee6-271eed0f1974" (UID: "f805b630-bf28-40e6-8ee6-271eed0f1974"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.733243 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.733288 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f805b630-bf28-40e6-8ee6-271eed0f1974-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:37:47 crc kubenswrapper[4743]: I1122 10:37:47.733304 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfj49\" (UniqueName: \"kubernetes.io/projected/f805b630-bf28-40e6-8ee6-271eed0f1974-kube-api-access-bfj49\") on node \"crc\" DevicePath \"\"" Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.056003 4743 generic.go:334] "Generic (PLEG): container finished" podID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerID="66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b" exitCode=0 Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.056100 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zv9" event={"ID":"f805b630-bf28-40e6-8ee6-271eed0f1974","Type":"ContainerDied","Data":"66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b"} Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.056161 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s9zv9" Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.056194 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9zv9" event={"ID":"f805b630-bf28-40e6-8ee6-271eed0f1974","Type":"ContainerDied","Data":"78e044531da012c6b5c511da8a4641d8e66300d64e6a43157345f67fda5fb160"} Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.056247 4743 scope.go:117] "RemoveContainer" containerID="66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b" Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.094197 4743 scope.go:117] "RemoveContainer" containerID="e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f" Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.115767 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s9zv9"] Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.129960 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s9zv9"] Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.143447 4743 scope.go:117] "RemoveContainer" containerID="3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417" Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.188161 4743 scope.go:117] "RemoveContainer" containerID="66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b" Nov 22 10:37:48 crc kubenswrapper[4743]: E1122 10:37:48.189093 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b\": container with ID starting with 66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b not found: ID does not exist" containerID="66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b" Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.189138 
4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b"} err="failed to get container status \"66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b\": rpc error: code = NotFound desc = could not find container \"66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b\": container with ID starting with 66eb35d7b866ad97ba026596e575b6ed05f39ed68519e37af2adaf034341042b not found: ID does not exist" Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.189164 4743 scope.go:117] "RemoveContainer" containerID="e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f" Nov 22 10:37:48 crc kubenswrapper[4743]: E1122 10:37:48.190081 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f\": container with ID starting with e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f not found: ID does not exist" containerID="e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f" Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.190110 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f"} err="failed to get container status \"e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f\": rpc error: code = NotFound desc = could not find container \"e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f\": container with ID starting with e68a0dab557d0eabd4d03186289de86cacf92085f532b7292d36fcfaf197b26f not found: ID does not exist" Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.190125 4743 scope.go:117] "RemoveContainer" containerID="3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417" Nov 22 10:37:48 crc kubenswrapper[4743]: E1122 10:37:48.190465 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417\": container with ID starting with 3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417 not found: ID does not exist" containerID="3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417" Nov 22 10:37:48 crc kubenswrapper[4743]: I1122 10:37:48.190487 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417"} err="failed to get container status \"3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417\": rpc error: code = NotFound desc = could not find container \"3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417\": container with ID starting with 3875bc03a55ea6d7a4aaea01031d79c3a3985e12800b4fbc9c8fddf2f7e44417 not found: ID does not exist" Nov 22 10:37:49 crc kubenswrapper[4743]: I1122 10:37:49.163681 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f805b630-bf28-40e6-8ee6-271eed0f1974" path="/var/lib/kubelet/pods/f805b630-bf28-40e6-8ee6-271eed0f1974/volumes" Nov 22 10:37:54 crc kubenswrapper[4743]: I1122 10:37:54.631501 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:54 crc kubenswrapper[4743]: I1122 10:37:54.688947 4743 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:54 crc kubenswrapper[4743]: I1122 10:37:54.865610 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bj567"] Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.140720 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bj567" podUID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerName="registry-server" containerID="cri-o://7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add" gracePeriod=2 Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.733097 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.826767 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfl55\" (UniqueName: \"kubernetes.io/projected/84b5a514-d018-4402-9da0-a4e52ecaa2b8-kube-api-access-hfl55\") pod \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.827214 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-utilities\") pod \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.827345 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-catalog-content\") pod \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\" (UID: \"84b5a514-d018-4402-9da0-a4e52ecaa2b8\") " Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.827971 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-utilities" (OuterVolumeSpecName: "utilities") pod "84b5a514-d018-4402-9da0-a4e52ecaa2b8" (UID: "84b5a514-d018-4402-9da0-a4e52ecaa2b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.834520 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b5a514-d018-4402-9da0-a4e52ecaa2b8-kube-api-access-hfl55" (OuterVolumeSpecName: "kube-api-access-hfl55") pod "84b5a514-d018-4402-9da0-a4e52ecaa2b8" (UID: "84b5a514-d018-4402-9da0-a4e52ecaa2b8"). InnerVolumeSpecName "kube-api-access-hfl55". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.921034 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "84b5a514-d018-4402-9da0-a4e52ecaa2b8" (UID: "84b5a514-d018-4402-9da0-a4e52ecaa2b8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.929984 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfl55\" (UniqueName: \"kubernetes.io/projected/84b5a514-d018-4402-9da0-a4e52ecaa2b8-kube-api-access-hfl55\") on node \"crc\" DevicePath \"\"" Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.930065 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:37:56 crc kubenswrapper[4743]: I1122 10:37:56.930102 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/84b5a514-d018-4402-9da0-a4e52ecaa2b8-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.157168 4743 generic.go:334] "Generic (PLEG): container finished" podID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerID="7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add" exitCode=0 Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.160141 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bj567" Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.166778 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj567" event={"ID":"84b5a514-d018-4402-9da0-a4e52ecaa2b8","Type":"ContainerDied","Data":"7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add"} Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.166965 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bj567" event={"ID":"84b5a514-d018-4402-9da0-a4e52ecaa2b8","Type":"ContainerDied","Data":"5a5f4a7a157c6a6df5d9589501672baf81aaa6ecd58bf8999eb6c61c3ea2035d"} Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.167043 4743 scope.go:117] "RemoveContainer" containerID="7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add" Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.190540 4743 scope.go:117] "RemoveContainer" containerID="5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c" Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.205056 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bj567"] Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.216538 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bj567"] Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.219985 4743 scope.go:117] "RemoveContainer" containerID="b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483" Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.264387 4743 scope.go:117] "RemoveContainer" containerID="7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add" Nov 22 10:37:57 crc kubenswrapper[4743]: E1122 10:37:57.265048 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add\": container with ID starting with 7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add not found: ID does not exist" containerID="7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add" Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.265096 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add"} err="failed to get container status \"7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add\": rpc error: code = NotFound desc = could not find container \"7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add\": container with ID starting with 7bd89990a04c08f33eef12be1c7c78eb6c8bd08f9facdcf381794d9158a01add not found: ID does not exist" Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.265118 4743 scope.go:117] "RemoveContainer" containerID="5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c" Nov 22 10:37:57 crc kubenswrapper[4743]: E1122 10:37:57.265391 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c\": container with ID starting with 5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c not found: ID does not exist" containerID="5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c" Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.265410 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c"} err="failed to get container status \"5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c\": rpc error: code = NotFound desc = could not find container \"5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c\": container with ID starting with 5a93a27afc38e8835847d91fb3735cd389e5a351ba871712b8c112fd01657a5c not found: ID does not exist" Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.265422 4743 scope.go:117] "RemoveContainer" containerID="b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483" Nov 22 10:37:57 crc kubenswrapper[4743]: E1122 10:37:57.267714 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483\": container with ID starting with b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483 not found: ID does not exist" containerID="b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483" Nov 22 10:37:57 crc kubenswrapper[4743]: I1122 10:37:57.267747 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483"} err="failed to get container status \"b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483\": rpc error: code = NotFound desc = could not find container \"b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483\": container with ID starting with b751fdd7d49f7131e2f4ecbae85e08ed3f00b4703e376d7baa3fb44fc8e1c483 not found: ID does not exist" Nov 22 10:37:59 crc kubenswrapper[4743]: I1122 10:37:59.176894 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" path="/var/lib/kubelet/pods/84b5a514-d018-4402-9da0-a4e52ecaa2b8/volumes" Nov 22 10:38:00 crc kubenswrapper[4743]: I1122 10:38:00.151782 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:38:00 crc kubenswrapper[4743]: E1122 10:38:00.152471 4743 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:38:14 crc kubenswrapper[4743]: I1122 10:38:14.152273 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:38:15 crc kubenswrapper[4743]: I1122 10:38:15.356540 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"45c9e51af829850913e5c98775c99bef3ada6b54f1df49a98aaefd6ef00ac1bf"} Nov 22 10:39:22 crc kubenswrapper[4743]: I1122 10:39:22.232883 4743 generic.go:334] "Generic (PLEG): container finished" podID="c7eed5f0-702c-4714-ab82-9d23577c2a5f" containerID="e310b86b31915faae900264866d1a8ded342f6e03bf16fb8a2dc9d2be1e89789" exitCode=0 Nov 22 10:39:22 crc kubenswrapper[4743]: I1122 10:39:22.233009 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" event={"ID":"c7eed5f0-702c-4714-ab82-9d23577c2a5f","Type":"ContainerDied","Data":"e310b86b31915faae900264866d1a8ded342f6e03bf16fb8a2dc9d2be1e89789"} Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.696703 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.857470 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-1\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.857540 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ssh-key\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.864662 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-1\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.864723 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-inventory\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.864764 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ceph\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.864807 4743 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-0\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.864917 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-1\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.865000 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-0\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.865024 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-combined-ca-bundle\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.865093 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-0\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.865159 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pssn\" (UniqueName: \"kubernetes.io/projected/c7eed5f0-702c-4714-ab82-9d23577c2a5f-kube-api-access-7pssn\") pod \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\" (UID: \"c7eed5f0-702c-4714-ab82-9d23577c2a5f\") " Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.878986 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7eed5f0-702c-4714-ab82-9d23577c2a5f-kube-api-access-7pssn" (OuterVolumeSpecName: "kube-api-access-7pssn") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "kube-api-access-7pssn". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.896889 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ceph" (OuterVolumeSpecName: "ceph") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.897419 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.917079 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.921475 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.943321 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.964780 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-inventory" (OuterVolumeSpecName: "inventory") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.972557 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.980355 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.980601 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.980726 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pssn\" (UniqueName: \"kubernetes.io/projected/c7eed5f0-702c-4714-ab82-9d23577c2a5f-kube-api-access-7pssn\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.980808 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.980881 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.980953 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.981024 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.981095 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.988301 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 10:39:23 crc kubenswrapper[4743]: I1122 10:39:23.988815 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.018994 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c7eed5f0-702c-4714-ab82-9d23577c2a5f" (UID: "c7eed5f0-702c-4714-ab82-9d23577c2a5f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.084049 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.084087 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.084097 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c7eed5f0-702c-4714-ab82-9d23577c2a5f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.253384 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" event={"ID":"c7eed5f0-702c-4714-ab82-9d23577c2a5f","Type":"ContainerDied","Data":"01cf38ce6073996203cd56f3811ee12fc8fecd79de05ada9f0525609167eb085"} Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.253424 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01cf38ce6073996203cd56f3811ee12fc8fecd79de05ada9f0525609167eb085" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.253457 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-94fht" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.352340 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-bcrp6"] Nov 22 10:39:24 crc kubenswrapper[4743]: E1122 10:39:24.352797 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerName="registry-server" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.352814 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerName="registry-server" Nov 22 10:39:24 crc kubenswrapper[4743]: E1122 10:39:24.352828 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7eed5f0-702c-4714-ab82-9d23577c2a5f" containerName="nova-cell1-openstack-openstack-cell1" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.352835 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7eed5f0-702c-4714-ab82-9d23577c2a5f" containerName="nova-cell1-openstack-openstack-cell1" Nov 22 10:39:24 crc kubenswrapper[4743]: E1122 10:39:24.352844 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerName="extract-utilities" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.352851 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerName="extract-utilities" Nov 22 10:39:24 crc kubenswrapper[4743]: E1122 10:39:24.352861 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerName="registry-server" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.352867 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerName="registry-server" Nov 22 10:39:24 crc kubenswrapper[4743]: E1122 10:39:24.352887 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerName="extract-content" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.352893 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerName="extract-content" Nov 22 10:39:24 crc kubenswrapper[4743]: E1122 10:39:24.352907 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerName="extract-utilities" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.352922 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerName="extract-utilities" Nov 22 10:39:24 crc kubenswrapper[4743]: E1122 10:39:24.352945 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerName="extract-content" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.352951 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerName="extract-content" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.353135 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7eed5f0-702c-4714-ab82-9d23577c2a5f" containerName="nova-cell1-openstack-openstack-cell1" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.353164 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="84b5a514-d018-4402-9da0-a4e52ecaa2b8" containerName="registry-server" Nov 22 10:39:24 crc kubenswrapper[4743]: 
I1122 10:39:24.353182 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="f805b630-bf28-40e6-8ee6-271eed0f1974" containerName="registry-server" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.354942 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.357104 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.357305 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.358783 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.359073 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.359222 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.367830 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-bcrp6"] Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.492190 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ssh-key\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.492234 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceph\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.492284 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-inventory\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.492386 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvfgt\" (UniqueName: \"kubernetes.io/projected/7e221c36-eb02-4ce0-8eda-c568c7adf15c-kube-api-access-vvfgt\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.492457 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " 
pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.492528 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.492674 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.492725 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.595124 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-inventory\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.595182 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvfgt\" (UniqueName: \"kubernetes.io/projected/7e221c36-eb02-4ce0-8eda-c568c7adf15c-kube-api-access-vvfgt\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.595250 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.595343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.595475 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: 
\"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.595562 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.595691 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ssh-key\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.595742 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceph\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.599359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.599711 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ssh-key\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.601314 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-inventory\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.602070 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.603770 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.604126 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceph\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.607482 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.615041 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvfgt\" (UniqueName: \"kubernetes.io/projected/7e221c36-eb02-4ce0-8eda-c568c7adf15c-kube-api-access-vvfgt\") pod \"telemetry-openstack-openstack-cell1-bcrp6\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:24 crc kubenswrapper[4743]: I1122 10:39:24.679504 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:39:25 crc kubenswrapper[4743]: I1122 10:39:25.224048 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-bcrp6"] Nov 22 10:39:25 crc kubenswrapper[4743]: W1122 10:39:25.240719 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e221c36_eb02_4ce0_8eda_c568c7adf15c.slice/crio-1e952c9e709e16b00d6e8781e274a8cd992d9943e96348287543da5d90d99262 WatchSource:0}: Error finding container 1e952c9e709e16b00d6e8781e274a8cd992d9943e96348287543da5d90d99262: Status 404 returned error can't find the container with id 1e952c9e709e16b00d6e8781e274a8cd992d9943e96348287543da5d90d99262 Nov 22 10:39:25 crc kubenswrapper[4743]: I1122 10:39:25.262921 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" event={"ID":"7e221c36-eb02-4ce0-8eda-c568c7adf15c","Type":"ContainerStarted","Data":"1e952c9e709e16b00d6e8781e274a8cd992d9943e96348287543da5d90d99262"} Nov 22 10:39:26 crc kubenswrapper[4743]: I1122 10:39:26.272694 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" event={"ID":"7e221c36-eb02-4ce0-8eda-c568c7adf15c","Type":"ContainerStarted","Data":"a66c4395e91a1614d1e7e7a7aab18437f6b2317e8efd1abbe89ffa11b6b8e26d"} Nov 22 10:39:26 crc kubenswrapper[4743]: I1122 10:39:26.296174 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" podStartSLOduration=1.805990568 podStartE2EDuration="2.296156513s" podCreationTimestamp="2025-11-22 10:39:24 +0000 UTC" firstStartedPulling="2025-11-22 10:39:25.242516379 +0000 UTC m=+8238.948877431" lastFinishedPulling="2025-11-22 10:39:25.732682314 +0000 UTC m=+8239.439043376" observedRunningTime="2025-11-22 10:39:26.288302817 +0000 UTC m=+8239.994663869" watchObservedRunningTime="2025-11-22 10:39:26.296156513 +0000 UTC m=+8240.002517565" Nov 22 10:40:31 crc kubenswrapper[4743]: I1122 10:40:31.241062 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:40:31 crc kubenswrapper[4743]: I1122 10:40:31.241896 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:41:01 crc kubenswrapper[4743]: I1122 10:41:01.240752 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:41:01 crc kubenswrapper[4743]: I1122 10:41:01.241365 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:41:31 crc kubenswrapper[4743]: I1122 10:41:31.241170 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:41:31 crc kubenswrapper[4743]: I1122 10:41:31.241833 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:41:31 crc kubenswrapper[4743]: I1122 10:41:31.241882 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 10:41:31 crc kubenswrapper[4743]: I1122 10:41:31.242895 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"45c9e51af829850913e5c98775c99bef3ada6b54f1df49a98aaefd6ef00ac1bf"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:41:31 crc kubenswrapper[4743]: I1122 10:41:31.242952 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://45c9e51af829850913e5c98775c99bef3ada6b54f1df49a98aaefd6ef00ac1bf" gracePeriod=600 Nov 22 10:41:31 crc kubenswrapper[4743]: I1122 10:41:31.701287 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="45c9e51af829850913e5c98775c99bef3ada6b54f1df49a98aaefd6ef00ac1bf" exitCode=0 Nov 22 10:41:31 crc kubenswrapper[4743]: I1122 10:41:31.701372 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" 
event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"45c9e51af829850913e5c98775c99bef3ada6b54f1df49a98aaefd6ef00ac1bf"} Nov 22 10:41:31 crc kubenswrapper[4743]: I1122 10:41:31.701908 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"} Nov 22 10:41:31 crc kubenswrapper[4743]: I1122 10:41:31.701931 4743 scope.go:117] "RemoveContainer" containerID="0587304085e748dbfd17fa47a7caaa1e82169bcb96eaeb09913f17b289944b0a" Nov 22 10:43:05 crc kubenswrapper[4743]: I1122 10:43:05.631912 4743 generic.go:334] "Generic (PLEG): container finished" podID="7e221c36-eb02-4ce0-8eda-c568c7adf15c" containerID="a66c4395e91a1614d1e7e7a7aab18437f6b2317e8efd1abbe89ffa11b6b8e26d" exitCode=0 Nov 22 10:43:05 crc kubenswrapper[4743]: I1122 10:43:05.632011 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" event={"ID":"7e221c36-eb02-4ce0-8eda-c568c7adf15c","Type":"ContainerDied","Data":"a66c4395e91a1614d1e7e7a7aab18437f6b2317e8efd1abbe89ffa11b6b8e26d"} Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.093456 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.244832 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-2\") pod \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.245067 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceph\") pod \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.245216 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ssh-key\") pod \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.245465 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvfgt\" (UniqueName: \"kubernetes.io/projected/7e221c36-eb02-4ce0-8eda-c568c7adf15c-kube-api-access-vvfgt\") pod \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.245636 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-0\") pod \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.245688 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-1\") 
pod \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.245775 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-inventory\") pod \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.245842 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-telemetry-combined-ca-bundle\") pod \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\" (UID: \"7e221c36-eb02-4ce0-8eda-c568c7adf15c\") " Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.253608 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e221c36-eb02-4ce0-8eda-c568c7adf15c-kube-api-access-vvfgt" (OuterVolumeSpecName: "kube-api-access-vvfgt") pod "7e221c36-eb02-4ce0-8eda-c568c7adf15c" (UID: "7e221c36-eb02-4ce0-8eda-c568c7adf15c"). InnerVolumeSpecName "kube-api-access-vvfgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.253826 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceph" (OuterVolumeSpecName: "ceph") pod "7e221c36-eb02-4ce0-8eda-c568c7adf15c" (UID: "7e221c36-eb02-4ce0-8eda-c568c7adf15c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.254968 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7e221c36-eb02-4ce0-8eda-c568c7adf15c" (UID: "7e221c36-eb02-4ce0-8eda-c568c7adf15c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.279954 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "7e221c36-eb02-4ce0-8eda-c568c7adf15c" (UID: "7e221c36-eb02-4ce0-8eda-c568c7adf15c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.280662 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-inventory" (OuterVolumeSpecName: "inventory") pod "7e221c36-eb02-4ce0-8eda-c568c7adf15c" (UID: "7e221c36-eb02-4ce0-8eda-c568c7adf15c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.281741 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "7e221c36-eb02-4ce0-8eda-c568c7adf15c" (UID: "7e221c36-eb02-4ce0-8eda-c568c7adf15c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.282809 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7e221c36-eb02-4ce0-8eda-c568c7adf15c" (UID: "7e221c36-eb02-4ce0-8eda-c568c7adf15c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.285863 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "7e221c36-eb02-4ce0-8eda-c568c7adf15c" (UID: "7e221c36-eb02-4ce0-8eda-c568c7adf15c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.348636 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.348936 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.349044 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.349165 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvfgt\" (UniqueName: \"kubernetes.io/projected/7e221c36-eb02-4ce0-8eda-c568c7adf15c-kube-api-access-vvfgt\") on node \"crc\" DevicePath \"\"" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.349220 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.349270 4743 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.349378 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.349558 4743 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e221c36-eb02-4ce0-8eda-c568c7adf15c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.653204 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-bcrp6" event={"ID":"7e221c36-eb02-4ce0-8eda-c568c7adf15c","Type":"ContainerDied","Data":"1e952c9e709e16b00d6e8781e274a8cd992d9943e96348287543da5d90d99262"} Nov 22 10:43:07 crc kubenswrapper[4743]: 
I1122 10:43:07.653252 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e952c9e709e16b00d6e8781e274a8cd992d9943e96348287543da5d90d99262"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.653276 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-bcrp6"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.745393 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"]
Nov 22 10:43:07 crc kubenswrapper[4743]: E1122 10:43:07.746095 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e221c36-eb02-4ce0-8eda-c568c7adf15c" containerName="telemetry-openstack-openstack-cell1"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.746197 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e221c36-eb02-4ce0-8eda-c568c7adf15c" containerName="telemetry-openstack-openstack-cell1"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.746464 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e221c36-eb02-4ce0-8eda-c568c7adf15c" containerName="telemetry-openstack-openstack-cell1"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.747281 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.749759 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.759610 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.759788 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.760280 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.761893 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.776268 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"]
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.863610 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.863864 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7q4l\" (UniqueName: \"kubernetes.io/projected/4cac0424-ba03-4f34-8433-acbbdcbaeb73-kube-api-access-n7q4l\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.864009 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.864294 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.864360 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.864485 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.966911 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.966960 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7q4l\" (UniqueName: \"kubernetes.io/projected/4cac0424-ba03-4f34-8433-acbbdcbaeb73-kube-api-access-n7q4l\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.967009 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.967045 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.967062 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.967098 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.977196 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.977296 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.977633 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.977931 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.981169 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:07 crc kubenswrapper[4743]: I1122 10:43:07.982846 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7q4l\" (UniqueName: \"kubernetes.io/projected/4cac0424-ba03-4f34-8433-acbbdcbaeb73-kube-api-access-n7q4l\") pod \"neutron-sriov-openstack-openstack-cell1-tzsvk\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:08 crc kubenswrapper[4743]: I1122 10:43:08.068346 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:43:08 crc kubenswrapper[4743]: I1122 10:43:08.650335 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"]
Nov 22 10:43:08 crc kubenswrapper[4743]: I1122 10:43:08.675827 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 10:43:09 crc kubenswrapper[4743]: I1122 10:43:09.703400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk" event={"ID":"4cac0424-ba03-4f34-8433-acbbdcbaeb73","Type":"ContainerStarted","Data":"6ba878a01313898f1050ef46afc3d8769816943d207130460be521dd7a9c1e82"}
Nov 22 10:43:10 crc kubenswrapper[4743]: I1122 10:43:10.715025 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk" event={"ID":"4cac0424-ba03-4f34-8433-acbbdcbaeb73","Type":"ContainerStarted","Data":"86f603f31806197f5119f54e26aba693dbad912c798f0930d51f7101326d230e"}
Nov 22 10:43:10 crc kubenswrapper[4743]: I1122 10:43:10.769082 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk" podStartSLOduration=3.089664633 podStartE2EDuration="3.769059508s" podCreationTimestamp="2025-11-22 10:43:07 +0000 UTC" firstStartedPulling="2025-11-22 10:43:08.675644019 +0000 UTC m=+8462.382005071" lastFinishedPulling="2025-11-22 10:43:09.355038894 +0000 UTC m=+8463.061399946" observedRunningTime="2025-11-22 10:43:10.740973232 +0000 UTC m=+8464.447334284" watchObservedRunningTime="2025-11-22 10:43:10.769059508 +0000 UTC m=+8464.475420560"
Nov 22 10:43:31 crc kubenswrapper[4743]: I1122 10:43:31.241960 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:43:31 crc kubenswrapper[4743]: I1122 10:43:31.242573 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:44:01 crc kubenswrapper[4743]: I1122 10:44:01.241553 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:44:01 crc kubenswrapper[4743]: I1122 10:44:01.242713 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:44:31 crc kubenswrapper[4743]: I1122 10:44:31.241539 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 10:44:31 crc kubenswrapper[4743]: I1122 10:44:31.242189 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 10:44:31 crc kubenswrapper[4743]: I1122 10:44:31.242230 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p"
Nov 22 10:44:31 crc kubenswrapper[4743]: I1122 10:44:31.243075 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 10:44:31 crc kubenswrapper[4743]: I1122 10:44:31.243123 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0" gracePeriod=600
Nov 22 10:44:31 crc kubenswrapper[4743]: E1122 10:44:31.375696 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:44:31 crc kubenswrapper[4743]: I1122 10:44:31.509156 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0" exitCode=0
Nov 22 10:44:31 crc kubenswrapper[4743]: I1122 10:44:31.509206 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"}
Nov 22 10:44:31 crc kubenswrapper[4743]: I1122 10:44:31.509248 4743 scope.go:117] "RemoveContainer" containerID="45c9e51af829850913e5c98775c99bef3ada6b54f1df49a98aaefd6ef00ac1bf"
Nov 22 10:44:31 crc kubenswrapper[4743]: I1122 10:44:31.509983 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:44:31 crc kubenswrapper[4743]: E1122 10:44:31.510992 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:44:46 crc kubenswrapper[4743]: I1122 10:44:46.152270 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:44:46 crc kubenswrapper[4743]: E1122 10:44:46.153073 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:44:57 crc kubenswrapper[4743]: I1122 10:44:57.163064 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:44:57 crc kubenswrapper[4743]: E1122 10:44:57.163918 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.153093 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"]
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.159331 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.161699 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.161826 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.167036 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"]
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.209499 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-secret-volume\") pod \"collect-profiles-29396805-ggps5\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.210124 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4s8\" (UniqueName: \"kubernetes.io/projected/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-kube-api-access-vd4s8\") pod \"collect-profiles-29396805-ggps5\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.210243 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-config-volume\") pod \"collect-profiles-29396805-ggps5\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.312194 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4s8\" (UniqueName: \"kubernetes.io/projected/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-kube-api-access-vd4s8\") pod \"collect-profiles-29396805-ggps5\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.312283 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-config-volume\") pod \"collect-profiles-29396805-ggps5\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.312357 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-secret-volume\") pod \"collect-profiles-29396805-ggps5\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.313222 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-config-volume\") pod \"collect-profiles-29396805-ggps5\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.327535 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-secret-volume\") pod \"collect-profiles-29396805-ggps5\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.333936 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4s8\" (UniqueName: \"kubernetes.io/projected/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-kube-api-access-vd4s8\") pod \"collect-profiles-29396805-ggps5\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.489472 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:00 crc kubenswrapper[4743]: I1122 10:45:00.945631 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"]
Nov 22 10:45:01 crc kubenswrapper[4743]: I1122 10:45:01.794959 4743 generic.go:334] "Generic (PLEG): container finished" podID="e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1" containerID="3c1b0e8c37b13b4731b05c120ab2626fe929157305f70a0c40a541da56111240" exitCode=0
Nov 22 10:45:01 crc kubenswrapper[4743]: I1122 10:45:01.795014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5" event={"ID":"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1","Type":"ContainerDied","Data":"3c1b0e8c37b13b4731b05c120ab2626fe929157305f70a0c40a541da56111240"}
Nov 22 10:45:01 crc kubenswrapper[4743]: I1122 10:45:01.795260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5" event={"ID":"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1","Type":"ContainerStarted","Data":"7825c6293ee0f4e8b92b3c8d3e81d373674e08d7aa3536ea1929b8cc46112c35"}
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.178930 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.375689 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4s8\" (UniqueName: \"kubernetes.io/projected/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-kube-api-access-vd4s8\") pod \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") "
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.376140 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-config-volume\") pod \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") "
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.376239 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-secret-volume\") pod \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\" (UID: \"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1\") "
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.376833 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1" (UID: "e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.383210 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1" (UID: "e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.383264 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-kube-api-access-vd4s8" (OuterVolumeSpecName: "kube-api-access-vd4s8") pod "e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1" (UID: "e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1"). InnerVolumeSpecName "kube-api-access-vd4s8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.478882 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4s8\" (UniqueName: \"kubernetes.io/projected/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-kube-api-access-vd4s8\") on node \"crc\" DevicePath \"\""
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.478925 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-config-volume\") on node \"crc\" DevicePath \"\""
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.478934 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.813871 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5" event={"ID":"e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1","Type":"ContainerDied","Data":"7825c6293ee0f4e8b92b3c8d3e81d373674e08d7aa3536ea1929b8cc46112c35"}
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.813908 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7825c6293ee0f4e8b92b3c8d3e81d373674e08d7aa3536ea1929b8cc46112c35"
Nov 22 10:45:03 crc kubenswrapper[4743]: I1122 10:45:03.813985 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396805-ggps5"
Nov 22 10:45:04 crc kubenswrapper[4743]: I1122 10:45:04.255519 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t"]
Nov 22 10:45:04 crc kubenswrapper[4743]: I1122 10:45:04.264754 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396760-j8p6t"]
Nov 22 10:45:05 crc kubenswrapper[4743]: I1122 10:45:05.173730 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f767578d-897b-492f-afa7-e61b6134690d" path="/var/lib/kubelet/pods/f767578d-897b-492f-afa7-e61b6134690d/volumes"
Nov 22 10:45:08 crc kubenswrapper[4743]: I1122 10:45:08.152706 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:45:08 crc kubenswrapper[4743]: E1122 10:45:08.154274 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:45:21 crc kubenswrapper[4743]: I1122 10:45:21.152096 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:45:21 crc kubenswrapper[4743]: E1122 10:45:21.156907 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:45:36 crc kubenswrapper[4743]: I1122 10:45:36.152323 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:45:36 crc kubenswrapper[4743]: E1122 10:45:36.153129 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:45:38 crc kubenswrapper[4743]: I1122 10:45:38.553890 4743 scope.go:117] "RemoveContainer" containerID="af41ec3c5c6d477bd4ccb2d43edda1d8a6b077978a23a6398f02bb0ebacbfb34"
Nov 22 10:45:47 crc kubenswrapper[4743]: I1122 10:45:47.165992 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:45:47 crc kubenswrapper[4743]: E1122 10:45:47.167156 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:45:52 crc kubenswrapper[4743]: I1122 10:45:52.320625 4743 generic.go:334] "Generic (PLEG): container finished" podID="4cac0424-ba03-4f34-8433-acbbdcbaeb73" containerID="86f603f31806197f5119f54e26aba693dbad912c798f0930d51f7101326d230e" exitCode=0
Nov 22 10:45:52 crc kubenswrapper[4743]: I1122 10:45:52.320722 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk" event={"ID":"4cac0424-ba03-4f34-8433-acbbdcbaeb73","Type":"ContainerDied","Data":"86f603f31806197f5119f54e26aba693dbad912c798f0930d51f7101326d230e"}
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.821983 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.950227 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7q4l\" (UniqueName: \"kubernetes.io/projected/4cac0424-ba03-4f34-8433-acbbdcbaeb73-kube-api-access-n7q4l\") pod \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") "
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.950309 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-agent-neutron-config-0\") pod \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") "
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.950558 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ssh-key\") pod \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") "
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.951101 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-inventory\") pod \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") "
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.951215 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-combined-ca-bundle\") pod \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") "
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.951238 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ceph\") pod \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\" (UID: \"4cac0424-ba03-4f34-8433-acbbdcbaeb73\") "
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.956607 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cac0424-ba03-4f34-8433-acbbdcbaeb73-kube-api-access-n7q4l" (OuterVolumeSpecName: "kube-api-access-n7q4l") pod "4cac0424-ba03-4f34-8433-acbbdcbaeb73" (UID: "4cac0424-ba03-4f34-8433-acbbdcbaeb73"). InnerVolumeSpecName "kube-api-access-n7q4l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.963077 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "4cac0424-ba03-4f34-8433-acbbdcbaeb73" (UID: "4cac0424-ba03-4f34-8433-acbbdcbaeb73"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.966367 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ceph" (OuterVolumeSpecName: "ceph") pod "4cac0424-ba03-4f34-8433-acbbdcbaeb73" (UID: "4cac0424-ba03-4f34-8433-acbbdcbaeb73"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.982278 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "4cac0424-ba03-4f34-8433-acbbdcbaeb73" (UID: "4cac0424-ba03-4f34-8433-acbbdcbaeb73"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:45:53 crc kubenswrapper[4743]: I1122 10:45:53.987040 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4cac0424-ba03-4f34-8433-acbbdcbaeb73" (UID: "4cac0424-ba03-4f34-8433-acbbdcbaeb73"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.004370 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-inventory" (OuterVolumeSpecName: "inventory") pod "4cac0424-ba03-4f34-8433-acbbdcbaeb73" (UID: "4cac0424-ba03-4f34-8433-acbbdcbaeb73"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.054034 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.054102 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ceph\") on node \"crc\" DevicePath \"\""
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.054117 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7q4l\" (UniqueName: \"kubernetes.io/projected/4cac0424-ba03-4f34-8433-acbbdcbaeb73-kube-api-access-n7q4l\") on node \"crc\" DevicePath \"\""
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.054132 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.054144 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-ssh-key\") on node \"crc\" DevicePath \"\""
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.054157 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4cac0424-ba03-4f34-8433-acbbdcbaeb73-inventory\") on node \"crc\" DevicePath \"\""
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.342210 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk" event={"ID":"4cac0424-ba03-4f34-8433-acbbdcbaeb73","Type":"ContainerDied","Data":"6ba878a01313898f1050ef46afc3d8769816943d207130460be521dd7a9c1e82"}
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.342445 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba878a01313898f1050ef46afc3d8769816943d207130460be521dd7a9c1e82"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.342263 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-tzsvk"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.422870 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"]
Nov 22 10:45:54 crc kubenswrapper[4743]: E1122 10:45:54.423497 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1" containerName="collect-profiles"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.423560 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1" containerName="collect-profiles"
Nov 22 10:45:54 crc kubenswrapper[4743]: E1122 10:45:54.423649 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cac0424-ba03-4f34-8433-acbbdcbaeb73" containerName="neutron-sriov-openstack-openstack-cell1"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.423737 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cac0424-ba03-4f34-8433-acbbdcbaeb73" containerName="neutron-sriov-openstack-openstack-cell1"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.424070 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86dfb2c-f334-41ca-9bc3-226ff6dfd3c1" containerName="collect-profiles"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.424161 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cac0424-ba03-4f34-8433-acbbdcbaeb73" containerName="neutron-sriov-openstack-openstack-cell1"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.425078 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.429082 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.429086 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.429193 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.429683 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.429728 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.444182 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"]
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.564631 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkdvt\" (UniqueName: \"kubernetes.io/projected/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-kube-api-access-qkdvt\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.564699 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.564723 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.564815 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.564885 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.564922 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.666820 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.667128 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.667264 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.667395 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.667491 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.667675 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkdvt\" (UniqueName: \"kubernetes.io/projected/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-kube-api-access-qkdvt\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.671725 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.673406 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.674933 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.675084 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.679378 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.690355 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkdvt\" (UniqueName: \"kubernetes.io/projected/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-kube-api-access-qkdvt\") pod \"neutron-dhcp-openstack-openstack-cell1-nxh2g\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:54 crc kubenswrapper[4743]: I1122 10:45:54.746695 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"
Nov 22 10:45:55 crc kubenswrapper[4743]: W1122 10:45:55.274082 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f88c0f4_ddcb_4924_ab2a_3179a3f1f616.slice/crio-46e3982d6968d8a913fac313dd5431f06ae8f3713f956fd851167e5bbc5953f3 WatchSource:0}: Error finding container 46e3982d6968d8a913fac313dd5431f06ae8f3713f956fd851167e5bbc5953f3: Status 404 returned error can't find the container with id 46e3982d6968d8a913fac313dd5431f06ae8f3713f956fd851167e5bbc5953f3
Nov 22 10:45:55 crc kubenswrapper[4743]: I1122 10:45:55.274145 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g"]
Nov 22 10:45:55 crc kubenswrapper[4743]: I1122 10:45:55.352391 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g" event={"ID":"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616","Type":"ContainerStarted","Data":"46e3982d6968d8a913fac313dd5431f06ae8f3713f956fd851167e5bbc5953f3"}
Nov 22 10:45:56 crc kubenswrapper[4743]: I1122 10:45:56.362920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g" event={"ID":"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616","Type":"ContainerStarted","Data":"e862195ea33a2cc4fc1549da612c82a215c0a392f0c94438d4d6fa55b936b609"}
Nov 22 10:45:56 crc kubenswrapper[4743]: I1122 10:45:56.381881 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g" podStartSLOduration=1.864400134 podStartE2EDuration="2.381859743s" podCreationTimestamp="2025-11-22 10:45:54 +0000 UTC" firstStartedPulling="2025-11-22 10:45:55.277765056 +0000 UTC m=+8628.984126108" lastFinishedPulling="2025-11-22 10:45:55.795224665 +0000 UTC m=+8629.501585717" observedRunningTime="2025-11-22 10:45:56.375733208 +0000 UTC m=+8630.082094260" watchObservedRunningTime="2025-11-22 10:45:56.381859743 +0000 UTC m=+8630.088220795"
Nov 22 10:45:58 crc kubenswrapper[4743]: I1122 10:45:58.152430 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:45:58 crc kubenswrapper[4743]: E1122 10:45:58.153019 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:46:13 crc kubenswrapper[4743]: I1122 10:46:13.152745 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:46:13 crc kubenswrapper[4743]: E1122 10:46:13.153635 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:46:27 crc kubenswrapper[4743]: I1122 10:46:27.161057 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:46:27 crc kubenswrapper[4743]: E1122 10:46:27.161927 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:46:42 crc kubenswrapper[4743]: I1122 10:46:42.152409 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:46:42 crc kubenswrapper[4743]: E1122 10:46:42.153511 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:46:54 crc kubenswrapper[4743]: I1122 10:46:54.151415 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:46:54 crc kubenswrapper[4743]: E1122 10:46:54.152268 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:47:08 crc kubenswrapper[4743]: I1122 10:47:08.153231 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:47:08 crc kubenswrapper[4743]: E1122 10:47:08.154750 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:47:22 crc kubenswrapper[4743]: I1122 10:47:22.151551 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:47:22 crc kubenswrapper[4743]: E1122 10:47:22.152247 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:47:36 crc kubenswrapper[4743]: I1122 10:47:36.151972 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:47:36 crc kubenswrapper[4743]: E1122 10:47:36.152803 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:47:50 crc kubenswrapper[4743]: I1122 10:47:50.151417 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:47:50 crc kubenswrapper[4743]: E1122 10:47:50.152312 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:48:03 crc kubenswrapper[4743]: I1122 10:48:03.152371 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:48:03 crc kubenswrapper[4743]: E1122 10:48:03.153256 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:48:18 crc kubenswrapper[4743]: I1122 10:48:18.152404 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:48:18 crc kubenswrapper[4743]: E1122 10:48:18.153215 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.130785 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m7gdw"]
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.135171 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.142077 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7gdw"]
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.212395 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-utilities\") pod \"community-operators-m7gdw\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.212492 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c468\" (UniqueName: \"kubernetes.io/projected/ea2d006f-6698-4844-9e3d-accfb17a93c7-kube-api-access-6c468\") pod \"community-operators-m7gdw\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.212594 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-catalog-content\") pod \"community-operators-m7gdw\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.314797 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-catalog-content\") pod \"community-operators-m7gdw\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.314993 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-utilities\") pod \"community-operators-m7gdw\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.315146 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c468\" (UniqueName: \"kubernetes.io/projected/ea2d006f-6698-4844-9e3d-accfb17a93c7-kube-api-access-6c468\") pod \"community-operators-m7gdw\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.315528 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-utilities\") pod \"community-operators-m7gdw\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.315522 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-catalog-content\") pod \"community-operators-m7gdw\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.337185 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c468\" (UniqueName: \"kubernetes.io/projected/ea2d006f-6698-4844-9e3d-accfb17a93c7-kube-api-access-6c468\") pod \"community-operators-m7gdw\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:31 crc kubenswrapper[4743]: I1122 10:48:31.472151 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:32 crc kubenswrapper[4743]: I1122 10:48:32.065389 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m7gdw"]
Nov 22 10:48:32 crc kubenswrapper[4743]: I1122 10:48:32.152484 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0"
Nov 22 10:48:32 crc kubenswrapper[4743]: E1122 10:48:32.152820 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 10:48:32 crc kubenswrapper[4743]: I1122 10:48:32.160081 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7gdw" event={"ID":"ea2d006f-6698-4844-9e3d-accfb17a93c7","Type":"ContainerStarted","Data":"828ba9d846b4774c290682db5b476339be92e11aedaff19f17be81b46880da25"}
Nov 22 10:48:33 crc kubenswrapper[4743]: I1122 10:48:33.171797 4743 generic.go:334] "Generic (PLEG): container finished" podID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerID="66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899" exitCode=0
Nov 22 10:48:33 crc kubenswrapper[4743]: I1122 10:48:33.171876 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7gdw" event={"ID":"ea2d006f-6698-4844-9e3d-accfb17a93c7","Type":"ContainerDied","Data":"66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899"}
Nov 22 10:48:33 crc kubenswrapper[4743]: I1122 10:48:33.174072 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Nov 22 10:48:35 crc kubenswrapper[4743]: I1122 10:48:35.216208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7gdw" event={"ID":"ea2d006f-6698-4844-9e3d-accfb17a93c7","Type":"ContainerStarted","Data":"a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b"}
Nov 22 10:48:36 crc kubenswrapper[4743]: I1122 10:48:36.228132 4743 generic.go:334] "Generic (PLEG): container finished" podID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerID="a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b" exitCode=0
Nov 22 10:48:36 crc kubenswrapper[4743]: I1122 10:48:36.228195 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7gdw" event={"ID":"ea2d006f-6698-4844-9e3d-accfb17a93c7","Type":"ContainerDied","Data":"a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b"}
Nov 22 10:48:37 crc kubenswrapper[4743]: I1122 10:48:37.239064 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7gdw" event={"ID":"ea2d006f-6698-4844-9e3d-accfb17a93c7","Type":"ContainerStarted","Data":"8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5"}
Nov 22 10:48:37 crc kubenswrapper[4743]: I1122 10:48:37.257917 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m7gdw" podStartSLOduration=2.792892467 podStartE2EDuration="6.257895349s" podCreationTimestamp="2025-11-22 10:48:31 +0000 UTC" firstStartedPulling="2025-11-22 10:48:33.173792559 +0000 UTC m=+8786.880153611" lastFinishedPulling="2025-11-22 10:48:36.638795441 +0000 UTC m=+8790.345156493" observedRunningTime="2025-11-22 10:48:37.254288916 +0000 UTC m=+8790.960649968" watchObservedRunningTime="2025-11-22 10:48:37.257895349 +0000 UTC m=+8790.964256401"
Nov 22 10:48:41 crc kubenswrapper[4743]: I1122 10:48:41.473598 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:41 crc kubenswrapper[4743]: I1122 10:48:41.474197 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:41 crc kubenswrapper[4743]: I1122 10:48:41.525095 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:42 crc kubenswrapper[4743]: I1122 10:48:42.331336 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m7gdw"
Nov 22 10:48:42 crc kubenswrapper[4743]: I1122 10:48:42.384197 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m7gdw"]
Nov 22 10:48:44 crc kubenswrapper[4743]: I1122 10:48:44.305915 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m7gdw" podUID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerName="registry-server" containerID="cri-o://8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5" gracePeriod=2
Nov 22 10:48:44 crc kubenswrapper[4743]: I1122 10:48:44.927003 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7gdw" Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.045039 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c468\" (UniqueName: \"kubernetes.io/projected/ea2d006f-6698-4844-9e3d-accfb17a93c7-kube-api-access-6c468\") pod \"ea2d006f-6698-4844-9e3d-accfb17a93c7\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.045115 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-utilities\") pod \"ea2d006f-6698-4844-9e3d-accfb17a93c7\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.045257 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-catalog-content\") pod \"ea2d006f-6698-4844-9e3d-accfb17a93c7\" (UID: \"ea2d006f-6698-4844-9e3d-accfb17a93c7\") " Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.046019 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-utilities" (OuterVolumeSpecName: "utilities") pod "ea2d006f-6698-4844-9e3d-accfb17a93c7" (UID: "ea2d006f-6698-4844-9e3d-accfb17a93c7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.049361 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.052827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea2d006f-6698-4844-9e3d-accfb17a93c7-kube-api-access-6c468" (OuterVolumeSpecName: "kube-api-access-6c468") pod "ea2d006f-6698-4844-9e3d-accfb17a93c7" (UID: "ea2d006f-6698-4844-9e3d-accfb17a93c7"). InnerVolumeSpecName "kube-api-access-6c468". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.094056 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea2d006f-6698-4844-9e3d-accfb17a93c7" (UID: "ea2d006f-6698-4844-9e3d-accfb17a93c7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.151493 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea2d006f-6698-4844-9e3d-accfb17a93c7-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.151524 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c468\" (UniqueName: \"kubernetes.io/projected/ea2d006f-6698-4844-9e3d-accfb17a93c7-kube-api-access-6c468\") on node \"crc\" DevicePath \"\"" Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.324437 4743 generic.go:334] "Generic (PLEG): container finished" podID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerID="8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5" exitCode=0 Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.324493 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7gdw" event={"ID":"ea2d006f-6698-4844-9e3d-accfb17a93c7","Type":"ContainerDied","Data":"8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5"} Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.324530 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m7gdw" event={"ID":"ea2d006f-6698-4844-9e3d-accfb17a93c7","Type":"ContainerDied","Data":"828ba9d846b4774c290682db5b476339be92e11aedaff19f17be81b46880da25"} Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.324564 4743 scope.go:117] "RemoveContainer" containerID="8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5" Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.324771 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m7gdw" Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.355195 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m7gdw"] Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.365997 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m7gdw"] Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.366711 4743 scope.go:117] "RemoveContainer" containerID="a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b" Nov 22 10:48:45 crc kubenswrapper[4743]: I1122 10:48:45.973101 4743 scope.go:117] "RemoveContainer" containerID="66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899" Nov 22 10:48:46 crc kubenswrapper[4743]: I1122 10:48:46.019918 4743 scope.go:117] "RemoveContainer" containerID="8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5" Nov 22 10:48:46 crc kubenswrapper[4743]: E1122 10:48:46.020494 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5\": container with ID starting with 8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5 not found: ID does not exist" containerID="8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5" Nov 22 10:48:46 crc kubenswrapper[4743]: I1122 10:48:46.020530 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5"} err="failed to get container status \"8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5\": rpc error: code = NotFound desc = could not find container \"8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5\": container with ID starting with 8e6535ce1852999c40d5d716dcf7f3ff15bbdf542db8e10f50c895562df2bed5 not found: ID does not exist" Nov 22 10:48:46 crc kubenswrapper[4743]: I1122 10:48:46.020569 4743 scope.go:117] "RemoveContainer" containerID="a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b" Nov 22 10:48:46 crc kubenswrapper[4743]: E1122 10:48:46.021157 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b\": container with ID starting with a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b not found: ID does not exist" containerID="a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b" Nov 22 10:48:46 crc kubenswrapper[4743]: I1122 10:48:46.021301 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b"} err="failed to get container status \"a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b\": rpc error: code = NotFound desc = could not find container \"a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b\": container with ID starting with a4429434ace6250878228d99dd2dc1ae0eb7825e5b09e5c966a065eb7b2f853b not found: ID does not exist" Nov 22 10:48:46 crc kubenswrapper[4743]: I1122 10:48:46.021316 4743 scope.go:117] "RemoveContainer" containerID="66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899" Nov 22 10:48:46 crc kubenswrapper[4743]: E1122 10:48:46.021671 4743 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899\": container with ID starting with 66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899 not found: ID does not exist" containerID="66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899" Nov 22 10:48:46 crc kubenswrapper[4743]: I1122 10:48:46.021721 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899"} err="failed to get container status \"66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899\": rpc error: code = NotFound desc = could not find container \"66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899\": container with ID starting with 66968e0b3f402088927683d921300373150dfd9a4fdb2ebd58bd050129d5d899 not found: ID does not exist" Nov 22 10:48:46 crc kubenswrapper[4743]: I1122 10:48:46.152841 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0" Nov 22 10:48:46 crc kubenswrapper[4743]: E1122 10:48:46.153075 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:48:47 crc kubenswrapper[4743]: I1122 10:48:47.163104 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea2d006f-6698-4844-9e3d-accfb17a93c7" path="/var/lib/kubelet/pods/ea2d006f-6698-4844-9e3d-accfb17a93c7/volumes" Nov 22 10:48:57 crc kubenswrapper[4743]: I1122 10:48:57.158184 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0" Nov 22 10:48:57 crc kubenswrapper[4743]: E1122 10:48:57.158998 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:49:08 crc kubenswrapper[4743]: I1122 10:49:08.151671 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0" Nov 22 10:49:08 crc kubenswrapper[4743]: E1122 10:49:08.152681 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:49:23 crc kubenswrapper[4743]: I1122 10:49:23.151374 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0" Nov 22 10:49:23 crc kubenswrapper[4743]: E1122 10:49:23.153214 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:49:37 crc kubenswrapper[4743]: I1122 10:49:37.158370 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0" Nov 22 10:49:37 crc kubenswrapper[4743]: I1122 10:49:37.802899 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"82d7fb2ec1629cbdcf99bab7b5eb0f8726527d7c49663a3d80c02d05249f64d5"} Nov 22 10:52:01 crc kubenswrapper[4743]: I1122 10:52:01.241407 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:52:01 crc kubenswrapper[4743]: I1122 10:52:01.242095 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:52:31 crc kubenswrapper[4743]: I1122 10:52:31.242099 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:52:31 crc kubenswrapper[4743]: I1122 10:52:31.242816 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.090412 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j8v9v"] Nov 22 10:52:44 crc kubenswrapper[4743]: E1122 10:52:44.091459 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerName="extract-content" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.091473 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerName="extract-content" Nov 22 10:52:44 crc kubenswrapper[4743]: E1122 10:52:44.091497 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerName="extract-utilities" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.091503 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerName="extract-utilities" Nov 22 10:52:44 crc kubenswrapper[4743]: E1122 10:52:44.091538 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerName="registry-server" Nov 22 10:52:44 crc 
kubenswrapper[4743]: I1122 10:52:44.091544 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerName="registry-server" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.091768 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea2d006f-6698-4844-9e3d-accfb17a93c7" containerName="registry-server" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.093483 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.104923 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8v9v"] Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.212322 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-utilities\") pod \"redhat-marketplace-j8v9v\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.212451 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-225fx\" (UniqueName: \"kubernetes.io/projected/0f3787e0-b70f-4c89-a89d-be15f5840012-kube-api-access-225fx\") pod \"redhat-marketplace-j8v9v\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.212479 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-catalog-content\") pod \"redhat-marketplace-j8v9v\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.318642 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-225fx\" (UniqueName: \"kubernetes.io/projected/0f3787e0-b70f-4c89-a89d-be15f5840012-kube-api-access-225fx\") pod \"redhat-marketplace-j8v9v\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.319061 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-catalog-content\") pod \"redhat-marketplace-j8v9v\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.319544 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-catalog-content\") pod \"redhat-marketplace-j8v9v\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.319679 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-utilities\") pod \"redhat-marketplace-j8v9v\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " pod="openshift-marketplace/redhat-marketplace-j8v9v" 
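The three-step pattern in the entries above (reconciler_common.go:245 "VerifyControllerAttachedVolume started", reconciler_common.go:218 "MountVolume started", operation_generator.go:637 "MountVolume.SetUp succeeded") is the kubelet volume manager reconciling the pod's desired volumes against the actual mounted state before the sandbox can start. Below is a minimal Go sketch of that reconcile shape, using simplified illustrative types (volume, reconciler) and not kubelet's real ones; the real reconciler runs this as a periodic loop and tears volumes down in the reverse direction (the reconciler_common.go:159 UnmountVolume entries seen elsewhere in this log).

// volumesketch.go - minimal sketch of a desired-state vs actual-state
// volume reconcile, assuming simplified types; not kubelet's real code.
package main

import "fmt"

type volume struct {
	name   string // e.g. "utilities"
	unique string // e.g. "kubernetes.io/empty-dir/<pod-uid>-utilities" (illustrative)
}

type reconciler struct {
	desired []volume        // volumes the pod spec requires (desired state)
	mounted map[string]bool // uniqueName -> mounted (actual state)
}

func (r *reconciler) reconcile(pod string) {
	// Phase 1: verify every desired volume is attached to the node.
	for _, v := range r.desired {
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q (UniqueName: %q) pod=%q\n",
			v.name, v.unique, pod)
	}
	// Phase 2: mount whatever is verified but absent from the actual state.
	for _, v := range r.desired {
		if !r.mounted[v.unique] {
			fmt.Printf("MountVolume started for volume %q pod=%q\n", v.name, pod)
			r.mounted[v.unique] = true // SetUp succeeded: record in actual state
			fmt.Printf("MountVolume.SetUp succeeded for volume %q pod=%q\n", v.name, pod)
		}
	}
}

func main() {
	// Volume names mirror the redhat-marketplace-j8v9v entries above;
	// the UniqueName suffixes are shortened illustrative stand-ins.
	r := &reconciler{
		desired: []volume{
			{"utilities", "kubernetes.io/empty-dir/0f3787e0-utilities"},
			{"catalog-content", "kubernetes.io/empty-dir/0f3787e0-catalog-content"},
			{"kube-api-access-225fx", "kubernetes.io/projected/0f3787e0-kube-api-access-225fx"},
		},
		mounted: map[string]bool{},
	}
	r.reconcile("openshift-marketplace/redhat-marketplace-j8v9v")
}

Run as-is, the sketch prints the same verify -> mount -> succeeded ordering the journal shows for each of the pod's three volumes; only once all mounts land does the kubelet proceed to create the sandbox ("No sandbox for pod can be found. Need to start a new one") and start containers.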
Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.319920 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-utilities\") pod \"redhat-marketplace-j8v9v\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.347294 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-225fx\" (UniqueName: \"kubernetes.io/projected/0f3787e0-b70f-4c89-a89d-be15f5840012-kube-api-access-225fx\") pod \"redhat-marketplace-j8v9v\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.421046 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:44 crc kubenswrapper[4743]: I1122 10:52:44.952941 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8v9v"] Nov 22 10:52:45 crc kubenswrapper[4743]: I1122 10:52:45.681477 4743 generic.go:334] "Generic (PLEG): container finished" podID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerID="18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7" exitCode=0 Nov 22 10:52:45 crc kubenswrapper[4743]: I1122 10:52:45.682069 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8v9v" event={"ID":"0f3787e0-b70f-4c89-a89d-be15f5840012","Type":"ContainerDied","Data":"18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7"} Nov 22 10:52:45 crc kubenswrapper[4743]: I1122 10:52:45.682208 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8v9v" event={"ID":"0f3787e0-b70f-4c89-a89d-be15f5840012","Type":"ContainerStarted","Data":"a385101b4bc50ea76977293b7571c92a075ad224fd7dc3748e5ecaf5b7fd9903"} Nov 22 10:52:45 crc kubenswrapper[4743]: I1122 10:52:45.684414 4743 generic.go:334] "Generic (PLEG): container finished" podID="8f88c0f4-ddcb-4924-ab2a-3179a3f1f616" containerID="e862195ea33a2cc4fc1549da612c82a215c0a392f0c94438d4d6fa55b936b609" exitCode=0 Nov 22 10:52:45 crc kubenswrapper[4743]: I1122 10:52:45.684459 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g" event={"ID":"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616","Type":"ContainerDied","Data":"e862195ea33a2cc4fc1549da612c82a215c0a392f0c94438d4d6fa55b936b609"} Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.176364 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.282264 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-inventory\") pod \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.282646 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-agent-neutron-config-0\") pod \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.282686 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ceph\") pod \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.282712 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkdvt\" (UniqueName: \"kubernetes.io/projected/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-kube-api-access-qkdvt\") pod \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.282757 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ssh-key\") pod \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.282993 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-combined-ca-bundle\") pod \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\" (UID: \"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616\") " Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.289251 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ceph" (OuterVolumeSpecName: "ceph") pod "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616" (UID: "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.289778 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616" (UID: "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.296087 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-kube-api-access-qkdvt" (OuterVolumeSpecName: "kube-api-access-qkdvt") pod "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616" (UID: "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616"). InnerVolumeSpecName "kube-api-access-qkdvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.316521 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616" (UID: "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.317524 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616" (UID: "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.319584 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-inventory" (OuterVolumeSpecName: "inventory") pod "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616" (UID: "8f88c0f4-ddcb-4924-ab2a-3179a3f1f616"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.386022 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.386064 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.386078 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.386092 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkdvt\" (UniqueName: \"kubernetes.io/projected/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-kube-api-access-qkdvt\") on node \"crc\" DevicePath \"\"" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.386106 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.386115 4743 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f88c0f4-ddcb-4924-ab2a-3179a3f1f616-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.703981 4743 generic.go:334] "Generic (PLEG): container finished" podID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerID="9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6" exitCode=0 Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.704078 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8v9v" 
event={"ID":"0f3787e0-b70f-4c89-a89d-be15f5840012","Type":"ContainerDied","Data":"9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6"} Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.706759 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g" event={"ID":"8f88c0f4-ddcb-4924-ab2a-3179a3f1f616","Type":"ContainerDied","Data":"46e3982d6968d8a913fac313dd5431f06ae8f3713f956fd851167e5bbc5953f3"} Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.706804 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46e3982d6968d8a913fac313dd5431f06ae8f3713f956fd851167e5bbc5953f3" Nov 22 10:52:47 crc kubenswrapper[4743]: I1122 10:52:47.706817 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-nxh2g" Nov 22 10:52:48 crc kubenswrapper[4743]: I1122 10:52:48.718051 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8v9v" event={"ID":"0f3787e0-b70f-4c89-a89d-be15f5840012","Type":"ContainerStarted","Data":"0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b"} Nov 22 10:52:48 crc kubenswrapper[4743]: I1122 10:52:48.741463 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j8v9v" podStartSLOduration=2.324756921 podStartE2EDuration="4.741446443s" podCreationTimestamp="2025-11-22 10:52:44 +0000 UTC" firstStartedPulling="2025-11-22 10:52:45.684806968 +0000 UTC m=+9039.391168020" lastFinishedPulling="2025-11-22 10:52:48.10149648 +0000 UTC m=+9041.807857542" observedRunningTime="2025-11-22 10:52:48.738718415 +0000 UTC m=+9042.445079477" watchObservedRunningTime="2025-11-22 10:52:48.741446443 +0000 UTC m=+9042.447807495" Nov 22 10:52:54 crc kubenswrapper[4743]: I1122 10:52:54.422594 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:54 crc kubenswrapper[4743]: I1122 10:52:54.423049 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:54 crc kubenswrapper[4743]: I1122 10:52:54.676802 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:54 crc kubenswrapper[4743]: I1122 10:52:54.819115 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:54 crc kubenswrapper[4743]: I1122 10:52:54.913012 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8v9v"] Nov 22 10:52:56 crc kubenswrapper[4743]: I1122 10:52:56.794423 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j8v9v" podUID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerName="registry-server" containerID="cri-o://0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b" gracePeriod=2 Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.341263 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.507870 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-225fx\" (UniqueName: \"kubernetes.io/projected/0f3787e0-b70f-4c89-a89d-be15f5840012-kube-api-access-225fx\") pod \"0f3787e0-b70f-4c89-a89d-be15f5840012\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.508097 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-utilities\") pod \"0f3787e0-b70f-4c89-a89d-be15f5840012\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.508249 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-catalog-content\") pod \"0f3787e0-b70f-4c89-a89d-be15f5840012\" (UID: \"0f3787e0-b70f-4c89-a89d-be15f5840012\") " Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.509435 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-utilities" (OuterVolumeSpecName: "utilities") pod "0f3787e0-b70f-4c89-a89d-be15f5840012" (UID: "0f3787e0-b70f-4c89-a89d-be15f5840012"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.513642 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3787e0-b70f-4c89-a89d-be15f5840012-kube-api-access-225fx" (OuterVolumeSpecName: "kube-api-access-225fx") pod "0f3787e0-b70f-4c89-a89d-be15f5840012" (UID: "0f3787e0-b70f-4c89-a89d-be15f5840012"). InnerVolumeSpecName "kube-api-access-225fx". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.526207 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f3787e0-b70f-4c89-a89d-be15f5840012" (UID: "0f3787e0-b70f-4c89-a89d-be15f5840012"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.610834 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.610863 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f3787e0-b70f-4c89-a89d-be15f5840012-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.610874 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-225fx\" (UniqueName: \"kubernetes.io/projected/0f3787e0-b70f-4c89-a89d-be15f5840012-kube-api-access-225fx\") on node \"crc\" DevicePath \"\"" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.807566 4743 generic.go:334] "Generic (PLEG): container finished" podID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerID="0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b" exitCode=0 Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.807637 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j8v9v" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.807637 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8v9v" event={"ID":"0f3787e0-b70f-4c89-a89d-be15f5840012","Type":"ContainerDied","Data":"0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b"} Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.807755 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j8v9v" event={"ID":"0f3787e0-b70f-4c89-a89d-be15f5840012","Type":"ContainerDied","Data":"a385101b4bc50ea76977293b7571c92a075ad224fd7dc3748e5ecaf5b7fd9903"} Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.807779 4743 scope.go:117] "RemoveContainer" containerID="0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.847185 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8v9v"] Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.847434 4743 scope.go:117] "RemoveContainer" containerID="9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.858418 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j8v9v"] Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.877818 4743 scope.go:117] "RemoveContainer" containerID="18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.898297 4743 scope.go:117] "RemoveContainer" containerID="0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b" Nov 22 10:52:57 crc kubenswrapper[4743]: E1122 10:52:57.898810 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b\": container with ID starting with 0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b not found: ID does not exist" containerID="0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.898838 4743 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b"} err="failed to get container status \"0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b\": rpc error: code = NotFound desc = could not find container \"0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b\": container with ID starting with 0ec12b37a230568f635f7e04460f16461025b69b024a485bb2464e1896b8b25b not found: ID does not exist" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.898859 4743 scope.go:117] "RemoveContainer" containerID="9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6" Nov 22 10:52:57 crc kubenswrapper[4743]: E1122 10:52:57.899122 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6\": container with ID starting with 9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6 not found: ID does not exist" containerID="9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.899152 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6"} err="failed to get container status \"9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6\": rpc error: code = NotFound desc = could not find container \"9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6\": container with ID starting with 9253133bff1e9c1b68750c7284d4fc8e0c52512466dff8a7125b3a6069feffe6 not found: ID does not exist" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.899169 4743 scope.go:117] "RemoveContainer" containerID="18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7" Nov 22 10:52:57 crc kubenswrapper[4743]: E1122 10:52:57.899358 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7\": container with ID starting with 18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7 not found: ID does not exist" containerID="18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7" Nov 22 10:52:57 crc kubenswrapper[4743]: I1122 10:52:57.899375 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7"} err="failed to get container status \"18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7\": rpc error: code = NotFound desc = could not find container \"18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7\": container with ID starting with 18148a4d8829c5e4c22b21dcb7d19bbce3fe222c43e21ed37475f3d6c37bbcc7 not found: ID does not exist" Nov 22 10:52:59 crc kubenswrapper[4743]: I1122 10:52:59.162614 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3787e0-b70f-4c89-a89d-be15f5840012" path="/var/lib/kubelet/pods/0f3787e0-b70f-4c89-a89d-be15f5840012/volumes" Nov 22 10:53:01 crc kubenswrapper[4743]: I1122 10:53:01.241378 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:53:01 crc kubenswrapper[4743]: I1122 10:53:01.241806 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:53:01 crc kubenswrapper[4743]: I1122 10:53:01.241848 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 10:53:01 crc kubenswrapper[4743]: I1122 10:53:01.242698 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82d7fb2ec1629cbdcf99bab7b5eb0f8726527d7c49663a3d80c02d05249f64d5"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:53:01 crc kubenswrapper[4743]: I1122 10:53:01.242753 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://82d7fb2ec1629cbdcf99bab7b5eb0f8726527d7c49663a3d80c02d05249f64d5" gracePeriod=600 Nov 22 10:53:01 crc kubenswrapper[4743]: I1122 10:53:01.851323 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="82d7fb2ec1629cbdcf99bab7b5eb0f8726527d7c49663a3d80c02d05249f64d5" exitCode=0 Nov 22 10:53:01 crc kubenswrapper[4743]: I1122 10:53:01.851397 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"82d7fb2ec1629cbdcf99bab7b5eb0f8726527d7c49663a3d80c02d05249f64d5"} Nov 22 10:53:01 crc kubenswrapper[4743]: I1122 10:53:01.851626 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63"} Nov 22 10:53:01 crc kubenswrapper[4743]: I1122 10:53:01.851644 4743 scope.go:117] "RemoveContainer" containerID="0d2617de2819568247ff4c289dfe303d1f8bb87c7c61ed7a3cb5b21fddeeaae0" Nov 22 10:53:07 crc kubenswrapper[4743]: I1122 10:53:07.507551 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Nov 22 10:53:07 crc kubenswrapper[4743]: I1122 10:53:07.508913 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="375736c0-507a-4cb9-bf8d-b0827eb30630" containerName="nova-cell0-conductor-conductor" containerID="cri-o://524a484ad4860bf6973687c1a8123e31d7919c0e0bf05519bcf74918c9eae0ae" gracePeriod=30 Nov 22 10:53:07 crc kubenswrapper[4743]: I1122 10:53:07.562670 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Nov 22 10:53:07 crc kubenswrapper[4743]: I1122 10:53:07.563005 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="e9508ef1-7649-4ffd-84af-de9884f26e1c" containerName="nova-cell1-conductor-conductor" 
containerID="cri-o://1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932" gracePeriod=30 Nov 22 10:53:08 crc kubenswrapper[4743]: I1122 10:53:08.290957 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 10:53:08 crc kubenswrapper[4743]: I1122 10:53:08.291909 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9b35026a-cef7-4dd0-9446-429b448f7ed9" containerName="nova-scheduler-scheduler" containerID="cri-o://73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635" gracePeriod=30 Nov 22 10:53:08 crc kubenswrapper[4743]: I1122 10:53:08.307432 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Nov 22 10:53:08 crc kubenswrapper[4743]: I1122 10:53:08.331038 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 10:53:08 crc kubenswrapper[4743]: I1122 10:53:08.331289 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-log" containerID="cri-o://60f51e33a98a40374c2117fb4eaa128dc055495b07382c37e2ccde713562a10d" gracePeriod=30 Nov 22 10:53:08 crc kubenswrapper[4743]: I1122 10:53:08.331401 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-metadata" containerID="cri-o://7987c3233e70cdcf23c8dbb58b86bb2c3daa8bc7feacdc54bb9342ae2e49bbc3" gracePeriod=30 Nov 22 10:53:08 crc kubenswrapper[4743]: I1122 10:53:08.923808 4743 generic.go:334] "Generic (PLEG): container finished" podID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerID="60f51e33a98a40374c2117fb4eaa128dc055495b07382c37e2ccde713562a10d" exitCode=143 Nov 22 10:53:08 crc kubenswrapper[4743]: I1122 10:53:08.924017 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerName="nova-api-log" containerID="cri-o://10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b" gracePeriod=30 Nov 22 10:53:08 crc kubenswrapper[4743]: I1122 10:53:08.924098 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ced675f6-5342-4162-bf69-d8250ee6ba58","Type":"ContainerDied","Data":"60f51e33a98a40374c2117fb4eaa128dc055495b07382c37e2ccde713562a10d"} Nov 22 10:53:08 crc kubenswrapper[4743]: I1122 10:53:08.924540 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerName="nova-api-api" containerID="cri-o://5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786" gracePeriod=30 Nov 22 10:53:09 crc kubenswrapper[4743]: E1122 10:53:09.531121 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Nov 22 10:53:09 crc kubenswrapper[4743]: E1122 10:53:09.532877 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 22 10:53:09 crc kubenswrapper[4743]: E1122 10:53:09.535051 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Nov 22 10:53:09 crc kubenswrapper[4743]: E1122 10:53:09.535095 4743 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9b35026a-cef7-4dd0-9446-429b448f7ed9" containerName="nova-scheduler-scheduler"
Nov 22 10:53:09 crc kubenswrapper[4743]: I1122 10:53:09.935201 4743 generic.go:334] "Generic (PLEG): container finished" podID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerID="10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b" exitCode=143
Nov 22 10:53:09 crc kubenswrapper[4743]: I1122 10:53:09.935285 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86","Type":"ContainerDied","Data":"10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b"}
Nov 22 10:53:09 crc kubenswrapper[4743]: I1122 10:53:09.940240 4743 generic.go:334] "Generic (PLEG): container finished" podID="e9508ef1-7649-4ffd-84af-de9884f26e1c" containerID="1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932" exitCode=0
Nov 22 10:53:09 crc kubenswrapper[4743]: I1122 10:53:09.940284 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e9508ef1-7649-4ffd-84af-de9884f26e1c","Type":"ContainerDied","Data":"1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932"}
Nov 22 10:53:10 crc kubenswrapper[4743]: E1122 10:53:10.253043 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932 is running failed: container process not found" containerID="1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Nov 22 10:53:10 crc kubenswrapper[4743]: E1122 10:53:10.253858 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932 is running failed: container process not found" containerID="1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Nov 22 10:53:10 crc kubenswrapper[4743]: E1122 10:53:10.254270 4743 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932 is running failed: container process not found" containerID="1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Nov 22 10:53:10 crc kubenswrapper[4743]: E1122 10:53:10.254343 4743 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="e9508ef1-7649-4ffd-84af-de9884f26e1c" containerName="nova-cell1-conductor-conductor"
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.347929 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.406496 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-combined-ca-bundle\") pod \"e9508ef1-7649-4ffd-84af-de9884f26e1c\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") "
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.406595 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-config-data\") pod \"e9508ef1-7649-4ffd-84af-de9884f26e1c\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") "
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.408681 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq5vn\" (UniqueName: \"kubernetes.io/projected/e9508ef1-7649-4ffd-84af-de9884f26e1c-kube-api-access-vq5vn\") pod \"e9508ef1-7649-4ffd-84af-de9884f26e1c\" (UID: \"e9508ef1-7649-4ffd-84af-de9884f26e1c\") "
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.443167 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9508ef1-7649-4ffd-84af-de9884f26e1c-kube-api-access-vq5vn" (OuterVolumeSpecName: "kube-api-access-vq5vn") pod "e9508ef1-7649-4ffd-84af-de9884f26e1c" (UID: "e9508ef1-7649-4ffd-84af-de9884f26e1c"). InnerVolumeSpecName "kube-api-access-vq5vn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.463113 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-config-data" (OuterVolumeSpecName: "config-data") pod "e9508ef1-7649-4ffd-84af-de9884f26e1c" (UID: "e9508ef1-7649-4ffd-84af-de9884f26e1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.485739 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9508ef1-7649-4ffd-84af-de9884f26e1c" (UID: "e9508ef1-7649-4ffd-84af-de9884f26e1c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.511186 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.511219 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9508ef1-7649-4ffd-84af-de9884f26e1c-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.511229 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq5vn\" (UniqueName: \"kubernetes.io/projected/e9508ef1-7649-4ffd-84af-de9884f26e1c-kube-api-access-vq5vn\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.959506 4743 generic.go:334] "Generic (PLEG): container finished" podID="375736c0-507a-4cb9-bf8d-b0827eb30630" containerID="524a484ad4860bf6973687c1a8123e31d7919c0e0bf05519bcf74918c9eae0ae" exitCode=0
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.959633 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"375736c0-507a-4cb9-bf8d-b0827eb30630","Type":"ContainerDied","Data":"524a484ad4860bf6973687c1a8123e31d7919c0e0bf05519bcf74918c9eae0ae"}
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.961485 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e9508ef1-7649-4ffd-84af-de9884f26e1c","Type":"ContainerDied","Data":"4f5d04b5c6f2fb4c61438f14e416ce860762637fd805754a3ad943dfa138f3a0"}
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.961523 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:10 crc kubenswrapper[4743]: I1122 10:53:10.961540 4743 scope.go:117] "RemoveContainer" containerID="1050eec1673765886d0ada900b00ea92432a9d8a44c1f7a2680cc5d34762f932"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.016935 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.024854 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.058216 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 22 10:53:11 crc kubenswrapper[4743]: E1122 10:53:11.058800 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerName="extract-utilities"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.058821 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerName="extract-utilities"
Nov 22 10:53:11 crc kubenswrapper[4743]: E1122 10:53:11.058848 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerName="extract-content"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.058856 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerName="extract-content"
Nov 22 10:53:11 crc kubenswrapper[4743]: E1122 10:53:11.058871 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9508ef1-7649-4ffd-84af-de9884f26e1c" containerName="nova-cell1-conductor-conductor"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.058879 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9508ef1-7649-4ffd-84af-de9884f26e1c" containerName="nova-cell1-conductor-conductor"
Nov 22 10:53:11 crc kubenswrapper[4743]: E1122 10:53:11.058888 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerName="registry-server"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.058895 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerName="registry-server"
Nov 22 10:53:11 crc kubenswrapper[4743]: E1122 10:53:11.058910 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f88c0f4-ddcb-4924-ab2a-3179a3f1f616" containerName="neutron-dhcp-openstack-openstack-cell1"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.058918 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f88c0f4-ddcb-4924-ab2a-3179a3f1f616" containerName="neutron-dhcp-openstack-openstack-cell1"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.059203 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3787e0-b70f-4c89-a89d-be15f5840012" containerName="registry-server"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.059230 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9508ef1-7649-4ffd-84af-de9884f26e1c" containerName="nova-cell1-conductor-conductor"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.059253 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f88c0f4-ddcb-4924-ab2a-3179a3f1f616" containerName="neutron-dhcp-openstack-openstack-cell1"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.060226 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.072235 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.077977 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.125221 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kmbc\" (UniqueName: \"kubernetes.io/projected/6d4f3395-9c57-463a-8d49-66ffad381e6c-kube-api-access-6kmbc\") pod \"nova-cell1-conductor-0\" (UID: \"6d4f3395-9c57-463a-8d49-66ffad381e6c\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.125279 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f3395-9c57-463a-8d49-66ffad381e6c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d4f3395-9c57-463a-8d49-66ffad381e6c\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.125669 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4f3395-9c57-463a-8d49-66ffad381e6c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d4f3395-9c57-463a-8d49-66ffad381e6c\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.167237 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9508ef1-7649-4ffd-84af-de9884f26e1c" path="/var/lib/kubelet/pods/e9508ef1-7649-4ffd-84af-de9884f26e1c/volumes"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.228501 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kmbc\" (UniqueName: \"kubernetes.io/projected/6d4f3395-9c57-463a-8d49-66ffad381e6c-kube-api-access-6kmbc\") pod \"nova-cell1-conductor-0\" (UID: \"6d4f3395-9c57-463a-8d49-66ffad381e6c\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.228566 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f3395-9c57-463a-8d49-66ffad381e6c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d4f3395-9c57-463a-8d49-66ffad381e6c\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.228761 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4f3395-9c57-463a-8d49-66ffad381e6c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d4f3395-9c57-463a-8d49-66ffad381e6c\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.247429 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d4f3395-9c57-463a-8d49-66ffad381e6c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6d4f3395-9c57-463a-8d49-66ffad381e6c\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.250953 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kmbc\" (UniqueName: \"kubernetes.io/projected/6d4f3395-9c57-463a-8d49-66ffad381e6c-kube-api-access-6kmbc\") pod \"nova-cell1-conductor-0\" (UID: \"6d4f3395-9c57-463a-8d49-66ffad381e6c\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.251376 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d4f3395-9c57-463a-8d49-66ffad381e6c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6d4f3395-9c57-463a-8d49-66ffad381e6c\") " pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.362359 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.408402 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.431111 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-config-data\") pod \"375736c0-507a-4cb9-bf8d-b0827eb30630\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") "
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.431298 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf4lk\" (UniqueName: \"kubernetes.io/projected/375736c0-507a-4cb9-bf8d-b0827eb30630-kube-api-access-gf4lk\") pod \"375736c0-507a-4cb9-bf8d-b0827eb30630\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") "
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.431535 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-combined-ca-bundle\") pod \"375736c0-507a-4cb9-bf8d-b0827eb30630\" (UID: \"375736c0-507a-4cb9-bf8d-b0827eb30630\") "
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.438851 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375736c0-507a-4cb9-bf8d-b0827eb30630-kube-api-access-gf4lk" (OuterVolumeSpecName: "kube-api-access-gf4lk") pod "375736c0-507a-4cb9-bf8d-b0827eb30630" (UID: "375736c0-507a-4cb9-bf8d-b0827eb30630"). InnerVolumeSpecName "kube-api-access-gf4lk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.476918 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-config-data" (OuterVolumeSpecName: "config-data") pod "375736c0-507a-4cb9-bf8d-b0827eb30630" (UID: "375736c0-507a-4cb9-bf8d-b0827eb30630"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.478543 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "375736c0-507a-4cb9-bf8d-b0827eb30630" (UID: "375736c0-507a-4cb9-bf8d-b0827eb30630"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.537365 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.537423 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/375736c0-507a-4cb9-bf8d-b0827eb30630-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.537437 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf4lk\" (UniqueName: \"kubernetes.io/projected/375736c0-507a-4cb9-bf8d-b0827eb30630-kube-api-access-gf4lk\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.753131 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": read tcp 10.217.0.2:34646->10.217.1.82:8775: read: connection reset by peer"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.753199 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.82:8775/\": read tcp 10.217.0.2:34634->10.217.1.82:8775: read: connection reset by peer"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.903396 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.990161 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6d4f3395-9c57-463a-8d49-66ffad381e6c","Type":"ContainerStarted","Data":"47effd73451db0d3573df8b49b8470610ac4e635e9590e6c890efc8f7e0905c7"}
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.992274 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.992271 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"375736c0-507a-4cb9-bf8d-b0827eb30630","Type":"ContainerDied","Data":"361bddd3c49bbf423431c1a60063dce1d35a2bb28c8f44310dc2a957f5dec040"}
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.992464 4743 scope.go:117] "RemoveContainer" containerID="524a484ad4860bf6973687c1a8123e31d7919c0e0bf05519bcf74918c9eae0ae"
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.999875 4743 generic.go:334] "Generic (PLEG): container finished" podID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerID="7987c3233e70cdcf23c8dbb58b86bb2c3daa8bc7feacdc54bb9342ae2e49bbc3" exitCode=0
Nov 22 10:53:11 crc kubenswrapper[4743]: I1122 10:53:11.999928 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ced675f6-5342-4162-bf69-d8250ee6ba58","Type":"ContainerDied","Data":"7987c3233e70cdcf23c8dbb58b86bb2c3daa8bc7feacdc54bb9342ae2e49bbc3"}
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.341443 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.352298 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.352766 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.375542 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 22 10:53:12 crc kubenswrapper[4743]: E1122 10:53:12.376118 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-metadata"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.376140 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-metadata"
Nov 22 10:53:12 crc kubenswrapper[4743]: E1122 10:53:12.376172 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375736c0-507a-4cb9-bf8d-b0827eb30630" containerName="nova-cell0-conductor-conductor"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.376181 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="375736c0-507a-4cb9-bf8d-b0827eb30630" containerName="nova-cell0-conductor-conductor"
Nov 22 10:53:12 crc kubenswrapper[4743]: E1122 10:53:12.376227 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-log"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.376235 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-log"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.376459 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-metadata"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.376495 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" containerName="nova-metadata-log"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.376514 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="375736c0-507a-4cb9-bf8d-b0827eb30630" containerName="nova-cell0-conductor-conductor"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.377396 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.381801 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.390724 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.463273 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ced675f6-5342-4162-bf69-d8250ee6ba58-logs\") pod \"ced675f6-5342-4162-bf69-d8250ee6ba58\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") "
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.463459 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzksw\" (UniqueName: \"kubernetes.io/projected/ced675f6-5342-4162-bf69-d8250ee6ba58-kube-api-access-nzksw\") pod \"ced675f6-5342-4162-bf69-d8250ee6ba58\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") "
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.463531 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-config-data\") pod \"ced675f6-5342-4162-bf69-d8250ee6ba58\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") "
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.463646 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-combined-ca-bundle\") pod \"ced675f6-5342-4162-bf69-d8250ee6ba58\" (UID: \"ced675f6-5342-4162-bf69-d8250ee6ba58\") "
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.463986 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0900ac26-4cb6-4e32-bce4-b2cfce7a18a5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5\") " pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.464107 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z79jn\" (UniqueName: \"kubernetes.io/projected/0900ac26-4cb6-4e32-bce4-b2cfce7a18a5-kube-api-access-z79jn\") pod \"nova-cell0-conductor-0\" (UID: \"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5\") " pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.464132 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0900ac26-4cb6-4e32-bce4-b2cfce7a18a5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5\") " pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.465230 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ced675f6-5342-4162-bf69-d8250ee6ba58-logs" (OuterVolumeSpecName: "logs") pod "ced675f6-5342-4162-bf69-d8250ee6ba58" (UID: "ced675f6-5342-4162-bf69-d8250ee6ba58"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.472074 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced675f6-5342-4162-bf69-d8250ee6ba58-kube-api-access-nzksw" (OuterVolumeSpecName: "kube-api-access-nzksw") pod "ced675f6-5342-4162-bf69-d8250ee6ba58" (UID: "ced675f6-5342-4162-bf69-d8250ee6ba58"). InnerVolumeSpecName "kube-api-access-nzksw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.510204 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ced675f6-5342-4162-bf69-d8250ee6ba58" (UID: "ced675f6-5342-4162-bf69-d8250ee6ba58"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.537017 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-config-data" (OuterVolumeSpecName: "config-data") pod "ced675f6-5342-4162-bf69-d8250ee6ba58" (UID: "ced675f6-5342-4162-bf69-d8250ee6ba58"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.579743 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.585786 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z79jn\" (UniqueName: \"kubernetes.io/projected/0900ac26-4cb6-4e32-bce4-b2cfce7a18a5-kube-api-access-z79jn\") pod \"nova-cell0-conductor-0\" (UID: \"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5\") " pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.585874 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0900ac26-4cb6-4e32-bce4-b2cfce7a18a5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5\") " pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.586151 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0900ac26-4cb6-4e32-bce4-b2cfce7a18a5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5\") " pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.586549 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzksw\" (UniqueName: \"kubernetes.io/projected/ced675f6-5342-4162-bf69-d8250ee6ba58-kube-api-access-nzksw\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.586575 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.586599 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ced675f6-5342-4162-bf69-d8250ee6ba58-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.586614 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ced675f6-5342-4162-bf69-d8250ee6ba58-logs\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.590787 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0900ac26-4cb6-4e32-bce4-b2cfce7a18a5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5\") " pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.597265 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0900ac26-4cb6-4e32-bce4-b2cfce7a18a5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5\") " pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.635868 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z79jn\" (UniqueName: \"kubernetes.io/projected/0900ac26-4cb6-4e32-bce4-b2cfce7a18a5-kube-api-access-z79jn\") pod \"nova-cell0-conductor-0\" (UID: \"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5\") " pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.689520 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xxvn\" (UniqueName: \"kubernetes.io/projected/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-kube-api-access-2xxvn\") pod \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") "
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.689934 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-logs\") pod \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") "
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.689993 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-combined-ca-bundle\") pod \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") "
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.690078 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-config-data\") pod \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\" (UID: \"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86\") "
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.690899 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-logs" (OuterVolumeSpecName: "logs") pod "e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" (UID: "e3c354a8-2d13-4d82-9bd5-1311e1fc5f86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.701202 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-kube-api-access-2xxvn" (OuterVolumeSpecName: "kube-api-access-2xxvn") pod "e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" (UID: "e3c354a8-2d13-4d82-9bd5-1311e1fc5f86"). InnerVolumeSpecName "kube-api-access-2xxvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.717817 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.725380 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" (UID: "e3c354a8-2d13-4d82-9bd5-1311e1fc5f86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.747947 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-config-data" (OuterVolumeSpecName: "config-data") pod "e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" (UID: "e3c354a8-2d13-4d82-9bd5-1311e1fc5f86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.792385 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xxvn\" (UniqueName: \"kubernetes.io/projected/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-kube-api-access-2xxvn\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.792428 4743 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-logs\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.792439 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:12 crc kubenswrapper[4743]: I1122 10:53:12.792447 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86-config-data\") on node \"crc\" DevicePath \"\""
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.013998 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6d4f3395-9c57-463a-8d49-66ffad381e6c","Type":"ContainerStarted","Data":"0bcf9a9a4c22f0eeffc90dada4a2c2917c59f2b29c607ec4d3b67e77ffd3cfbf"}
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.014338 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.018369 4743 generic.go:334] "Generic (PLEG): container finished" podID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerID="5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786" exitCode=0
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.018441 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86","Type":"ContainerDied","Data":"5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786"}
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.018468 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.018487 4743 scope.go:117] "RemoveContainer" containerID="5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.018474 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e3c354a8-2d13-4d82-9bd5-1311e1fc5f86","Type":"ContainerDied","Data":"c2081bc6e343bda29d9986ce0d15ed78c154f2418843d610782542d621e60c81"}
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.024311 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ced675f6-5342-4162-bf69-d8250ee6ba58","Type":"ContainerDied","Data":"e07d632807aebf976a7a4c4ce0859a47f2995c4071fa7a6fde07ed41e11c2412"}
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.024376 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.049826 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.049808326 podStartE2EDuration="2.049808326s" podCreationTimestamp="2025-11-22 10:53:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:53:13.041984723 +0000 UTC m=+9066.748345775" watchObservedRunningTime="2025-11-22 10:53:13.049808326 +0000 UTC m=+9066.756169378"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.082721 4743 scope.go:117] "RemoveContainer" containerID="10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.089528 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.117377 4743 scope.go:117] "RemoveContainer" containerID="5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786"
Nov 22 10:53:13 crc kubenswrapper[4743]: E1122 10:53:13.119760 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786\": container with ID starting with 5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786 not found: ID does not exist" containerID="5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.119806 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786"} err="failed to get container status \"5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786\": rpc error: code = NotFound desc = could not find container \"5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786\": container with ID starting with 5cd17367dc3e7676e54c6a4ba0ff8a7c0e7c26165c9921732f61edb9230c4786 not found: ID does not exist"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.119837 4743 scope.go:117] "RemoveContainer" containerID="10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b"
Nov 22 10:53:13 crc kubenswrapper[4743]: E1122 10:53:13.120125 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b\": container with ID starting with 10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b not found: ID does not exist" containerID="10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.120144 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b"} err="failed to get container status \"10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b\": rpc error: code = NotFound desc = could not find container \"10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b\": container with ID starting with 10f8f75287f35b259bcf2ab1b1df632faa2de1c8a01faa1eaca13d11c569bd4b not found: ID does not exist"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.120157 4743 scope.go:117] "RemoveContainer" containerID="7987c3233e70cdcf23c8dbb58b86bb2c3daa8bc7feacdc54bb9342ae2e49bbc3"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.125998 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.133638 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.147648 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Nov 22 10:53:13 crc kubenswrapper[4743]: E1122 10:53:13.148170 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerName="nova-api-api"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.148182 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerName="nova-api-api"
Nov 22 10:53:13 crc kubenswrapper[4743]: E1122 10:53:13.148198 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerName="nova-api-log"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.148203 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerName="nova-api-log"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.148408 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerName="nova-api-log"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.148435 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" containerName="nova-api-api"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.149758 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.156533 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.190291 4743 scope.go:117] "RemoveContainer" containerID="60f51e33a98a40374c2117fb4eaa128dc055495b07382c37e2ccde713562a10d"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.193201 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375736c0-507a-4cb9-bf8d-b0827eb30630" path="/var/lib/kubelet/pods/375736c0-507a-4cb9-bf8d-b0827eb30630/volumes"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.193765 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3c354a8-2d13-4d82-9bd5-1311e1fc5f86" path="/var/lib/kubelet/pods/e3c354a8-2d13-4d82-9bd5-1311e1fc5f86/volumes"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.195818 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.195856 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.200599 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-config-data\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.200970 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-logs\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.201202 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.201304 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk6z6\" (UniqueName: \"kubernetes.io/projected/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-kube-api-access-rk6z6\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.213671 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.218247 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.227219 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.251974 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Nov 22 10:53:13 crc kubenswrapper[4743]: W1122 10:53:13.270512 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0900ac26_4cb6_4e32_bce4_b2cfce7a18a5.slice/crio-214d3c41d3e52c37400cd5af0f322dd478d14754fafac73c3404bc099e7d7c48 WatchSource:0}: Error finding container 214d3c41d3e52c37400cd5af0f322dd478d14754fafac73c3404bc099e7d7c48: Status 404 returned error can't find the container with id 214d3c41d3e52c37400cd5af0f322dd478d14754fafac73c3404bc099e7d7c48
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.287995 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.303857 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4f6482d-edd6-49ad-8ef5-625281832a7b-config-data\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.304189 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.304240 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk6z6\" (UniqueName: \"kubernetes.io/projected/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-kube-api-access-rk6z6\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.304284 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2s45\" (UniqueName: \"kubernetes.io/projected/e4f6482d-edd6-49ad-8ef5-625281832a7b-kube-api-access-k2s45\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.304339 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4f6482d-edd6-49ad-8ef5-625281832a7b-logs\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.304356 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-config-data\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.304394 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4f6482d-edd6-49ad-8ef5-625281832a7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.304431 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-logs\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.304928 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-logs\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.309727 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"]
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.312002 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.315124 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.315637 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.315686 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.316395 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.316595 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.319396 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-config-data\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.319997 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.327646 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-t2kg4"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.336313 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk6z6\" (UniqueName: \"kubernetes.io/projected/d74dbd18-0a15-48c8-98f8-c9f4c67e82bd-kube-api-access-rk6z6\") pod \"nova-api-0\" (UID: \"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd\") " pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.343766 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.345234 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"]
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.406293 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.406361 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4f6482d-edd6-49ad-8ef5-625281832a7b-logs\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.406440 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4f6482d-edd6-49ad-8ef5-625281832a7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.406808 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4f6482d-edd6-49ad-8ef5-625281832a7b-logs\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.406978 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407013 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407052 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407074 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407097 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407161 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4f6482d-edd6-49ad-8ef5-625281832a7b-config-data\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407178 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407217 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmvjw\" (UniqueName: \"kubernetes.io/projected/aece6c0f-51a9-4480-8b13-0da51fca1fc8-kube-api-access-kmvjw\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407237 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407263 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407287 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.407320 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2s45\" (UniqueName: \"kubernetes.io/projected/e4f6482d-edd6-49ad-8ef5-625281832a7b-kube-api-access-k2s45\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.410755 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4f6482d-edd6-49ad-8ef5-625281832a7b-config-data\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.411473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4f6482d-edd6-49ad-8ef5-625281832a7b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.429473 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2s45\" (UniqueName: \"kubernetes.io/projected/e4f6482d-edd6-49ad-8ef5-625281832a7b-kube-api-access-k2s45\") pod \"nova-metadata-0\" (UID: \"e4f6482d-edd6-49ad-8ef5-625281832a7b\") " pod="openstack/nova-metadata-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.481393 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.508846 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.508908 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.508971 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.510321 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.510480 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.510622 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.510944 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.511016 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.511148 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmvjw\" (UniqueName: \"kubernetes.io/projected/aece6c0f-51a9-4480-8b13-0da51fca1fc8-kube-api-access-kmvjw\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.511190 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.511240 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"
Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.511358 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") "
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.511662 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.514233 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.514270 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.515341 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.515612 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.515790 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.515897 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.516632 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ceph\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.519131 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.532916 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmvjw\" (UniqueName: \"kubernetes.io/projected/aece6c0f-51a9-4480-8b13-0da51fca1fc8-kube-api-access-kmvjw\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.557241 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.561235 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 10:53:13 crc kubenswrapper[4743]: I1122 10:53:13.922375 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.025813 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-combined-ca-bundle\") pod \"9b35026a-cef7-4dd0-9446-429b448f7ed9\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.025941 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7tdj\" (UniqueName: \"kubernetes.io/projected/9b35026a-cef7-4dd0-9446-429b448f7ed9-kube-api-access-g7tdj\") pod \"9b35026a-cef7-4dd0-9446-429b448f7ed9\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.025973 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-config-data\") pod \"9b35026a-cef7-4dd0-9446-429b448f7ed9\" (UID: \"9b35026a-cef7-4dd0-9446-429b448f7ed9\") " Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.077804 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b35026a-cef7-4dd0-9446-429b448f7ed9-kube-api-access-g7tdj" (OuterVolumeSpecName: "kube-api-access-g7tdj") pod "9b35026a-cef7-4dd0-9446-429b448f7ed9" (UID: "9b35026a-cef7-4dd0-9446-429b448f7ed9"). InnerVolumeSpecName "kube-api-access-g7tdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.119296 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b35026a-cef7-4dd0-9446-429b448f7ed9" (UID: "9b35026a-cef7-4dd0-9446-429b448f7ed9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.122961 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5","Type":"ContainerStarted","Data":"80e9f5378918e76ff27fc05e1a7980f245a5149447a43ffa90a6bb3fe2fb8930"} Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.123097 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"0900ac26-4cb6-4e32-bce4-b2cfce7a18a5","Type":"ContainerStarted","Data":"214d3c41d3e52c37400cd5af0f322dd478d14754fafac73c3404bc099e7d7c48"} Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.124392 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.130128 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.130181 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7tdj\" (UniqueName: \"kubernetes.io/projected/9b35026a-cef7-4dd0-9446-429b448f7ed9-kube-api-access-g7tdj\") on node \"crc\" DevicePath \"\"" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.172235 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.172213707 podStartE2EDuration="2.172213707s" podCreationTimestamp="2025-11-22 10:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:53:14.154292775 +0000 UTC m=+9067.860653827" watchObservedRunningTime="2025-11-22 10:53:14.172213707 +0000 UTC m=+9067.878574759" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.215897 4743 generic.go:334] "Generic (PLEG): container finished" podID="9b35026a-cef7-4dd0-9446-429b448f7ed9" containerID="73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635" exitCode=0 Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.219729 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.220635 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b35026a-cef7-4dd0-9446-429b448f7ed9","Type":"ContainerDied","Data":"73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635"} Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.220674 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b35026a-cef7-4dd0-9446-429b448f7ed9","Type":"ContainerDied","Data":"ca885b9877e2acb7dc9fdb15427eebd92441fc456b271973a7b5edec901980c0"} Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.220692 4743 scope.go:117] "RemoveContainer" containerID="73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.220693 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-config-data" (OuterVolumeSpecName: "config-data") pod "9b35026a-cef7-4dd0-9446-429b448f7ed9" (UID: "9b35026a-cef7-4dd0-9446-429b448f7ed9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.231864 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b35026a-cef7-4dd0-9446-429b448f7ed9-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.326949 4743 scope.go:117] "RemoveContainer" containerID="73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635" Nov 22 10:53:14 crc kubenswrapper[4743]: E1122 10:53:14.329219 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635\": container with ID starting with 73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635 not found: ID does not exist" containerID="73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.329276 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635"} err="failed to get container status \"73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635\": rpc error: code = NotFound desc = could not find container \"73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635\": container with ID starting with 73587c660195581e60af266ec8fdb04b8f17460643cb004f7279f81d8bacf635 not found: ID does not exist" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.339883 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.535348 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.588674 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.607638 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.627232 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 10:53:14 crc kubenswrapper[4743]: 
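The "ContainerStatus from runtime service failed ... NotFound" error just below the RemoveContainer call is a benign race: the container was already gone by the time the kubelet asked the runtime about it. Reconcilers typically absorb this by treating "already deleted" as success; a sketch of that idiom (the `removeContainer` stub and error value are illustrative, not CRI-O's API):

```go
package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("container not found")

// removeContainer stands in for a runtime RPC; here it always reports
// that the container is already gone.
func removeContainer(id string) error { return errNotFound }

// deleteIdempotent treats "already deleted" as success, absorbing the
// race seen in the log above.
func deleteIdempotent(id string) error {
	if err := removeContainer(id); err != nil {
		if errors.Is(err, errNotFound) {
			return nil // desired state (container gone) already holds
		}
		return err
	}
	return nil
}

func main() {
	fmt.Println(deleteIdempotent("73587c660195")) // <nil>
}
```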
E1122 10:53:14.627873 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b35026a-cef7-4dd0-9446-429b448f7ed9" containerName="nova-scheduler-scheduler" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.627889 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b35026a-cef7-4dd0-9446-429b448f7ed9" containerName="nova-scheduler-scheduler" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.628175 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b35026a-cef7-4dd0-9446-429b448f7ed9" containerName="nova-scheduler-scheduler" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.629263 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.631979 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.645501 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.716440 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms"] Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.753350 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/248b4908-139e-45b1-a6cf-b398b9e23b90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"248b4908-139e-45b1-a6cf-b398b9e23b90\") " pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.753976 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248b4908-139e-45b1-a6cf-b398b9e23b90-config-data\") pod \"nova-scheduler-0\" (UID: \"248b4908-139e-45b1-a6cf-b398b9e23b90\") " pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.754228 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t678n\" (UniqueName: \"kubernetes.io/projected/248b4908-139e-45b1-a6cf-b398b9e23b90-kube-api-access-t678n\") pod \"nova-scheduler-0\" (UID: \"248b4908-139e-45b1-a6cf-b398b9e23b90\") " pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.855648 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/248b4908-139e-45b1-a6cf-b398b9e23b90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"248b4908-139e-45b1-a6cf-b398b9e23b90\") " pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.855713 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248b4908-139e-45b1-a6cf-b398b9e23b90-config-data\") pod \"nova-scheduler-0\" (UID: \"248b4908-139e-45b1-a6cf-b398b9e23b90\") " pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.855789 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t678n\" (UniqueName: \"kubernetes.io/projected/248b4908-139e-45b1-a6cf-b398b9e23b90-kube-api-access-t678n\") pod \"nova-scheduler-0\" (UID: \"248b4908-139e-45b1-a6cf-b398b9e23b90\") " pod="openstack/nova-scheduler-0" Nov 22 
10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.859768 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/248b4908-139e-45b1-a6cf-b398b9e23b90-config-data\") pod \"nova-scheduler-0\" (UID: \"248b4908-139e-45b1-a6cf-b398b9e23b90\") " pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.862780 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/248b4908-139e-45b1-a6cf-b398b9e23b90-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"248b4908-139e-45b1-a6cf-b398b9e23b90\") " pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.873359 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t678n\" (UniqueName: \"kubernetes.io/projected/248b4908-139e-45b1-a6cf-b398b9e23b90-kube-api-access-t678n\") pod \"nova-scheduler-0\" (UID: \"248b4908-139e-45b1-a6cf-b398b9e23b90\") " pod="openstack/nova-scheduler-0" Nov 22 10:53:14 crc kubenswrapper[4743]: I1122 10:53:14.949535 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.172267 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b35026a-cef7-4dd0-9446-429b448f7ed9" path="/var/lib/kubelet/pods/9b35026a-cef7-4dd0-9446-429b448f7ed9/volumes" Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.177004 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced675f6-5342-4162-bf69-d8250ee6ba58" path="/var/lib/kubelet/pods/ced675f6-5342-4162-bf69-d8250ee6ba58/volumes" Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.228951 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" event={"ID":"aece6c0f-51a9-4480-8b13-0da51fca1fc8","Type":"ContainerStarted","Data":"bfc692193cc20f7e925b52bab64bab25a23364eac9c6a0f55befc635dd28ac56"} Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.244302 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4f6482d-edd6-49ad-8ef5-625281832a7b","Type":"ContainerStarted","Data":"3292cd2a471977aded7dde106b14617ec24f1806cdbee1b97566429167a25549"} Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.244385 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4f6482d-edd6-49ad-8ef5-625281832a7b","Type":"ContainerStarted","Data":"b248592f450cf63f1c168c8443348a4a82ac67a62ed0de5fe1c431d796f2d9c3"} Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.244400 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e4f6482d-edd6-49ad-8ef5-625281832a7b","Type":"ContainerStarted","Data":"192ceff25cb2d869b9fa08c63b93356e73101430ac54da84f548b1dfacc77f50"} Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.252414 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd","Type":"ContainerStarted","Data":"f0af748fc9d9e4e4d7741459f10101cb48f51b7cbfb5b63d42849c240f3ad32f"} Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.252874 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd","Type":"ContainerStarted","Data":"5a919c0d02032cc0da67ce5e29a567778e45b785856f2cdbbee74deba603142f"} Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.252920 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d74dbd18-0a15-48c8-98f8-c9f4c67e82bd","Type":"ContainerStarted","Data":"e0fb7f29708f2641583af009bfce8ff50902c79a39d8604c95b0707986105e49"} Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.270157 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.270124948 podStartE2EDuration="2.270124948s" podCreationTimestamp="2025-11-22 10:53:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:53:15.266739851 +0000 UTC m=+9068.973100913" watchObservedRunningTime="2025-11-22 10:53:15.270124948 +0000 UTC m=+9068.976486000" Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.297201 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.297177092 podStartE2EDuration="2.297177092s" podCreationTimestamp="2025-11-22 10:53:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:53:15.284482929 +0000 UTC m=+9068.990843971" watchObservedRunningTime="2025-11-22 10:53:15.297177092 +0000 UTC m=+9069.003538144" Nov 22 10:53:15 crc kubenswrapper[4743]: I1122 10:53:15.447202 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Nov 22 10:53:15 crc kubenswrapper[4743]: W1122 10:53:15.746763 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod248b4908_139e_45b1_a6cf_b398b9e23b90.slice/crio-e568e91b96ad0e2d9e36e24009702f8542602024f6530f6b40ca9ed4c4265a40 WatchSource:0}: Error finding container e568e91b96ad0e2d9e36e24009702f8542602024f6530f6b40ca9ed4c4265a40: Status 404 returned error can't find the container with id e568e91b96ad0e2d9e36e24009702f8542602024f6530f6b40ca9ed4c4265a40 Nov 22 10:53:16 crc kubenswrapper[4743]: I1122 10:53:16.272865 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" event={"ID":"aece6c0f-51a9-4480-8b13-0da51fca1fc8","Type":"ContainerStarted","Data":"ed29bd136cfc4a8a77c7465c55c835b79e41265f9525bb626fc991a677f0125f"} Nov 22 10:53:16 crc kubenswrapper[4743]: I1122 10:53:16.275913 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"248b4908-139e-45b1-a6cf-b398b9e23b90","Type":"ContainerStarted","Data":"eb52c5e97761b67627085f4a4e45ac3439bac4c2446686e3ca94616518ca80a5"} Nov 22 10:53:16 crc kubenswrapper[4743]: I1122 10:53:16.275987 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"248b4908-139e-45b1-a6cf-b398b9e23b90","Type":"ContainerStarted","Data":"e568e91b96ad0e2d9e36e24009702f8542602024f6530f6b40ca9ed4c4265a40"} Nov 22 10:53:16 crc kubenswrapper[4743]: I1122 10:53:16.307855 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" podStartSLOduration=2.8066638729999998 podStartE2EDuration="3.307834076s" podCreationTimestamp="2025-11-22 10:53:13 +0000 UTC" 
firstStartedPulling="2025-11-22 10:53:14.732649496 +0000 UTC m=+9068.439010558" lastFinishedPulling="2025-11-22 10:53:15.233819719 +0000 UTC m=+9068.940180761" observedRunningTime="2025-11-22 10:53:16.300675761 +0000 UTC m=+9070.007036823" watchObservedRunningTime="2025-11-22 10:53:16.307834076 +0000 UTC m=+9070.014195138" Nov 22 10:53:16 crc kubenswrapper[4743]: I1122 10:53:16.329048 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.329031012 podStartE2EDuration="2.329031012s" podCreationTimestamp="2025-11-22 10:53:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 10:53:16.320704554 +0000 UTC m=+9070.027065596" watchObservedRunningTime="2025-11-22 10:53:16.329031012 +0000 UTC m=+9070.035392064" Nov 22 10:53:18 crc kubenswrapper[4743]: I1122 10:53:18.558129 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 10:53:18 crc kubenswrapper[4743]: I1122 10:53:18.558859 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Nov 22 10:53:19 crc kubenswrapper[4743]: I1122 10:53:19.950039 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Nov 22 10:53:21 crc kubenswrapper[4743]: I1122 10:53:21.441655 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Nov 22 10:53:22 crc kubenswrapper[4743]: I1122 10:53:22.749176 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Nov 22 10:53:23 crc kubenswrapper[4743]: I1122 10:53:23.482253 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 10:53:23 crc kubenswrapper[4743]: I1122 10:53:23.482315 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Nov 22 10:53:23 crc kubenswrapper[4743]: I1122 10:53:23.558454 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 10:53:23 crc kubenswrapper[4743]: I1122 10:53:23.558512 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Nov 22 10:53:24 crc kubenswrapper[4743]: I1122 10:53:24.563789 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d74dbd18-0a15-48c8-98f8-c9f4c67e82bd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 10:53:24 crc kubenswrapper[4743]: I1122 10:53:24.563822 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d74dbd18-0a15-48c8-98f8-c9f4c67e82bd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 10:53:24 crc kubenswrapper[4743]: I1122 10:53:24.646794 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e4f6482d-edd6-49ad-8ef5-625281832a7b" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 10:53:24 crc kubenswrapper[4743]: I1122 
10:53:24.646835 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e4f6482d-edd6-49ad-8ef5-625281832a7b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 10:53:24 crc kubenswrapper[4743]: I1122 10:53:24.950548 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Nov 22 10:53:24 crc kubenswrapper[4743]: I1122 10:53:24.991152 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Nov 22 10:53:25 crc kubenswrapper[4743]: I1122 10:53:25.394757 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Nov 22 10:53:33 crc kubenswrapper[4743]: I1122 10:53:33.484819 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 10:53:33 crc kubenswrapper[4743]: I1122 10:53:33.485434 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Nov 22 10:53:33 crc kubenswrapper[4743]: I1122 10:53:33.486022 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 10:53:33 crc kubenswrapper[4743]: I1122 10:53:33.486449 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Nov 22 10:53:33 crc kubenswrapper[4743]: I1122 10:53:33.491314 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 10:53:33 crc kubenswrapper[4743]: I1122 10:53:33.491877 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Nov 22 10:53:33 crc kubenswrapper[4743]: I1122 10:53:33.567859 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 10:53:33 crc kubenswrapper[4743]: I1122 10:53:33.568157 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Nov 22 10:53:33 crc kubenswrapper[4743]: I1122 10:53:33.574660 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 10:53:34 crc kubenswrapper[4743]: I1122 10:53:34.446988 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.453547 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j4lgh"] Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.464923 4743 util.go:30] "No sandbox for pod can be found. 
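The startup-probe failures in this window are the kubelet's HTTP prober giving up after its timeout (timeoutSeconds defaults to 1 in Kubernetes), and the "context deadline exceeded (Client.Timeout exceeded while awaiting headers)" text is exactly what Go's http.Client produces in that case. A minimal reproduction (URL taken from the log; reaching it from outside the cluster is not assumed to work):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// One-second timeout mirrors the kubelet's default probe timeout.
	client := &http.Client{Timeout: 1 * time.Second}

	_, err := client.Get("http://10.217.1.190:8774/")
	if err != nil {
		// A slow endpoint yields: "... context deadline exceeded
		// (Client.Timeout exceeded while awaiting headers)"
		fmt.Println("probe failed:", err)
	}
}
```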
Need to start a new one" pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.472261 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j4lgh"] Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.644147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-catalog-content\") pod \"certified-operators-j4lgh\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.644344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtgmt\" (UniqueName: \"kubernetes.io/projected/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-kube-api-access-xtgmt\") pod \"certified-operators-j4lgh\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.644377 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-utilities\") pod \"certified-operators-j4lgh\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.746169 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtgmt\" (UniqueName: \"kubernetes.io/projected/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-kube-api-access-xtgmt\") pod \"certified-operators-j4lgh\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.746235 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-utilities\") pod \"certified-operators-j4lgh\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.746362 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-catalog-content\") pod \"certified-operators-j4lgh\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.747039 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-catalog-content\") pod \"certified-operators-j4lgh\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.747624 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-utilities\") pod \"certified-operators-j4lgh\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.770634 4743 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xtgmt\" (UniqueName: \"kubernetes.io/projected/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-kube-api-access-xtgmt\") pod \"certified-operators-j4lgh\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:10 crc kubenswrapper[4743]: I1122 10:54:10.802514 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:11 crc kubenswrapper[4743]: I1122 10:54:11.390817 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j4lgh"] Nov 22 10:54:11 crc kubenswrapper[4743]: I1122 10:54:11.791405 4743 generic.go:334] "Generic (PLEG): container finished" podID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerID="b112ded39545a8e64196dc8b722f1f6984f8a8b37a72987dbc722be9f6d5263c" exitCode=0 Nov 22 10:54:11 crc kubenswrapper[4743]: I1122 10:54:11.791460 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4lgh" event={"ID":"058c4e46-0368-4ecb-a33f-6bf41a2a4d48","Type":"ContainerDied","Data":"b112ded39545a8e64196dc8b722f1f6984f8a8b37a72987dbc722be9f6d5263c"} Nov 22 10:54:11 crc kubenswrapper[4743]: I1122 10:54:11.791522 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4lgh" event={"ID":"058c4e46-0368-4ecb-a33f-6bf41a2a4d48","Type":"ContainerStarted","Data":"14481d18b350d5088cd0bda885594f02e8c17d46aaf6d1ca2fed67e39e160fdf"} Nov 22 10:54:11 crc kubenswrapper[4743]: I1122 10:54:11.794010 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 10:54:12 crc kubenswrapper[4743]: I1122 10:54:12.802794 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4lgh" event={"ID":"058c4e46-0368-4ecb-a33f-6bf41a2a4d48","Type":"ContainerStarted","Data":"e22e24a95d6b0fb8d9f07bb08a02ec917c32cb9b4344ad0be36dbc5f1ffd9c9e"} Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.459760 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-frwwm"] Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.464594 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.473786 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-frwwm"] Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.616938 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-utilities\") pod \"redhat-operators-frwwm\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.617029 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-catalog-content\") pod \"redhat-operators-frwwm\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.617344 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g82sk\" (UniqueName: \"kubernetes.io/projected/ee3e03e0-b6cb-432c-9553-660658c98683-kube-api-access-g82sk\") pod \"redhat-operators-frwwm\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.719948 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g82sk\" (UniqueName: \"kubernetes.io/projected/ee3e03e0-b6cb-432c-9553-660658c98683-kube-api-access-g82sk\") pod \"redhat-operators-frwwm\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.720048 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-utilities\") pod \"redhat-operators-frwwm\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.720126 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-catalog-content\") pod \"redhat-operators-frwwm\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.720531 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-utilities\") pod \"redhat-operators-frwwm\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.720565 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-catalog-content\") pod \"redhat-operators-frwwm\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.743150 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-g82sk\" (UniqueName: \"kubernetes.io/projected/ee3e03e0-b6cb-432c-9553-660658c98683-kube-api-access-g82sk\") pod \"redhat-operators-frwwm\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:13 crc kubenswrapper[4743]: I1122 10:54:13.793731 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:14 crc kubenswrapper[4743]: I1122 10:54:14.336526 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-frwwm"] Nov 22 10:54:14 crc kubenswrapper[4743]: W1122 10:54:14.345783 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee3e03e0_b6cb_432c_9553_660658c98683.slice/crio-91cecc9c8ec16c99df7248e6f08cf5b1e55738929ca27fcd0fdbfe24a0950283 WatchSource:0}: Error finding container 91cecc9c8ec16c99df7248e6f08cf5b1e55738929ca27fcd0fdbfe24a0950283: Status 404 returned error can't find the container with id 91cecc9c8ec16c99df7248e6f08cf5b1e55738929ca27fcd0fdbfe24a0950283 Nov 22 10:54:14 crc kubenswrapper[4743]: I1122 10:54:14.826060 4743 generic.go:334] "Generic (PLEG): container finished" podID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerID="e22e24a95d6b0fb8d9f07bb08a02ec917c32cb9b4344ad0be36dbc5f1ffd9c9e" exitCode=0 Nov 22 10:54:14 crc kubenswrapper[4743]: I1122 10:54:14.826150 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4lgh" event={"ID":"058c4e46-0368-4ecb-a33f-6bf41a2a4d48","Type":"ContainerDied","Data":"e22e24a95d6b0fb8d9f07bb08a02ec917c32cb9b4344ad0be36dbc5f1ffd9c9e"} Nov 22 10:54:14 crc kubenswrapper[4743]: I1122 10:54:14.830549 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frwwm" event={"ID":"ee3e03e0-b6cb-432c-9553-660658c98683","Type":"ContainerDied","Data":"81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906"} Nov 22 10:54:14 crc kubenswrapper[4743]: I1122 10:54:14.830673 4743 generic.go:334] "Generic (PLEG): container finished" podID="ee3e03e0-b6cb-432c-9553-660658c98683" containerID="81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906" exitCode=0 Nov 22 10:54:14 crc kubenswrapper[4743]: I1122 10:54:14.830717 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frwwm" event={"ID":"ee3e03e0-b6cb-432c-9553-660658c98683","Type":"ContainerStarted","Data":"91cecc9c8ec16c99df7248e6f08cf5b1e55738929ca27fcd0fdbfe24a0950283"} Nov 22 10:54:15 crc kubenswrapper[4743]: I1122 10:54:15.847098 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4lgh" event={"ID":"058c4e46-0368-4ecb-a33f-6bf41a2a4d48","Type":"ContainerStarted","Data":"d9e99eb374aab1f7d5bc09c6dab025e78fab279cedebbd76dca438bcf8eec95e"} Nov 22 10:54:15 crc kubenswrapper[4743]: I1122 10:54:15.854126 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frwwm" event={"ID":"ee3e03e0-b6cb-432c-9553-660658c98683","Type":"ContainerStarted","Data":"726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d"} Nov 22 10:54:15 crc kubenswrapper[4743]: I1122 10:54:15.868011 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j4lgh" podStartSLOduration=2.4427778 podStartE2EDuration="5.867991524s" 
podCreationTimestamp="2025-11-22 10:54:10 +0000 UTC" firstStartedPulling="2025-11-22 10:54:11.793808057 +0000 UTC m=+9125.500169109" lastFinishedPulling="2025-11-22 10:54:15.219021781 +0000 UTC m=+9128.925382833" observedRunningTime="2025-11-22 10:54:15.865907814 +0000 UTC m=+9129.572268866" watchObservedRunningTime="2025-11-22 10:54:15.867991524 +0000 UTC m=+9129.574352576" Nov 22 10:54:20 crc kubenswrapper[4743]: I1122 10:54:20.802625 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:20 crc kubenswrapper[4743]: I1122 10:54:20.803224 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:21 crc kubenswrapper[4743]: I1122 10:54:21.852862 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-j4lgh" podUID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerName="registry-server" probeResult="failure" output=< Nov 22 10:54:21 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 10:54:21 crc kubenswrapper[4743]: > Nov 22 10:54:22 crc kubenswrapper[4743]: I1122 10:54:22.922998 4743 generic.go:334] "Generic (PLEG): container finished" podID="ee3e03e0-b6cb-432c-9553-660658c98683" containerID="726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d" exitCode=0 Nov 22 10:54:22 crc kubenswrapper[4743]: I1122 10:54:22.923077 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frwwm" event={"ID":"ee3e03e0-b6cb-432c-9553-660658c98683","Type":"ContainerDied","Data":"726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d"} Nov 22 10:54:23 crc kubenswrapper[4743]: I1122 10:54:23.940827 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frwwm" event={"ID":"ee3e03e0-b6cb-432c-9553-660658c98683","Type":"ContainerStarted","Data":"aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84"} Nov 22 10:54:23 crc kubenswrapper[4743]: I1122 10:54:23.963870 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-frwwm" podStartSLOduration=2.423198284 podStartE2EDuration="10.963846904s" podCreationTimestamp="2025-11-22 10:54:13 +0000 UTC" firstStartedPulling="2025-11-22 10:54:14.8332305 +0000 UTC m=+9128.539591552" lastFinishedPulling="2025-11-22 10:54:23.37387912 +0000 UTC m=+9137.080240172" observedRunningTime="2025-11-22 10:54:23.959597722 +0000 UTC m=+9137.665958774" watchObservedRunningTime="2025-11-22 10:54:23.963846904 +0000 UTC m=+9137.670207956" Nov 22 10:54:30 crc kubenswrapper[4743]: I1122 10:54:30.856350 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:30 crc kubenswrapper[4743]: I1122 10:54:30.920052 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:31 crc kubenswrapper[4743]: I1122 10:54:31.095099 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j4lgh"] Nov 22 10:54:32 crc kubenswrapper[4743]: I1122 10:54:32.044874 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j4lgh" podUID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerName="registry-server" 
containerID="cri-o://d9e99eb374aab1f7d5bc09c6dab025e78fab279cedebbd76dca438bcf8eec95e" gracePeriod=2 Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.061267 4743 generic.go:334] "Generic (PLEG): container finished" podID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerID="d9e99eb374aab1f7d5bc09c6dab025e78fab279cedebbd76dca438bcf8eec95e" exitCode=0 Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.061678 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4lgh" event={"ID":"058c4e46-0368-4ecb-a33f-6bf41a2a4d48","Type":"ContainerDied","Data":"d9e99eb374aab1f7d5bc09c6dab025e78fab279cedebbd76dca438bcf8eec95e"} Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.330671 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.481254 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtgmt\" (UniqueName: \"kubernetes.io/projected/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-kube-api-access-xtgmt\") pod \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.481440 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-utilities\") pod \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.481476 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-catalog-content\") pod \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\" (UID: \"058c4e46-0368-4ecb-a33f-6bf41a2a4d48\") " Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.482265 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-utilities" (OuterVolumeSpecName: "utilities") pod "058c4e46-0368-4ecb-a33f-6bf41a2a4d48" (UID: "058c4e46-0368-4ecb-a33f-6bf41a2a4d48"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.488277 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-kube-api-access-xtgmt" (OuterVolumeSpecName: "kube-api-access-xtgmt") pod "058c4e46-0368-4ecb-a33f-6bf41a2a4d48" (UID: "058c4e46-0368-4ecb-a33f-6bf41a2a4d48"). InnerVolumeSpecName "kube-api-access-xtgmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.534251 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "058c4e46-0368-4ecb-a33f-6bf41a2a4d48" (UID: "058c4e46-0368-4ecb-a33f-6bf41a2a4d48"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.584876 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.585136 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.585205 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtgmt\" (UniqueName: \"kubernetes.io/projected/058c4e46-0368-4ecb-a33f-6bf41a2a4d48-kube-api-access-xtgmt\") on node \"crc\" DevicePath \"\"" Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.794336 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:33 crc kubenswrapper[4743]: I1122 10:54:33.794391 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:54:34 crc kubenswrapper[4743]: I1122 10:54:34.073705 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j4lgh" event={"ID":"058c4e46-0368-4ecb-a33f-6bf41a2a4d48","Type":"ContainerDied","Data":"14481d18b350d5088cd0bda885594f02e8c17d46aaf6d1ca2fed67e39e160fdf"} Nov 22 10:54:34 crc kubenswrapper[4743]: I1122 10:54:34.074038 4743 scope.go:117] "RemoveContainer" containerID="d9e99eb374aab1f7d5bc09c6dab025e78fab279cedebbd76dca438bcf8eec95e" Nov 22 10:54:34 crc kubenswrapper[4743]: I1122 10:54:34.073821 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j4lgh" Nov 22 10:54:34 crc kubenswrapper[4743]: I1122 10:54:34.103710 4743 scope.go:117] "RemoveContainer" containerID="e22e24a95d6b0fb8d9f07bb08a02ec917c32cb9b4344ad0be36dbc5f1ffd9c9e" Nov 22 10:54:34 crc kubenswrapper[4743]: I1122 10:54:34.114246 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j4lgh"] Nov 22 10:54:34 crc kubenswrapper[4743]: I1122 10:54:34.123493 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j4lgh"] Nov 22 10:54:34 crc kubenswrapper[4743]: I1122 10:54:34.551695 4743 scope.go:117] "RemoveContainer" containerID="b112ded39545a8e64196dc8b722f1f6984f8a8b37a72987dbc722be9f6d5263c" Nov 22 10:54:34 crc kubenswrapper[4743]: I1122 10:54:34.839701 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-frwwm" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="registry-server" probeResult="failure" output=< Nov 22 10:54:34 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 10:54:34 crc kubenswrapper[4743]: > Nov 22 10:54:35 crc kubenswrapper[4743]: I1122 10:54:35.163216 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" path="/var/lib/kubelet/pods/058c4e46-0368-4ecb-a33f-6bf41a2a4d48/volumes" Nov 22 10:54:44 crc kubenswrapper[4743]: I1122 10:54:44.857072 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-frwwm" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="registry-server" probeResult="failure" output=< Nov 22 10:54:44 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 10:54:44 crc kubenswrapper[4743]: > Nov 22 10:54:54 crc kubenswrapper[4743]: I1122 10:54:54.845472 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-frwwm" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="registry-server" probeResult="failure" output=< Nov 22 10:54:54 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s Nov 22 10:54:54 crc kubenswrapper[4743]: > Nov 22 10:55:01 crc kubenswrapper[4743]: I1122 10:55:01.241243 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:55:01 crc kubenswrapper[4743]: I1122 10:55:01.242022 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:55:04 crc kubenswrapper[4743]: I1122 10:55:04.290241 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:55:04 crc kubenswrapper[4743]: I1122 10:55:04.362734 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:55:04 crc kubenswrapper[4743]: I1122 10:55:04.535035 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-frwwm"] Nov 22 10:55:05 crc kubenswrapper[4743]: I1122 10:55:05.374266 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-frwwm" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="registry-server" containerID="cri-o://aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84" gracePeriod=2 Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.248213 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.338750 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g82sk\" (UniqueName: \"kubernetes.io/projected/ee3e03e0-b6cb-432c-9553-660658c98683-kube-api-access-g82sk\") pod \"ee3e03e0-b6cb-432c-9553-660658c98683\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.338827 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-utilities\") pod \"ee3e03e0-b6cb-432c-9553-660658c98683\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.338975 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-catalog-content\") pod \"ee3e03e0-b6cb-432c-9553-660658c98683\" (UID: \"ee3e03e0-b6cb-432c-9553-660658c98683\") " Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.340161 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-utilities" (OuterVolumeSpecName: "utilities") pod "ee3e03e0-b6cb-432c-9553-660658c98683" (UID: "ee3e03e0-b6cb-432c-9553-660658c98683"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.347877 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee3e03e0-b6cb-432c-9553-660658c98683-kube-api-access-g82sk" (OuterVolumeSpecName: "kube-api-access-g82sk") pod "ee3e03e0-b6cb-432c-9553-660658c98683" (UID: "ee3e03e0-b6cb-432c-9553-660658c98683"). InnerVolumeSpecName "kube-api-access-g82sk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.396321 4743 generic.go:334] "Generic (PLEG): container finished" podID="ee3e03e0-b6cb-432c-9553-660658c98683" containerID="aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84" exitCode=0 Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.396376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frwwm" event={"ID":"ee3e03e0-b6cb-432c-9553-660658c98683","Type":"ContainerDied","Data":"aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84"} Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.396409 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-frwwm" event={"ID":"ee3e03e0-b6cb-432c-9553-660658c98683","Type":"ContainerDied","Data":"91cecc9c8ec16c99df7248e6f08cf5b1e55738929ca27fcd0fdbfe24a0950283"} Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.396433 4743 scope.go:117] "RemoveContainer" containerID="aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.396843 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-frwwm" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.437049 4743 scope.go:117] "RemoveContainer" containerID="726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.443830 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g82sk\" (UniqueName: \"kubernetes.io/projected/ee3e03e0-b6cb-432c-9553-660658c98683-kube-api-access-g82sk\") on node \"crc\" DevicePath \"\"" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.443869 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.447901 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee3e03e0-b6cb-432c-9553-660658c98683" (UID: "ee3e03e0-b6cb-432c-9553-660658c98683"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.469154 4743 scope.go:117] "RemoveContainer" containerID="81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.520142 4743 scope.go:117] "RemoveContainer" containerID="aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84" Nov 22 10:55:06 crc kubenswrapper[4743]: E1122 10:55:06.521070 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84\": container with ID starting with aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84 not found: ID does not exist" containerID="aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.521129 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84"} err="failed to get container status \"aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84\": rpc error: code = NotFound desc = could not find container \"aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84\": container with ID starting with aa44e8b67ba850bbc5b81a3a09fdf0f53acd2fc4e15ac2e9c5d6f4fe04df4e84 not found: ID does not exist" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.521158 4743 scope.go:117] "RemoveContainer" containerID="726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d" Nov 22 10:55:06 crc kubenswrapper[4743]: E1122 10:55:06.521761 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d\": container with ID starting with 726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d not found: ID does not exist" containerID="726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.521793 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d"} err="failed to get container status \"726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d\": rpc error: code = NotFound desc = could not find container \"726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d\": container with ID starting with 726a346a52d25529d4a19233884af5808e23e6370cc76a1d469b01ae78078a7d not found: ID does not exist" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.521817 4743 scope.go:117] "RemoveContainer" containerID="81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906" Nov 22 10:55:06 crc kubenswrapper[4743]: E1122 10:55:06.522232 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906\": container with ID starting with 81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906 not found: ID does not exist" containerID="81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.522254 4743 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906"} err="failed to get container status \"81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906\": rpc error: code = NotFound desc = could not find container \"81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906\": container with ID starting with 81ef33a7c0e44a22de6f66c2a0978dc6665e02b4afb3a27c00063961ffb38906 not found: ID does not exist" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.546265 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee3e03e0-b6cb-432c-9553-660658c98683-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.738653 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-frwwm"] Nov 22 10:55:06 crc kubenswrapper[4743]: I1122 10:55:06.749130 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-frwwm"] Nov 22 10:55:07 crc kubenswrapper[4743]: I1122 10:55:07.165355 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" path="/var/lib/kubelet/pods/ee3e03e0-b6cb-432c-9553-660658c98683/volumes" Nov 22 10:55:31 crc kubenswrapper[4743]: I1122 10:55:31.241935 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:55:31 crc kubenswrapper[4743]: I1122 10:55:31.242518 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:56:01 crc kubenswrapper[4743]: I1122 10:56:01.241616 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 10:56:01 crc kubenswrapper[4743]: I1122 10:56:01.242207 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 10:56:01 crc kubenswrapper[4743]: I1122 10:56:01.242263 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" Nov 22 10:56:01 crc kubenswrapper[4743]: I1122 10:56:01.243179 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Nov 22 10:56:01 crc kubenswrapper[4743]: I1122 10:56:01.243244 4743 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" gracePeriod=600 Nov 22 10:56:01 crc kubenswrapper[4743]: E1122 10:56:01.683886 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:56:01 crc kubenswrapper[4743]: I1122 10:56:01.947116 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" exitCode=0 Nov 22 10:56:01 crc kubenswrapper[4743]: I1122 10:56:01.947196 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63"} Nov 22 10:56:01 crc kubenswrapper[4743]: I1122 10:56:01.947292 4743 scope.go:117] "RemoveContainer" containerID="82d7fb2ec1629cbdcf99bab7b5eb0f8726527d7c49663a3d80c02d05249f64d5" Nov 22 10:56:01 crc kubenswrapper[4743]: I1122 10:56:01.948897 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:56:01 crc kubenswrapper[4743]: E1122 10:56:01.949770 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:56:13 crc kubenswrapper[4743]: I1122 10:56:13.152802 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:56:13 crc kubenswrapper[4743]: E1122 10:56:13.154960 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:56:26 crc kubenswrapper[4743]: I1122 10:56:26.152480 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:56:26 crc kubenswrapper[4743]: E1122 10:56:26.153466 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:56:37 crc kubenswrapper[4743]: I1122 10:56:37.151946 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:56:37 crc kubenswrapper[4743]: E1122 10:56:37.152710 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:56:49 crc kubenswrapper[4743]: I1122 10:56:49.152248 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:56:49 crc kubenswrapper[4743]: E1122 10:56:49.153107 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:57:01 crc kubenswrapper[4743]: I1122 10:57:01.152264 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:57:01 crc kubenswrapper[4743]: E1122 10:57:01.153275 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:57:14 crc kubenswrapper[4743]: I1122 10:57:14.152023 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:57:14 crc kubenswrapper[4743]: E1122 10:57:14.153071 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:57:27 crc kubenswrapper[4743]: I1122 10:57:27.167755 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:57:27 crc kubenswrapper[4743]: E1122 10:57:27.169176 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:57:39 crc kubenswrapper[4743]: I1122 10:57:39.151491 4743 
scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:57:39 crc kubenswrapper[4743]: E1122 10:57:39.152370 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:57:50 crc kubenswrapper[4743]: I1122 10:57:50.152286 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:57:50 crc kubenswrapper[4743]: E1122 10:57:50.153100 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:58:03 crc kubenswrapper[4743]: I1122 10:58:03.152315 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:58:03 crc kubenswrapper[4743]: E1122 10:58:03.153251 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:58:16 crc kubenswrapper[4743]: I1122 10:58:16.151793 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:58:16 crc kubenswrapper[4743]: E1122 10:58:16.152729 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:58:29 crc kubenswrapper[4743]: I1122 10:58:29.151821 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:58:29 crc kubenswrapper[4743]: E1122 10:58:29.152690 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:58:41 crc kubenswrapper[4743]: I1122 10:58:41.334815 4743 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4" podUID="f5f27cf7-eaa5-4b71-84a6-94fac3920d39" 
containerName="manager" probeResult="failure" output="Get \"http://10.217.0.90:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Nov 22 10:58:43 crc kubenswrapper[4743]: I1122 10:58:43.152339 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:58:43 crc kubenswrapper[4743]: E1122 10:58:43.152951 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:58:57 crc kubenswrapper[4743]: I1122 10:58:57.160182 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:58:57 crc kubenswrapper[4743]: E1122 10:58:57.161060 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:59:10 crc kubenswrapper[4743]: I1122 10:59:10.151369 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:59:10 crc kubenswrapper[4743]: E1122 10:59:10.152260 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:59:21 crc kubenswrapper[4743]: I1122 10:59:21.152534 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:59:21 crc kubenswrapper[4743]: E1122 10:59:21.153510 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:59:36 crc kubenswrapper[4743]: I1122 10:59:36.151398 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:59:36 crc kubenswrapper[4743]: E1122 10:59:36.152261 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 10:59:49 crc 
kubenswrapper[4743]: I1122 10:59:49.152175 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 10:59:49 crc kubenswrapper[4743]: E1122 10:59:49.152982 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.157607 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k"] Nov 22 11:00:00 crc kubenswrapper[4743]: E1122 11:00:00.158764 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="extract-utilities" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.158780 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="extract-utilities" Nov 22 11:00:00 crc kubenswrapper[4743]: E1122 11:00:00.158806 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="registry-server" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.158814 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="registry-server" Nov 22 11:00:00 crc kubenswrapper[4743]: E1122 11:00:00.158831 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerName="extract-content" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.158837 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerName="extract-content" Nov 22 11:00:00 crc kubenswrapper[4743]: E1122 11:00:00.158871 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerName="extract-utilities" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.158877 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerName="extract-utilities" Nov 22 11:00:00 crc kubenswrapper[4743]: E1122 11:00:00.158887 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="extract-content" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.158892 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="extract-content" Nov 22 11:00:00 crc kubenswrapper[4743]: E1122 11:00:00.158908 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerName="registry-server" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.158914 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerName="registry-server" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.159129 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee3e03e0-b6cb-432c-9553-660658c98683" containerName="registry-server" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.159190 4743 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="058c4e46-0368-4ecb-a33f-6bf41a2a4d48" containerName="registry-server" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.160120 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.161969 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.162043 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.186261 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k"] Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.271539 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-config-volume\") pod \"collect-profiles-29396820-gtm2k\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.271844 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q7d4\" (UniqueName: \"kubernetes.io/projected/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-kube-api-access-7q7d4\") pod \"collect-profiles-29396820-gtm2k\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.272290 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-secret-volume\") pod \"collect-profiles-29396820-gtm2k\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.373978 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q7d4\" (UniqueName: \"kubernetes.io/projected/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-kube-api-access-7q7d4\") pod \"collect-profiles-29396820-gtm2k\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.374068 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-secret-volume\") pod \"collect-profiles-29396820-gtm2k\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.374125 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-config-volume\") pod \"collect-profiles-29396820-gtm2k\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 
11:00:00.375703 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-config-volume\") pod \"collect-profiles-29396820-gtm2k\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.382198 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-secret-volume\") pod \"collect-profiles-29396820-gtm2k\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.390717 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q7d4\" (UniqueName: \"kubernetes.io/projected/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-kube-api-access-7q7d4\") pod \"collect-profiles-29396820-gtm2k\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.489504 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:00 crc kubenswrapper[4743]: I1122 11:00:00.950847 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k"] Nov 22 11:00:01 crc kubenswrapper[4743]: I1122 11:00:01.398601 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" event={"ID":"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6","Type":"ContainerStarted","Data":"c663a3b2923baf6626a441e9e233f91f04e333990f138a6ae2d5f1ed9ba50e21"} Nov 22 11:00:01 crc kubenswrapper[4743]: I1122 11:00:01.398659 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" event={"ID":"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6","Type":"ContainerStarted","Data":"6edd85efe632a922fd3b72183ce82630d8b5a69cfea020388c3df35dafedb7e2"} Nov 22 11:00:02 crc kubenswrapper[4743]: I1122 11:00:02.410144 4743 generic.go:334] "Generic (PLEG): container finished" podID="b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6" containerID="c663a3b2923baf6626a441e9e233f91f04e333990f138a6ae2d5f1ed9ba50e21" exitCode=0 Nov 22 11:00:02 crc kubenswrapper[4743]: I1122 11:00:02.410204 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" event={"ID":"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6","Type":"ContainerDied","Data":"c663a3b2923baf6626a441e9e233f91f04e333990f138a6ae2d5f1ed9ba50e21"} Nov 22 11:00:02 crc kubenswrapper[4743]: I1122 11:00:02.947508 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.136200 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q7d4\" (UniqueName: \"kubernetes.io/projected/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-kube-api-access-7q7d4\") pod \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.136240 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-secret-volume\") pod \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.136558 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-config-volume\") pod \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\" (UID: \"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6\") " Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.137280 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-config-volume" (OuterVolumeSpecName: "config-volume") pod "b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6" (UID: "b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.142150 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6" (UID: "b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.142297 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-kube-api-access-7q7d4" (OuterVolumeSpecName: "kube-api-access-7q7d4") pod "b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6" (UID: "b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6"). InnerVolumeSpecName "kube-api-access-7q7d4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.153383 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 11:00:03 crc kubenswrapper[4743]: E1122 11:00:03.153906 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.239770 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q7d4\" (UniqueName: \"kubernetes.io/projected/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-kube-api-access-7q7d4\") on node \"crc\" DevicePath \"\"" Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.239814 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-secret-volume\") on node \"crc\" DevicePath \"\"" Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.239849 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6-config-volume\") on node \"crc\" DevicePath \"\"" Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.425548 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" event={"ID":"b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6","Type":"ContainerDied","Data":"6edd85efe632a922fd3b72183ce82630d8b5a69cfea020388c3df35dafedb7e2"} Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.425908 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6edd85efe632a922fd3b72183ce82630d8b5a69cfea020388c3df35dafedb7e2" Nov 22 11:00:03 crc kubenswrapper[4743]: I1122 11:00:03.425665 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396820-gtm2k" Nov 22 11:00:04 crc kubenswrapper[4743]: I1122 11:00:04.022449 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"] Nov 22 11:00:04 crc kubenswrapper[4743]: I1122 11:00:04.034269 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396775-5tknf"] Nov 22 11:00:05 crc kubenswrapper[4743]: I1122 11:00:05.166043 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="096d6397-d699-489c-91ec-371da2bfc7d6" path="/var/lib/kubelet/pods/096d6397-d699-489c-91ec-371da2bfc7d6/volumes" Nov 22 11:00:17 crc kubenswrapper[4743]: I1122 11:00:17.160022 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 11:00:17 crc kubenswrapper[4743]: E1122 11:00:17.161039 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:00:28 crc kubenswrapper[4743]: I1122 11:00:28.152107 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 11:00:28 crc kubenswrapper[4743]: E1122 11:00:28.152927 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:00:39 crc kubenswrapper[4743]: I1122 11:00:39.461893 4743 scope.go:117] "RemoveContainer" containerID="81f0d092ff00bf71679e9441f288cdad51888cc7d2485624aa3145a121358e2b" Nov 22 11:00:43 crc kubenswrapper[4743]: I1122 11:00:43.152147 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 11:00:43 crc kubenswrapper[4743]: E1122 11:00:43.153145 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:00:54 crc kubenswrapper[4743]: I1122 11:00:54.161333 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 11:00:54 crc kubenswrapper[4743]: E1122 11:00:54.162653 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.176631 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29396821-kzkw8"] Nov 22 11:01:00 crc kubenswrapper[4743]: E1122 11:01:00.178594 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6" containerName="collect-profiles" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.178620 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6" containerName="collect-profiles" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.179026 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="b93113b0-cd1a-4d4c-b715-f2dfc3fc2bb6" containerName="collect-profiles" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.180421 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.189335 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29396821-kzkw8"] Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.262911 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-fernet-keys\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.263205 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-combined-ca-bundle\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.263326 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-config-data\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.263356 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxlmj\" (UniqueName: \"kubernetes.io/projected/c8e48404-7c25-4096-8ea8-7e1036cca403-kube-api-access-cxlmj\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.364853 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-combined-ca-bundle\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.364945 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-config-data\") pod \"keystone-cron-29396821-kzkw8\" (UID: 
\"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.364970 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxlmj\" (UniqueName: \"kubernetes.io/projected/c8e48404-7c25-4096-8ea8-7e1036cca403-kube-api-access-cxlmj\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.365020 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-fernet-keys\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.372450 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-fernet-keys\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.372519 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-combined-ca-bundle\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.374488 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-config-data\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.384115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxlmj\" (UniqueName: \"kubernetes.io/projected/c8e48404-7c25-4096-8ea8-7e1036cca403-kube-api-access-cxlmj\") pod \"keystone-cron-29396821-kzkw8\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:00 crc kubenswrapper[4743]: I1122 11:01:00.513987 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:01 crc kubenswrapper[4743]: I1122 11:01:01.045016 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29396821-kzkw8"] Nov 22 11:01:02 crc kubenswrapper[4743]: I1122 11:01:02.032455 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396821-kzkw8" event={"ID":"c8e48404-7c25-4096-8ea8-7e1036cca403","Type":"ContainerStarted","Data":"c75c01a0b998252e8820e6831e39c34e3f7099bd986eb8d8eaa007058db7b41b"} Nov 22 11:01:02 crc kubenswrapper[4743]: I1122 11:01:02.034679 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396821-kzkw8" event={"ID":"c8e48404-7c25-4096-8ea8-7e1036cca403","Type":"ContainerStarted","Data":"53f3d7c3ff4b74f707bae355fb1e37230af6216b59162b5258adc969c4431cd4"} Nov 22 11:01:02 crc kubenswrapper[4743]: I1122 11:01:02.060444 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29396821-kzkw8" podStartSLOduration=2.060407615 podStartE2EDuration="2.060407615s" podCreationTimestamp="2025-11-22 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-11-22 11:01:02.048496104 +0000 UTC m=+9535.754857156" watchObservedRunningTime="2025-11-22 11:01:02.060407615 +0000 UTC m=+9535.766768667" Nov 22 11:01:06 crc kubenswrapper[4743]: I1122 11:01:06.151844 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63" Nov 22 11:01:07 crc kubenswrapper[4743]: I1122 11:01:07.112832 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"f7db4407799e7db0c973d757e817035cf90ac74b5ed6a44d7d7ae6de2f172b71"} Nov 22 11:01:07 crc kubenswrapper[4743]: I1122 11:01:07.116142 4743 generic.go:334] "Generic (PLEG): container finished" podID="c8e48404-7c25-4096-8ea8-7e1036cca403" containerID="c75c01a0b998252e8820e6831e39c34e3f7099bd986eb8d8eaa007058db7b41b" exitCode=0 Nov 22 11:01:07 crc kubenswrapper[4743]: I1122 11:01:07.116174 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396821-kzkw8" event={"ID":"c8e48404-7c25-4096-8ea8-7e1036cca403","Type":"ContainerDied","Data":"c75c01a0b998252e8820e6831e39c34e3f7099bd986eb8d8eaa007058db7b41b"} Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.559439 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.604196 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-combined-ca-bundle\") pod \"c8e48404-7c25-4096-8ea8-7e1036cca403\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.604349 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-config-data\") pod \"c8e48404-7c25-4096-8ea8-7e1036cca403\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.604421 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxlmj\" (UniqueName: \"kubernetes.io/projected/c8e48404-7c25-4096-8ea8-7e1036cca403-kube-api-access-cxlmj\") pod \"c8e48404-7c25-4096-8ea8-7e1036cca403\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.604722 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-fernet-keys\") pod \"c8e48404-7c25-4096-8ea8-7e1036cca403\" (UID: \"c8e48404-7c25-4096-8ea8-7e1036cca403\") " Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.623724 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c8e48404-7c25-4096-8ea8-7e1036cca403" (UID: "c8e48404-7c25-4096-8ea8-7e1036cca403"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.629810 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e48404-7c25-4096-8ea8-7e1036cca403-kube-api-access-cxlmj" (OuterVolumeSpecName: "kube-api-access-cxlmj") pod "c8e48404-7c25-4096-8ea8-7e1036cca403" (UID: "c8e48404-7c25-4096-8ea8-7e1036cca403"). InnerVolumeSpecName "kube-api-access-cxlmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.690098 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8e48404-7c25-4096-8ea8-7e1036cca403" (UID: "c8e48404-7c25-4096-8ea8-7e1036cca403"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.707303 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-config-data" (OuterVolumeSpecName: "config-data") pod "c8e48404-7c25-4096-8ea8-7e1036cca403" (UID: "c8e48404-7c25-4096-8ea8-7e1036cca403"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.709147 4743 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-fernet-keys\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.709191 4743 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.709207 4743 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e48404-7c25-4096-8ea8-7e1036cca403-config-data\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:08 crc kubenswrapper[4743]: I1122 11:01:08.709221 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxlmj\" (UniqueName: \"kubernetes.io/projected/c8e48404-7c25-4096-8ea8-7e1036cca403-kube-api-access-cxlmj\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:09 crc kubenswrapper[4743]: I1122 11:01:09.144220 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29396821-kzkw8" event={"ID":"c8e48404-7c25-4096-8ea8-7e1036cca403","Type":"ContainerDied","Data":"53f3d7c3ff4b74f707bae355fb1e37230af6216b59162b5258adc969c4431cd4"} Nov 22 11:01:09 crc kubenswrapper[4743]: I1122 11:01:09.144574 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53f3d7c3ff4b74f707bae355fb1e37230af6216b59162b5258adc969c4431cd4" Nov 22 11:01:09 crc kubenswrapper[4743]: I1122 11:01:09.144604 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29396821-kzkw8" Nov 22 11:01:17 crc kubenswrapper[4743]: I1122 11:01:17.251649 4743 generic.go:334] "Generic (PLEG): container finished" podID="aece6c0f-51a9-4480-8b13-0da51fca1fc8" containerID="ed29bd136cfc4a8a77c7465c55c835b79e41265f9525bb626fc991a677f0125f" exitCode=0 Nov 22 11:01:17 crc kubenswrapper[4743]: I1122 11:01:17.251754 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" event={"ID":"aece6c0f-51a9-4480-8b13-0da51fca1fc8","Type":"ContainerDied","Data":"ed29bd136cfc4a8a77c7465c55c835b79e41265f9525bb626fc991a677f0125f"} Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.743764 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.878525 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-combined-ca-bundle\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.878996 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-1\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.879062 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-1\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.879120 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmvjw\" (UniqueName: \"kubernetes.io/projected/aece6c0f-51a9-4480-8b13-0da51fca1fc8-kube-api-access-kmvjw\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.879168 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-0\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.879208 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ssh-key\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.879266 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-inventory\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.879288 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ceph\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.879326 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-0\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.879371 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-1\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:18 crc kubenswrapper[4743]: I1122 11:01:18.879431 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-0\") pod \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\" (UID: \"aece6c0f-51a9-4480-8b13-0da51fca1fc8\") " Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.277938 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" event={"ID":"aece6c0f-51a9-4480-8b13-0da51fca1fc8","Type":"ContainerDied","Data":"bfc692193cc20f7e925b52bab64bab25a23364eac9c6a0f55befc635dd28ac56"} Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.278029 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfc692193cc20f7e925b52bab64bab25a23364eac9c6a0f55befc635dd28ac56" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.278044 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.435676 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.435710 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ceph" (OuterVolumeSpecName: "ceph") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.435736 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aece6c0f-51a9-4480-8b13-0da51fca1fc8-kube-api-access-kmvjw" (OuterVolumeSpecName: "kube-api-access-kmvjw") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "kube-api-access-kmvjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.478342 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.500870 4743 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ceph\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.500908 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.500920 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.500930 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmvjw\" (UniqueName: \"kubernetes.io/projected/aece6c0f-51a9-4480-8b13-0da51fca1fc8-kube-api-access-kmvjw\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.565105 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.566815 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.567312 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.573697 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.584915 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-inventory" (OuterVolumeSpecName: "inventory") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.590433 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.598532 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "aece6c0f-51a9-4480-8b13-0da51fca1fc8" (UID: "aece6c0f-51a9-4480-8b13-0da51fca1fc8"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.605769 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.605827 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.605844 4743 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-ssh-key\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.605861 4743 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-inventory\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.605875 4743 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.605888 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Nov 22 11:01:19 crc kubenswrapper[4743]: I1122 11:01:19.605905 4743 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aece6c0f-51a9-4480-8b13-0da51fca1fc8-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.550264 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-495mz"] Nov 22 11:02:58 crc kubenswrapper[4743]: E1122 11:02:58.551465 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aece6c0f-51a9-4480-8b13-0da51fca1fc8" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.551485 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="aece6c0f-51a9-4480-8b13-0da51fca1fc8" 
containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 22 11:02:58 crc kubenswrapper[4743]: E1122 11:02:58.551561 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8e48404-7c25-4096-8ea8-7e1036cca403" containerName="keystone-cron" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.551571 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e48404-7c25-4096-8ea8-7e1036cca403" containerName="keystone-cron" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.551853 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="aece6c0f-51a9-4480-8b13-0da51fca1fc8" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.551878 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8e48404-7c25-4096-8ea8-7e1036cca403" containerName="keystone-cron" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.553984 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.561164 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-495mz"] Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.747714 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-catalog-content\") pod \"redhat-marketplace-495mz\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.748107 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-utilities\") pod \"redhat-marketplace-495mz\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.748317 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4c2h\" (UniqueName: \"kubernetes.io/projected/adbe615e-5f71-4ec5-b3e3-9187b8009f75-kube-api-access-v4c2h\") pod \"redhat-marketplace-495mz\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.851323 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4c2h\" (UniqueName: \"kubernetes.io/projected/adbe615e-5f71-4ec5-b3e3-9187b8009f75-kube-api-access-v4c2h\") pod \"redhat-marketplace-495mz\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.851631 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-catalog-content\") pod \"redhat-marketplace-495mz\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.851676 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-utilities\") pod \"redhat-marketplace-495mz\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.852309 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-utilities\") pod \"redhat-marketplace-495mz\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.852458 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-catalog-content\") pod \"redhat-marketplace-495mz\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.870401 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4c2h\" (UniqueName: \"kubernetes.io/projected/adbe615e-5f71-4ec5-b3e3-9187b8009f75-kube-api-access-v4c2h\") pod \"redhat-marketplace-495mz\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:58 crc kubenswrapper[4743]: I1122 11:02:58.877709 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:02:59 crc kubenswrapper[4743]: I1122 11:02:59.425480 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-495mz"] Nov 22 11:02:59 crc kubenswrapper[4743]: I1122 11:02:59.475260 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-495mz" event={"ID":"adbe615e-5f71-4ec5-b3e3-9187b8009f75","Type":"ContainerStarted","Data":"21fd4607b93f12fe1e638dd267e54c3c930cb130c00d6f3ed8a5a178913ae508"} Nov 22 11:03:00 crc kubenswrapper[4743]: I1122 11:03:00.487732 4743 generic.go:334] "Generic (PLEG): container finished" podID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerID="5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b" exitCode=0 Nov 22 11:03:00 crc kubenswrapper[4743]: I1122 11:03:00.487830 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-495mz" event={"ID":"adbe615e-5f71-4ec5-b3e3-9187b8009f75","Type":"ContainerDied","Data":"5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b"} Nov 22 11:03:00 crc kubenswrapper[4743]: I1122 11:03:00.490718 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 11:03:01 crc kubenswrapper[4743]: I1122 11:03:01.503607 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-495mz" event={"ID":"adbe615e-5f71-4ec5-b3e3-9187b8009f75","Type":"ContainerStarted","Data":"a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23"} Nov 22 11:03:02 crc kubenswrapper[4743]: I1122 11:03:02.515501 4743 generic.go:334] "Generic (PLEG): container finished" podID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerID="a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23" exitCode=0 Nov 22 11:03:02 crc kubenswrapper[4743]: I1122 11:03:02.516073 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-495mz" 
event={"ID":"adbe615e-5f71-4ec5-b3e3-9187b8009f75","Type":"ContainerDied","Data":"a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23"} Nov 22 11:03:03 crc kubenswrapper[4743]: I1122 11:03:03.531378 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-495mz" event={"ID":"adbe615e-5f71-4ec5-b3e3-9187b8009f75","Type":"ContainerStarted","Data":"6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca"} Nov 22 11:03:03 crc kubenswrapper[4743]: I1122 11:03:03.554187 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-495mz" podStartSLOduration=3.149336933 podStartE2EDuration="5.554165731s" podCreationTimestamp="2025-11-22 11:02:58 +0000 UTC" firstStartedPulling="2025-11-22 11:03:00.490444367 +0000 UTC m=+9654.196805419" lastFinishedPulling="2025-11-22 11:03:02.895273165 +0000 UTC m=+9656.601634217" observedRunningTime="2025-11-22 11:03:03.552459223 +0000 UTC m=+9657.258820295" watchObservedRunningTime="2025-11-22 11:03:03.554165731 +0000 UTC m=+9657.260526793" Nov 22 11:03:08 crc kubenswrapper[4743]: I1122 11:03:08.878268 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:03:08 crc kubenswrapper[4743]: I1122 11:03:08.878976 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:03:08 crc kubenswrapper[4743]: I1122 11:03:08.941712 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:03:09 crc kubenswrapper[4743]: I1122 11:03:09.637977 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:03:09 crc kubenswrapper[4743]: I1122 11:03:09.683539 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-495mz"] Nov 22 11:03:11 crc kubenswrapper[4743]: I1122 11:03:11.612330 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-495mz" podUID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerName="registry-server" containerID="cri-o://6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca" gracePeriod=2 Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.140274 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.152123 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-catalog-content\") pod \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.152176 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4c2h\" (UniqueName: \"kubernetes.io/projected/adbe615e-5f71-4ec5-b3e3-9187b8009f75-kube-api-access-v4c2h\") pod \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.152249 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-utilities\") pod \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\" (UID: \"adbe615e-5f71-4ec5-b3e3-9187b8009f75\") " Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.153377 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-utilities" (OuterVolumeSpecName: "utilities") pod "adbe615e-5f71-4ec5-b3e3-9187b8009f75" (UID: "adbe615e-5f71-4ec5-b3e3-9187b8009f75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.158940 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.185895 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "adbe615e-5f71-4ec5-b3e3-9187b8009f75" (UID: "adbe615e-5f71-4ec5-b3e3-9187b8009f75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.202914 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adbe615e-5f71-4ec5-b3e3-9187b8009f75-kube-api-access-v4c2h" (OuterVolumeSpecName: "kube-api-access-v4c2h") pod "adbe615e-5f71-4ec5-b3e3-9187b8009f75" (UID: "adbe615e-5f71-4ec5-b3e3-9187b8009f75"). InnerVolumeSpecName "kube-api-access-v4c2h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.261264 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/adbe615e-5f71-4ec5-b3e3-9187b8009f75-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.261549 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4c2h\" (UniqueName: \"kubernetes.io/projected/adbe615e-5f71-4ec5-b3e3-9187b8009f75-kube-api-access-v4c2h\") on node \"crc\" DevicePath \"\"" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.623872 4743 generic.go:334] "Generic (PLEG): container finished" podID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerID="6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca" exitCode=0 Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.623933 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-495mz" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.623925 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-495mz" event={"ID":"adbe615e-5f71-4ec5-b3e3-9187b8009f75","Type":"ContainerDied","Data":"6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca"} Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.624328 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-495mz" event={"ID":"adbe615e-5f71-4ec5-b3e3-9187b8009f75","Type":"ContainerDied","Data":"21fd4607b93f12fe1e638dd267e54c3c930cb130c00d6f3ed8a5a178913ae508"} Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.624353 4743 scope.go:117] "RemoveContainer" containerID="6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.658773 4743 scope.go:117] "RemoveContainer" containerID="a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.671962 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-495mz"] Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.682358 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-495mz"] Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.687559 4743 scope.go:117] "RemoveContainer" containerID="5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.740499 4743 scope.go:117] "RemoveContainer" containerID="6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca" Nov 22 11:03:12 crc kubenswrapper[4743]: E1122 11:03:12.742751 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca\": container with ID starting with 6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca not found: ID does not exist" containerID="6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.742808 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca"} err="failed to get container status 
\"6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca\": rpc error: code = NotFound desc = could not find container \"6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca\": container with ID starting with 6bddb126cf0b227d21832b6d3f094455f859a5ea1b544a67c090428e430c1eca not found: ID does not exist" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.742838 4743 scope.go:117] "RemoveContainer" containerID="a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23" Nov 22 11:03:12 crc kubenswrapper[4743]: E1122 11:03:12.743240 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23\": container with ID starting with a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23 not found: ID does not exist" containerID="a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.743270 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23"} err="failed to get container status \"a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23\": rpc error: code = NotFound desc = could not find container \"a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23\": container with ID starting with a324539888040157a8764387cc01866db932cacc64c2ae2f9f752e24ed7fef23 not found: ID does not exist" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.743286 4743 scope.go:117] "RemoveContainer" containerID="5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b" Nov 22 11:03:12 crc kubenswrapper[4743]: E1122 11:03:12.743628 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b\": container with ID starting with 5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b not found: ID does not exist" containerID="5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b" Nov 22 11:03:12 crc kubenswrapper[4743]: I1122 11:03:12.743659 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b"} err="failed to get container status \"5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b\": rpc error: code = NotFound desc = could not find container \"5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b\": container with ID starting with 5e7b9b4231c381a6a5cdd622fa1c09dbc246e01003593bf6203ae3accf9a850b not found: ID does not exist" Nov 22 11:03:12 crc kubenswrapper[4743]: E1122 11:03:12.874412 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbe615e_5f71_4ec5_b3e3_9187b8009f75.slice/crio-21fd4607b93f12fe1e638dd267e54c3c930cb130c00d6f3ed8a5a178913ae508\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbe615e_5f71_4ec5_b3e3_9187b8009f75.slice\": RecentStats: unable to find data in memory cache]" Nov 22 11:03:13 crc kubenswrapper[4743]: I1122 11:03:13.175803 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" 
path="/var/lib/kubelet/pods/adbe615e-5f71-4ec5-b3e3-9187b8009f75/volumes" Nov 22 11:03:31 crc kubenswrapper[4743]: I1122 11:03:31.241247 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 11:03:31 crc kubenswrapper[4743]: I1122 11:03:31.242002 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 11:03:43 crc kubenswrapper[4743]: I1122 11:03:43.048771 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Nov 22 11:03:43 crc kubenswrapper[4743]: I1122 11:03:43.050005 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="443c2a25-4980-472c-ab82-682e852ee9ba" containerName="adoption" containerID="cri-o://a1d512483f974358dbb81edbdb2fa7f5965b74263ea1b2745b16187ba9db430a" gracePeriod=30 Nov 22 11:04:01 crc kubenswrapper[4743]: I1122 11:04:01.241810 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 11:04:01 crc kubenswrapper[4743]: I1122 11:04:01.242446 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 11:04:13 crc kubenswrapper[4743]: I1122 11:04:13.302694 4743 generic.go:334] "Generic (PLEG): container finished" podID="443c2a25-4980-472c-ab82-682e852ee9ba" containerID="a1d512483f974358dbb81edbdb2fa7f5965b74263ea1b2745b16187ba9db430a" exitCode=137 Nov 22 11:04:13 crc kubenswrapper[4743]: I1122 11:04:13.302783 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"443c2a25-4980-472c-ab82-682e852ee9ba","Type":"ContainerDied","Data":"a1d512483f974358dbb81edbdb2fa7f5965b74263ea1b2745b16187ba9db430a"} Nov 22 11:04:13 crc kubenswrapper[4743]: I1122 11:04:13.553801 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Nov 22 11:04:13 crc kubenswrapper[4743]: I1122 11:04:13.609777 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8s4n\" (UniqueName: \"kubernetes.io/projected/443c2a25-4980-472c-ab82-682e852ee9ba-kube-api-access-r8s4n\") pod \"443c2a25-4980-472c-ab82-682e852ee9ba\" (UID: \"443c2a25-4980-472c-ab82-682e852ee9ba\") " Nov 22 11:04:13 crc kubenswrapper[4743]: I1122 11:04:13.610983 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75e6afb1-8e09-4829-9796-7e506a806635\") pod \"443c2a25-4980-472c-ab82-682e852ee9ba\" (UID: \"443c2a25-4980-472c-ab82-682e852ee9ba\") " Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.238792 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/443c2a25-4980-472c-ab82-682e852ee9ba-kube-api-access-r8s4n" (OuterVolumeSpecName: "kube-api-access-r8s4n") pod "443c2a25-4980-472c-ab82-682e852ee9ba" (UID: "443c2a25-4980-472c-ab82-682e852ee9ba"). InnerVolumeSpecName "kube-api-access-r8s4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.264334 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75e6afb1-8e09-4829-9796-7e506a806635" (OuterVolumeSpecName: "mariadb-data") pod "443c2a25-4980-472c-ab82-682e852ee9ba" (UID: "443c2a25-4980-472c-ab82-682e852ee9ba"). InnerVolumeSpecName "pvc-75e6afb1-8e09-4829-9796-7e506a806635". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.324408 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"443c2a25-4980-472c-ab82-682e852ee9ba","Type":"ContainerDied","Data":"1598ee464443275dc11c09d52cbc6886bfc577dd15c7411a5248e78782c257a8"} Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.324455 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.324905 4743 scope.go:117] "RemoveContainer" containerID="a1d512483f974358dbb81edbdb2fa7f5965b74263ea1b2745b16187ba9db430a" Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.327432 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8s4n\" (UniqueName: \"kubernetes.io/projected/443c2a25-4980-472c-ab82-682e852ee9ba-kube-api-access-r8s4n\") on node \"crc\" DevicePath \"\"" Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.327492 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-75e6afb1-8e09-4829-9796-7e506a806635\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75e6afb1-8e09-4829-9796-7e506a806635\") on node \"crc\" " Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.362246 4743 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.362538 4743 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-75e6afb1-8e09-4829-9796-7e506a806635" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75e6afb1-8e09-4829-9796-7e506a806635") on node "crc"
Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.378564 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"]
Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.387990 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"]
Nov 22 11:04:14 crc kubenswrapper[4743]: I1122 11:04:14.428976 4743 reconciler_common.go:293] "Volume detached for volume \"pvc-75e6afb1-8e09-4829-9796-7e506a806635\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-75e6afb1-8e09-4829-9796-7e506a806635\") on node \"crc\" DevicePath \"\""
Nov 22 11:04:15 crc kubenswrapper[4743]: I1122 11:04:15.069813 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"]
Nov 22 11:04:15 crc kubenswrapper[4743]: I1122 11:04:15.070397 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="019b3b66-2805-4684-bf84-50705fbbdaf8" containerName="adoption" containerID="cri-o://4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c" gracePeriod=30
Nov 22 11:04:15 crc kubenswrapper[4743]: I1122 11:04:15.164215 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="443c2a25-4980-472c-ab82-682e852ee9ba" path="/var/lib/kubelet/pods/443c2a25-4980-472c-ab82-682e852ee9ba/volumes"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.643988 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wngjg"]
Nov 22 11:04:28 crc kubenswrapper[4743]: E1122 11:04:28.645868 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerName="extract-utilities"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.645889 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerName="extract-utilities"
Nov 22 11:04:28 crc kubenswrapper[4743]: E1122 11:04:28.645911 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="443c2a25-4980-472c-ab82-682e852ee9ba" containerName="adoption"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.645917 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="443c2a25-4980-472c-ab82-682e852ee9ba" containerName="adoption"
Nov 22 11:04:28 crc kubenswrapper[4743]: E1122 11:04:28.645945 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerName="extract-content"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.645952 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerName="extract-content"
Nov 22 11:04:28 crc kubenswrapper[4743]: E1122 11:04:28.646003 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerName="registry-server"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.646010 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerName="registry-server"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.646251 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="443c2a25-4980-472c-ab82-682e852ee9ba" containerName="adoption"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.646277 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="adbe615e-5f71-4ec5-b3e3-9187b8009f75" containerName="registry-server"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.648776 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.656448 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wngjg"]
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.737425 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgvs5\" (UniqueName: \"kubernetes.io/projected/d4d801de-ad32-4f73-8201-07b59bf3c98c-kube-api-access-fgvs5\") pod \"certified-operators-wngjg\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") " pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.737544 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-catalog-content\") pod \"certified-operators-wngjg\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") " pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.738697 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-utilities\") pod \"certified-operators-wngjg\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") " pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.846343 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-utilities\") pod \"certified-operators-wngjg\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") " pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.846620 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgvs5\" (UniqueName: \"kubernetes.io/projected/d4d801de-ad32-4f73-8201-07b59bf3c98c-kube-api-access-fgvs5\") pod \"certified-operators-wngjg\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") " pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.846658 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-catalog-content\") pod \"certified-operators-wngjg\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") " pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.847442 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-utilities\") pod \"certified-operators-wngjg\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") " pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.847835 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-catalog-content\") pod \"certified-operators-wngjg\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") " pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:28 crc kubenswrapper[4743]: I1122 11:04:28.874115 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgvs5\" (UniqueName: \"kubernetes.io/projected/d4d801de-ad32-4f73-8201-07b59bf3c98c-kube-api-access-fgvs5\") pod \"certified-operators-wngjg\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") " pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:29 crc kubenswrapper[4743]: I1122 11:04:29.023388 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:29 crc kubenswrapper[4743]: I1122 11:04:29.747435 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wngjg"]
Nov 22 11:04:30 crc kubenswrapper[4743]: I1122 11:04:30.541936 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerID="959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7" exitCode=0
Nov 22 11:04:30 crc kubenswrapper[4743]: I1122 11:04:30.542026 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wngjg" event={"ID":"d4d801de-ad32-4f73-8201-07b59bf3c98c","Type":"ContainerDied","Data":"959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7"}
Nov 22 11:04:30 crc kubenswrapper[4743]: I1122 11:04:30.542415 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wngjg" event={"ID":"d4d801de-ad32-4f73-8201-07b59bf3c98c","Type":"ContainerStarted","Data":"9d5990c883572691f7f0abcebd81efe25c32af7a06b60484e7ce90f7f3eb021d"}
Nov 22 11:04:31 crc kubenswrapper[4743]: I1122 11:04:31.241031 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 11:04:31 crc kubenswrapper[4743]: I1122 11:04:31.241822 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 11:04:31 crc kubenswrapper[4743]: I1122 11:04:31.242737 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p"
Nov 22 11:04:31 crc kubenswrapper[4743]: I1122 11:04:31.244139 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f7db4407799e7db0c973d757e817035cf90ac74b5ed6a44d7d7ae6de2f172b71"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 11:04:31 crc kubenswrapper[4743]: I1122 11:04:31.244210 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://f7db4407799e7db0c973d757e817035cf90ac74b5ed6a44d7d7ae6de2f172b71" gracePeriod=600
Nov 22 11:04:31 crc kubenswrapper[4743]: I1122 11:04:31.554616 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wngjg" event={"ID":"d4d801de-ad32-4f73-8201-07b59bf3c98c","Type":"ContainerStarted","Data":"640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4"}
Nov 22 11:04:31 crc kubenswrapper[4743]: I1122 11:04:31.557546 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="f7db4407799e7db0c973d757e817035cf90ac74b5ed6a44d7d7ae6de2f172b71" exitCode=0
Nov 22 11:04:31 crc kubenswrapper[4743]: I1122 11:04:31.557727 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"f7db4407799e7db0c973d757e817035cf90ac74b5ed6a44d7d7ae6de2f172b71"}
Nov 22 11:04:31 crc kubenswrapper[4743]: I1122 11:04:31.557763 4743 scope.go:117] "RemoveContainer" containerID="3e7b7f51c7869f467fff2b9f95523b9a34ca7de4ddfc6982b197905530872f63"
Nov 22 11:04:32 crc kubenswrapper[4743]: I1122 11:04:32.572255 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1"}
Nov 22 11:04:33 crc kubenswrapper[4743]: I1122 11:04:33.589003 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerID="640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4" exitCode=0
Nov 22 11:04:33 crc kubenswrapper[4743]: I1122 11:04:33.589778 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wngjg" event={"ID":"d4d801de-ad32-4f73-8201-07b59bf3c98c","Type":"ContainerDied","Data":"640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4"}
Nov 22 11:04:34 crc kubenswrapper[4743]: I1122 11:04:34.602130 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wngjg" event={"ID":"d4d801de-ad32-4f73-8201-07b59bf3c98c","Type":"ContainerStarted","Data":"5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20"}
Nov 22 11:04:34 crc kubenswrapper[4743]: I1122 11:04:34.627748 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wngjg" podStartSLOduration=3.018690472 podStartE2EDuration="6.627726576s" podCreationTimestamp="2025-11-22 11:04:28 +0000 UTC" firstStartedPulling="2025-11-22 11:04:30.544883967 +0000 UTC m=+9744.251245029" lastFinishedPulling="2025-11-22 11:04:34.153920081 +0000 UTC m=+9747.860281133" observedRunningTime="2025-11-22 11:04:34.619682466 +0000 UTC m=+9748.326043518" watchObservedRunningTime="2025-11-22 11:04:34.627726576 +0000 UTC m=+9748.334087628"
Nov 22 11:04:39 crc kubenswrapper[4743]: I1122 11:04:39.024245 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:39 crc kubenswrapper[4743]: I1122 11:04:39.025956 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:39 crc kubenswrapper[4743]: I1122 11:04:39.292241 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:39 crc kubenswrapper[4743]: I1122 11:04:39.733592 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:39 crc kubenswrapper[4743]: I1122 11:04:39.794806 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wngjg"]
Nov 22 11:04:41 crc kubenswrapper[4743]: I1122 11:04:41.695706 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wngjg" podUID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerName="registry-server" containerID="cri-o://5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20" gracePeriod=2
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.526186 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.637400 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgvs5\" (UniqueName: \"kubernetes.io/projected/d4d801de-ad32-4f73-8201-07b59bf3c98c-kube-api-access-fgvs5\") pod \"d4d801de-ad32-4f73-8201-07b59bf3c98c\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") "
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.637803 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-catalog-content\") pod \"d4d801de-ad32-4f73-8201-07b59bf3c98c\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") "
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.637920 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-utilities\") pod \"d4d801de-ad32-4f73-8201-07b59bf3c98c\" (UID: \"d4d801de-ad32-4f73-8201-07b59bf3c98c\") "
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.639048 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-utilities" (OuterVolumeSpecName: "utilities") pod "d4d801de-ad32-4f73-8201-07b59bf3c98c" (UID: "d4d801de-ad32-4f73-8201-07b59bf3c98c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.655704 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d801de-ad32-4f73-8201-07b59bf3c98c-kube-api-access-fgvs5" (OuterVolumeSpecName: "kube-api-access-fgvs5") pod "d4d801de-ad32-4f73-8201-07b59bf3c98c" (UID: "d4d801de-ad32-4f73-8201-07b59bf3c98c"). InnerVolumeSpecName "kube-api-access-fgvs5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.678870 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4d801de-ad32-4f73-8201-07b59bf3c98c" (UID: "d4d801de-ad32-4f73-8201-07b59bf3c98c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.707953 4743 generic.go:334] "Generic (PLEG): container finished" podID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerID="5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20" exitCode=0
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.708034 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wngjg"
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.708014 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wngjg" event={"ID":"d4d801de-ad32-4f73-8201-07b59bf3c98c","Type":"ContainerDied","Data":"5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20"}
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.708197 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wngjg" event={"ID":"d4d801de-ad32-4f73-8201-07b59bf3c98c","Type":"ContainerDied","Data":"9d5990c883572691f7f0abcebd81efe25c32af7a06b60484e7ce90f7f3eb021d"}
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.708222 4743 scope.go:117] "RemoveContainer" containerID="5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20"
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.745468 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgvs5\" (UniqueName: \"kubernetes.io/projected/d4d801de-ad32-4f73-8201-07b59bf3c98c-kube-api-access-fgvs5\") on node \"crc\" DevicePath \"\""
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.745802 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.746125 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4d801de-ad32-4f73-8201-07b59bf3c98c-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.749471 4743 scope.go:117] "RemoveContainer" containerID="640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4"
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.768854 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wngjg"]
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.781110 4743 scope.go:117] "RemoveContainer" containerID="959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7"
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.790091 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wngjg"]
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.834164 4743 scope.go:117] "RemoveContainer" containerID="5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20"
Nov 22 11:04:42 crc kubenswrapper[4743]: E1122 11:04:42.834798 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20\": container with ID starting with 5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20 not found: ID does not exist" containerID="5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20"
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.834837 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20"} err="failed to get container status \"5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20\": rpc error: code = NotFound desc = could not find container \"5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20\": container with ID starting with 5b96bfef7789b23f3c2188b9b7d58b11cda40a373ec1225e6afe50fd8e0b2e20 not found: ID does not exist"
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.834860 4743 scope.go:117] "RemoveContainer" containerID="640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4"
Nov 22 11:04:42 crc kubenswrapper[4743]: E1122 11:04:42.835184 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4\": container with ID starting with 640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4 not found: ID does not exist" containerID="640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4"
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.835229 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4"} err="failed to get container status \"640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4\": rpc error: code = NotFound desc = could not find container \"640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4\": container with ID starting with 640224e73fca14bebf2eed82826d91fb232313a6464e27f3762589bd450aa4c4 not found: ID does not exist"
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.835259 4743 scope.go:117] "RemoveContainer" containerID="959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7"
Nov 22 11:04:42 crc kubenswrapper[4743]: E1122 11:04:42.835868 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7\": container with ID starting with 959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7 not found: ID does not exist" containerID="959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7"
Nov 22 11:04:42 crc kubenswrapper[4743]: I1122 11:04:42.835929 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7"} err="failed to get container status \"959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7\": rpc error: code = NotFound desc = could not find container \"959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7\": container with ID starting with 959f5e2efe323c39404ec86694d7564ddb465d4a22d65dcf7d27e6957b5dd0f7 not found: ID does not exist"
Nov 22 11:04:43 crc kubenswrapper[4743]: I1122 11:04:43.167094 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d801de-ad32-4f73-8201-07b59bf3c98c" path="/var/lib/kubelet/pods/d4d801de-ad32-4f73-8201-07b59bf3c98c/volumes"
Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.752944 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.778660 4743 generic.go:334] "Generic (PLEG): container finished" podID="019b3b66-2805-4684-bf84-50705fbbdaf8" containerID="4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c" exitCode=137 Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.778907 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"019b3b66-2805-4684-bf84-50705fbbdaf8","Type":"ContainerDied","Data":"4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c"} Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.779148 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"019b3b66-2805-4684-bf84-50705fbbdaf8","Type":"ContainerDied","Data":"5ccb357b5cb4fdd99de7e29494807ac86ac1760203a634f9c8faba84f4ce2007"} Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.779017 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.779179 4743 scope.go:117] "RemoveContainer" containerID="4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c" Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.835533 4743 scope.go:117] "RemoveContainer" containerID="4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c" Nov 22 11:04:45 crc kubenswrapper[4743]: E1122 11:04:45.836274 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c\": container with ID starting with 4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c not found: ID does not exist" containerID="4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c" Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.836322 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c"} err="failed to get container status \"4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c\": rpc error: code = NotFound desc = could not find container \"4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c\": container with ID starting with 4f7ad729a34984c4afa7500c5a963e1e5053bf9732137bf9bda61f548779a83c not found: ID does not exist" Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.925717 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/019b3b66-2805-4684-bf84-50705fbbdaf8-ovn-data-cert\") pod \"019b3b66-2805-4684-bf84-50705fbbdaf8\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.926688 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\") pod \"019b3b66-2805-4684-bf84-50705fbbdaf8\" (UID: \"019b3b66-2805-4684-bf84-50705fbbdaf8\") " Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.926853 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkzf5\" (UniqueName: \"kubernetes.io/projected/019b3b66-2805-4684-bf84-50705fbbdaf8-kube-api-access-mkzf5\") pod \"019b3b66-2805-4684-bf84-50705fbbdaf8\" (UID: 
\"019b3b66-2805-4684-bf84-50705fbbdaf8\") " Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.936469 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/019b3b66-2805-4684-bf84-50705fbbdaf8-kube-api-access-mkzf5" (OuterVolumeSpecName: "kube-api-access-mkzf5") pod "019b3b66-2805-4684-bf84-50705fbbdaf8" (UID: "019b3b66-2805-4684-bf84-50705fbbdaf8"). InnerVolumeSpecName "kube-api-access-mkzf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.940827 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/019b3b66-2805-4684-bf84-50705fbbdaf8-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "019b3b66-2805-4684-bf84-50705fbbdaf8" (UID: "019b3b66-2805-4684-bf84-50705fbbdaf8"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Nov 22 11:04:45 crc kubenswrapper[4743]: I1122 11:04:45.962634 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22" (OuterVolumeSpecName: "ovn-data") pod "019b3b66-2805-4684-bf84-50705fbbdaf8" (UID: "019b3b66-2805-4684-bf84-50705fbbdaf8"). InnerVolumeSpecName "pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22". PluginName "kubernetes.io/csi", VolumeGidValue "" Nov 22 11:04:46 crc kubenswrapper[4743]: I1122 11:04:46.029996 4743 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/019b3b66-2805-4684-bf84-50705fbbdaf8-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Nov 22 11:04:46 crc kubenswrapper[4743]: I1122 11:04:46.030152 4743 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\") on node \"crc\" " Nov 22 11:04:46 crc kubenswrapper[4743]: I1122 11:04:46.030231 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkzf5\" (UniqueName: \"kubernetes.io/projected/019b3b66-2805-4684-bf84-50705fbbdaf8-kube-api-access-mkzf5\") on node \"crc\" DevicePath \"\"" Nov 22 11:04:46 crc kubenswrapper[4743]: I1122 11:04:46.089352 4743 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Nov 22 11:04:46 crc kubenswrapper[4743]: I1122 11:04:46.089831 4743 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22") on node "crc"
Nov 22 11:04:46 crc kubenswrapper[4743]: I1122 11:04:46.120044 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"]
Nov 22 11:04:46 crc kubenswrapper[4743]: I1122 11:04:46.134436 4743 reconciler_common.go:293] "Volume detached for volume \"pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7e9a3d1e-84f6-4f0e-a7a4-8636d2059c22\") on node \"crc\" DevicePath \"\""
Nov 22 11:04:46 crc kubenswrapper[4743]: I1122 11:04:46.146627 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"]
Nov 22 11:04:47 crc kubenswrapper[4743]: I1122 11:04:47.168696 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="019b3b66-2805-4684-bf84-50705fbbdaf8" path="/var/lib/kubelet/pods/019b3b66-2805-4684-bf84-50705fbbdaf8/volumes"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.226176 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hd96k"]
Nov 22 11:05:11 crc kubenswrapper[4743]: E1122 11:05:11.229391 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerName="extract-utilities"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.229502 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerName="extract-utilities"
Nov 22 11:05:11 crc kubenswrapper[4743]: E1122 11:05:11.229615 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerName="extract-content"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.229709 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerName="extract-content"
Nov 22 11:05:11 crc kubenswrapper[4743]: E1122 11:05:11.229823 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerName="registry-server"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.229909 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerName="registry-server"
Nov 22 11:05:11 crc kubenswrapper[4743]: E1122 11:05:11.230005 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019b3b66-2805-4684-bf84-50705fbbdaf8" containerName="adoption"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.231434 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="019b3b66-2805-4684-bf84-50705fbbdaf8" containerName="adoption"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.231980 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="019b3b66-2805-4684-bf84-50705fbbdaf8" containerName="adoption"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.232112 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d801de-ad32-4f73-8201-07b59bf3c98c" containerName="registry-server"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.234940 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.252701 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hd96k"]
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.280234 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-utilities\") pod \"redhat-operators-hd96k\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") " pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.280323 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-catalog-content\") pod \"redhat-operators-hd96k\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") " pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.280508 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wbj4\" (UniqueName: \"kubernetes.io/projected/c6db6b08-160a-4bdb-9fa0-a410339c7718-kube-api-access-8wbj4\") pod \"redhat-operators-hd96k\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") " pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.384219 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-utilities\") pod \"redhat-operators-hd96k\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") " pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.384307 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-catalog-content\") pod \"redhat-operators-hd96k\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") " pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.384465 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wbj4\" (UniqueName: \"kubernetes.io/projected/c6db6b08-160a-4bdb-9fa0-a410339c7718-kube-api-access-8wbj4\") pod \"redhat-operators-hd96k\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") " pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.385245 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-catalog-content\") pod \"redhat-operators-hd96k\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") " pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.385443 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-utilities\") pod \"redhat-operators-hd96k\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") " pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.410232 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wbj4\" (UniqueName: \"kubernetes.io/projected/c6db6b08-160a-4bdb-9fa0-a410339c7718-kube-api-access-8wbj4\") pod \"redhat-operators-hd96k\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") " pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:11 crc kubenswrapper[4743]: I1122 11:05:11.568273 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:12 crc kubenswrapper[4743]: I1122 11:05:12.246055 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hd96k"]
Nov 22 11:05:13 crc kubenswrapper[4743]: I1122 11:05:13.113381 4743 generic.go:334] "Generic (PLEG): container finished" podID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerID="2c9bc13f3c3ba884a5dd7b1acd97ce6ccd63544756d4b482097600a2ea74e907" exitCode=0
Nov 22 11:05:13 crc kubenswrapper[4743]: I1122 11:05:13.113500 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd96k" event={"ID":"c6db6b08-160a-4bdb-9fa0-a410339c7718","Type":"ContainerDied","Data":"2c9bc13f3c3ba884a5dd7b1acd97ce6ccd63544756d4b482097600a2ea74e907"}
Nov 22 11:05:13 crc kubenswrapper[4743]: I1122 11:05:13.113917 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd96k" event={"ID":"c6db6b08-160a-4bdb-9fa0-a410339c7718","Type":"ContainerStarted","Data":"868f68c673c090eaccabc224b70d0c6d986a57510a4bd9689c4b067e796b3efe"}
Nov 22 11:05:14 crc kubenswrapper[4743]: I1122 11:05:14.130511 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd96k" event={"ID":"c6db6b08-160a-4bdb-9fa0-a410339c7718","Type":"ContainerStarted","Data":"ae374bdcff7b3173a57a97ff5e53e7d64012f7e2d6c18e9bb4fb895929b02afc"}
Nov 22 11:05:20 crc kubenswrapper[4743]: I1122 11:05:20.223519 4743 generic.go:334] "Generic (PLEG): container finished" podID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerID="ae374bdcff7b3173a57a97ff5e53e7d64012f7e2d6c18e9bb4fb895929b02afc" exitCode=0
Nov 22 11:05:20 crc kubenswrapper[4743]: I1122 11:05:20.223673 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd96k" event={"ID":"c6db6b08-160a-4bdb-9fa0-a410339c7718","Type":"ContainerDied","Data":"ae374bdcff7b3173a57a97ff5e53e7d64012f7e2d6c18e9bb4fb895929b02afc"}
Nov 22 11:05:21 crc kubenswrapper[4743]: I1122 11:05:21.238599 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd96k" event={"ID":"c6db6b08-160a-4bdb-9fa0-a410339c7718","Type":"ContainerStarted","Data":"c993782d3ac2e611c543cfd78a57890da7195b99e1bd28e56e6e3fdf022665df"}
Nov 22 11:05:21 crc kubenswrapper[4743]: I1122 11:05:21.283403 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hd96k" podStartSLOduration=2.747164868 podStartE2EDuration="10.283373631s" podCreationTimestamp="2025-11-22 11:05:11 +0000 UTC" firstStartedPulling="2025-11-22 11:05:13.117027864 +0000 UTC m=+9786.823388916" lastFinishedPulling="2025-11-22 11:05:20.653236627 +0000 UTC m=+9794.359597679" observedRunningTime="2025-11-22 11:05:21.272314455 +0000 UTC m=+9794.978675507" watchObservedRunningTime="2025-11-22 11:05:21.283373631 +0000 UTC m=+9794.989734693"
Nov 22 11:05:21 crc kubenswrapper[4743]: I1122 11:05:21.569003 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:21 crc kubenswrapper[4743]: I1122 11:05:21.569091 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:22 crc kubenswrapper[4743]: I1122 11:05:22.777945 4743 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hd96k" podUID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerName="registry-server" probeResult="failure" output=<
Nov 22 11:05:22 crc kubenswrapper[4743]: timeout: failed to connect service ":50051" within 1s
Nov 22 11:05:22 crc kubenswrapper[4743]: >
Nov 22 11:05:32 crc kubenswrapper[4743]: I1122 11:05:32.281847 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:32 crc kubenswrapper[4743]: I1122 11:05:32.353270 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:32 crc kubenswrapper[4743]: I1122 11:05:32.527543 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hd96k"]
Nov 22 11:05:33 crc kubenswrapper[4743]: I1122 11:05:33.387787 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hd96k" podUID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerName="registry-server" containerID="cri-o://c993782d3ac2e611c543cfd78a57890da7195b99e1bd28e56e6e3fdf022665df" gracePeriod=2
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.401787 4743 generic.go:334] "Generic (PLEG): container finished" podID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerID="c993782d3ac2e611c543cfd78a57890da7195b99e1bd28e56e6e3fdf022665df" exitCode=0
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.401882 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd96k" event={"ID":"c6db6b08-160a-4bdb-9fa0-a410339c7718","Type":"ContainerDied","Data":"c993782d3ac2e611c543cfd78a57890da7195b99e1bd28e56e6e3fdf022665df"}
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.656509 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.751779 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wbj4\" (UniqueName: \"kubernetes.io/projected/c6db6b08-160a-4bdb-9fa0-a410339c7718-kube-api-access-8wbj4\") pod \"c6db6b08-160a-4bdb-9fa0-a410339c7718\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") "
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.751871 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-catalog-content\") pod \"c6db6b08-160a-4bdb-9fa0-a410339c7718\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") "
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.752004 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-utilities\") pod \"c6db6b08-160a-4bdb-9fa0-a410339c7718\" (UID: \"c6db6b08-160a-4bdb-9fa0-a410339c7718\") "
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.753556 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-utilities" (OuterVolumeSpecName: "utilities") pod "c6db6b08-160a-4bdb-9fa0-a410339c7718" (UID: "c6db6b08-160a-4bdb-9fa0-a410339c7718"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.838518 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6db6b08-160a-4bdb-9fa0-a410339c7718-kube-api-access-8wbj4" (OuterVolumeSpecName: "kube-api-access-8wbj4") pod "c6db6b08-160a-4bdb-9fa0-a410339c7718" (UID: "c6db6b08-160a-4bdb-9fa0-a410339c7718"). InnerVolumeSpecName "kube-api-access-8wbj4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.854361 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.854402 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wbj4\" (UniqueName: \"kubernetes.io/projected/c6db6b08-160a-4bdb-9fa0-a410339c7718-kube-api-access-8wbj4\") on node \"crc\" DevicePath \"\""
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.855736 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6db6b08-160a-4bdb-9fa0-a410339c7718" (UID: "c6db6b08-160a-4bdb-9fa0-a410339c7718"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 11:05:34 crc kubenswrapper[4743]: I1122 11:05:34.957162 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db6b08-160a-4bdb-9fa0-a410339c7718-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 11:05:35 crc kubenswrapper[4743]: I1122 11:05:35.416326 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hd96k" event={"ID":"c6db6b08-160a-4bdb-9fa0-a410339c7718","Type":"ContainerDied","Data":"868f68c673c090eaccabc224b70d0c6d986a57510a4bd9689c4b067e796b3efe"}
Nov 22 11:05:35 crc kubenswrapper[4743]: I1122 11:05:35.416374 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hd96k"
Nov 22 11:05:35 crc kubenswrapper[4743]: I1122 11:05:35.416397 4743 scope.go:117] "RemoveContainer" containerID="c993782d3ac2e611c543cfd78a57890da7195b99e1bd28e56e6e3fdf022665df"
Nov 22 11:05:35 crc kubenswrapper[4743]: I1122 11:05:35.466997 4743 scope.go:117] "RemoveContainer" containerID="ae374bdcff7b3173a57a97ff5e53e7d64012f7e2d6c18e9bb4fb895929b02afc"
Nov 22 11:05:35 crc kubenswrapper[4743]: I1122 11:05:35.467003 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hd96k"]
Nov 22 11:05:35 crc kubenswrapper[4743]: I1122 11:05:35.477328 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hd96k"]
Nov 22 11:05:35 crc kubenswrapper[4743]: I1122 11:05:35.502559 4743 scope.go:117] "RemoveContainer" containerID="2c9bc13f3c3ba884a5dd7b1acd97ce6ccd63544756d4b482097600a2ea74e907"
Nov 22 11:05:37 crc kubenswrapper[4743]: I1122 11:05:37.186256 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6db6b08-160a-4bdb-9fa0-a410339c7718" path="/var/lib/kubelet/pods/c6db6b08-160a-4bdb-9fa0-a410339c7718/volumes"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.390364 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-psx7f/must-gather-bwg79"]
Nov 22 11:05:55 crc kubenswrapper[4743]: E1122 11:05:55.391907 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerName="registry-server"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.391929 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerName="registry-server"
Nov 22 11:05:55 crc kubenswrapper[4743]: E1122 11:05:55.391945 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerName="extract-content"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.391951 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerName="extract-content"
Nov 22 11:05:55 crc kubenswrapper[4743]: E1122 11:05:55.391971 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerName="extract-utilities"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.391977 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerName="extract-utilities"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.392246 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6db6b08-160a-4bdb-9fa0-a410339c7718" containerName="registry-server"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.393652 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/must-gather-bwg79"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.397109 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-psx7f"/"openshift-service-ca.crt"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.397291 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-psx7f"/"default-dockercfg-ksl5n"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.399044 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-psx7f"/"kube-root-ca.crt"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.406039 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-psx7f/must-gather-bwg79"]
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.489496 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj889\" (UniqueName: \"kubernetes.io/projected/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-kube-api-access-fj889\") pod \"must-gather-bwg79\" (UID: \"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc\") " pod="openshift-must-gather-psx7f/must-gather-bwg79"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.489698 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-must-gather-output\") pod \"must-gather-bwg79\" (UID: \"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc\") " pod="openshift-must-gather-psx7f/must-gather-bwg79"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.592661 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj889\" (UniqueName: \"kubernetes.io/projected/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-kube-api-access-fj889\") pod \"must-gather-bwg79\" (UID: \"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc\") " pod="openshift-must-gather-psx7f/must-gather-bwg79"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.592816 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-must-gather-output\") pod \"must-gather-bwg79\" (UID: \"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc\") " pod="openshift-must-gather-psx7f/must-gather-bwg79"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.593354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-must-gather-output\") pod \"must-gather-bwg79\" (UID: \"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc\") " pod="openshift-must-gather-psx7f/must-gather-bwg79"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.615625 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj889\" (UniqueName: \"kubernetes.io/projected/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-kube-api-access-fj889\") pod \"must-gather-bwg79\" (UID: \"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc\") " pod="openshift-must-gather-psx7f/must-gather-bwg79"
Nov 22 11:05:55 crc kubenswrapper[4743]: I1122 11:05:55.721175 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/must-gather-bwg79"
Nov 22 11:05:56 crc kubenswrapper[4743]: I1122 11:05:56.235414 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-psx7f/must-gather-bwg79"]
Nov 22 11:05:56 crc kubenswrapper[4743]: I1122 11:05:56.687975 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-psx7f/must-gather-bwg79" event={"ID":"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc","Type":"ContainerStarted","Data":"2be36d5982422f02e543d648b0e0720dcc71d7dd076a5d804191135d86cf1aa8"}
Nov 22 11:06:04 crc kubenswrapper[4743]: I1122 11:06:04.851539 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-psx7f/must-gather-bwg79" event={"ID":"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc","Type":"ContainerStarted","Data":"092289b25c3b4d2a54a25663c85a380924a3967c784e224d2a11a02f7f135a29"}
Nov 22 11:06:04 crc kubenswrapper[4743]: I1122 11:06:04.852538 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-psx7f/must-gather-bwg79" event={"ID":"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc","Type":"ContainerStarted","Data":"252ef331d1b8566e4586e1fb29bba1780dbe652f9faa375289fc3c49d86ace3a"}
Nov 22 11:06:04 crc kubenswrapper[4743]: I1122 11:06:04.885461 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-psx7f/must-gather-bwg79" podStartSLOduration=2.6182765420000003 podStartE2EDuration="9.885427402s" podCreationTimestamp="2025-11-22 11:05:55 +0000 UTC" firstStartedPulling="2025-11-22 11:05:56.274595919 +0000 UTC m=+9829.980956971" lastFinishedPulling="2025-11-22 11:06:03.541746789 +0000 UTC m=+9837.248107831" observedRunningTime="2025-11-22 11:06:04.871909236 +0000 UTC m=+9838.578270288" watchObservedRunningTime="2025-11-22 11:06:04.885427402 +0000 UTC m=+9838.591788454"
Nov 22 11:06:08 crc kubenswrapper[4743]: I1122 11:06:08.917653 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-psx7f/crc-debug-k8t94"]
Nov 22 11:06:08 crc kubenswrapper[4743]: I1122 11:06:08.920891 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/crc-debug-k8t94"
Nov 22 11:06:09 crc kubenswrapper[4743]: I1122 11:06:09.034003 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d2b39f-2326-42ca-86fc-b3abe4027a69-host\") pod \"crc-debug-k8t94\" (UID: \"37d2b39f-2326-42ca-86fc-b3abe4027a69\") " pod="openshift-must-gather-psx7f/crc-debug-k8t94"
Nov 22 11:06:09 crc kubenswrapper[4743]: I1122 11:06:09.034124 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfcc\" (UniqueName: \"kubernetes.io/projected/37d2b39f-2326-42ca-86fc-b3abe4027a69-kube-api-access-ggfcc\") pod \"crc-debug-k8t94\" (UID: \"37d2b39f-2326-42ca-86fc-b3abe4027a69\") " pod="openshift-must-gather-psx7f/crc-debug-k8t94"
Nov 22 11:06:09 crc kubenswrapper[4743]: I1122 11:06:09.137102 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfcc\" (UniqueName: \"kubernetes.io/projected/37d2b39f-2326-42ca-86fc-b3abe4027a69-kube-api-access-ggfcc\") pod \"crc-debug-k8t94\" (UID: \"37d2b39f-2326-42ca-86fc-b3abe4027a69\") " pod="openshift-must-gather-psx7f/crc-debug-k8t94"
Nov 22 11:06:09 crc kubenswrapper[4743]: I1122 11:06:09.137322 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d2b39f-2326-42ca-86fc-b3abe4027a69-host\") pod \"crc-debug-k8t94\" (UID: \"37d2b39f-2326-42ca-86fc-b3abe4027a69\") " pod="openshift-must-gather-psx7f/crc-debug-k8t94"
Nov 22 11:06:09 crc kubenswrapper[4743]: I1122 11:06:09.137468 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d2b39f-2326-42ca-86fc-b3abe4027a69-host\") pod \"crc-debug-k8t94\" (UID: \"37d2b39f-2326-42ca-86fc-b3abe4027a69\") " pod="openshift-must-gather-psx7f/crc-debug-k8t94"
Nov 22 11:06:09 crc kubenswrapper[4743]: I1122 11:06:09.174174 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfcc\" (UniqueName: \"kubernetes.io/projected/37d2b39f-2326-42ca-86fc-b3abe4027a69-kube-api-access-ggfcc\") pod \"crc-debug-k8t94\" (UID: \"37d2b39f-2326-42ca-86fc-b3abe4027a69\") " pod="openshift-must-gather-psx7f/crc-debug-k8t94"
Nov 22 11:06:09 crc kubenswrapper[4743]: I1122 11:06:09.253620 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/crc-debug-k8t94"
Nov 22 11:06:09 crc kubenswrapper[4743]: W1122 11:06:09.319680 4743 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37d2b39f_2326_42ca_86fc_b3abe4027a69.slice/crio-85437952bb6a6186e9e11eca4fa115059c4360922c74f9b46bafb1aa19746afd WatchSource:0}: Error finding container 85437952bb6a6186e9e11eca4fa115059c4360922c74f9b46bafb1aa19746afd: Status 404 returned error can't find the container with id 85437952bb6a6186e9e11eca4fa115059c4360922c74f9b46bafb1aa19746afd
Nov 22 11:06:09 crc kubenswrapper[4743]: I1122 11:06:09.932053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-psx7f/crc-debug-k8t94" event={"ID":"37d2b39f-2326-42ca-86fc-b3abe4027a69","Type":"ContainerStarted","Data":"85437952bb6a6186e9e11eca4fa115059c4360922c74f9b46bafb1aa19746afd"}
Nov 22 11:06:24 crc kubenswrapper[4743]: I1122 11:06:24.117480 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-psx7f/crc-debug-k8t94" event={"ID":"37d2b39f-2326-42ca-86fc-b3abe4027a69","Type":"ContainerStarted","Data":"827d5a9cc002ab0c3cd0f71d63ae7a322cecb6898dc9e3e2f4a7c5f09d0fb438"}
Nov 22 11:06:24 crc kubenswrapper[4743]: I1122 11:06:24.140418 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-psx7f/crc-debug-k8t94" podStartSLOduration=1.994348805 podStartE2EDuration="16.140402388s" podCreationTimestamp="2025-11-22 11:06:08 +0000 UTC" firstStartedPulling="2025-11-22 11:06:09.322604941 +0000 UTC m=+9843.028965993" lastFinishedPulling="2025-11-22 11:06:23.468658524 +0000 UTC m=+9857.175019576" observedRunningTime="2025-11-22 11:06:24.133712797 +0000 UTC m=+9857.840073859" watchObservedRunningTime="2025-11-22 11:06:24.140402388 +0000 UTC m=+9857.846763440"
Nov 22 11:06:31 crc kubenswrapper[4743]: I1122 11:06:31.240965 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 11:06:31 crc kubenswrapper[4743]: I1122 11:06:31.241665 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 11:06:47 crc kubenswrapper[4743]: I1122 11:06:47.428298 4743 generic.go:334] "Generic (PLEG): container finished" podID="37d2b39f-2326-42ca-86fc-b3abe4027a69" containerID="827d5a9cc002ab0c3cd0f71d63ae7a322cecb6898dc9e3e2f4a7c5f09d0fb438" exitCode=0
Nov 22 11:06:47 crc kubenswrapper[4743]: I1122 11:06:47.428376 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-psx7f/crc-debug-k8t94" event={"ID":"37d2b39f-2326-42ca-86fc-b3abe4027a69","Type":"ContainerDied","Data":"827d5a9cc002ab0c3cd0f71d63ae7a322cecb6898dc9e3e2f4a7c5f09d0fb438"}
Nov 22 11:06:48 crc kubenswrapper[4743]: I1122 11:06:48.619221 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/crc-debug-k8t94"
Nov 22 11:06:48 crc kubenswrapper[4743]: I1122 11:06:48.670309 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-psx7f/crc-debug-k8t94"]
Nov 22 11:06:48 crc kubenswrapper[4743]: I1122 11:06:48.687633 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-psx7f/crc-debug-k8t94"]
Nov 22 11:06:48 crc kubenswrapper[4743]: I1122 11:06:48.724754 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d2b39f-2326-42ca-86fc-b3abe4027a69-host\") pod \"37d2b39f-2326-42ca-86fc-b3abe4027a69\" (UID: \"37d2b39f-2326-42ca-86fc-b3abe4027a69\") "
Nov 22 11:06:48 crc kubenswrapper[4743]: I1122 11:06:48.724873 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37d2b39f-2326-42ca-86fc-b3abe4027a69-host" (OuterVolumeSpecName: "host") pod "37d2b39f-2326-42ca-86fc-b3abe4027a69" (UID: "37d2b39f-2326-42ca-86fc-b3abe4027a69"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 11:06:48 crc kubenswrapper[4743]: I1122 11:06:48.724909 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggfcc\" (UniqueName: \"kubernetes.io/projected/37d2b39f-2326-42ca-86fc-b3abe4027a69-kube-api-access-ggfcc\") pod \"37d2b39f-2326-42ca-86fc-b3abe4027a69\" (UID: \"37d2b39f-2326-42ca-86fc-b3abe4027a69\") "
Nov 22 11:06:48 crc kubenswrapper[4743]: I1122 11:06:48.725986 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/37d2b39f-2326-42ca-86fc-b3abe4027a69-host\") on node \"crc\" DevicePath \"\""
Nov 22 11:06:48 crc kubenswrapper[4743]: I1122 11:06:48.738990 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37d2b39f-2326-42ca-86fc-b3abe4027a69-kube-api-access-ggfcc" (OuterVolumeSpecName: "kube-api-access-ggfcc") pod "37d2b39f-2326-42ca-86fc-b3abe4027a69" (UID: "37d2b39f-2326-42ca-86fc-b3abe4027a69"). InnerVolumeSpecName "kube-api-access-ggfcc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 11:06:48 crc kubenswrapper[4743]: I1122 11:06:48.827893 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggfcc\" (UniqueName: \"kubernetes.io/projected/37d2b39f-2326-42ca-86fc-b3abe4027a69-kube-api-access-ggfcc\") on node \"crc\" DevicePath \"\""
Nov 22 11:06:49 crc kubenswrapper[4743]: I1122 11:06:49.166481 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37d2b39f-2326-42ca-86fc-b3abe4027a69" path="/var/lib/kubelet/pods/37d2b39f-2326-42ca-86fc-b3abe4027a69/volumes"
Nov 22 11:06:49 crc kubenswrapper[4743]: E1122 11:06:49.471850 4743 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37d2b39f_2326_42ca_86fc_b3abe4027a69.slice\": RecentStats: unable to find data in memory cache]"
Nov 22 11:06:49 crc kubenswrapper[4743]: I1122 11:06:49.475203 4743 scope.go:117] "RemoveContainer" containerID="827d5a9cc002ab0c3cd0f71d63ae7a322cecb6898dc9e3e2f4a7c5f09d0fb438"
Nov 22 11:06:49 crc kubenswrapper[4743]: I1122 11:06:49.475512 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/crc-debug-k8t94"
Nov 22 11:06:49 crc kubenswrapper[4743]: I1122 11:06:49.959025 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-psx7f/crc-debug-7x42t"]
Nov 22 11:06:49 crc kubenswrapper[4743]: E1122 11:06:49.960559 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37d2b39f-2326-42ca-86fc-b3abe4027a69" containerName="container-00"
Nov 22 11:06:49 crc kubenswrapper[4743]: I1122 11:06:49.960598 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="37d2b39f-2326-42ca-86fc-b3abe4027a69" containerName="container-00"
Nov 22 11:06:49 crc kubenswrapper[4743]: I1122 11:06:49.960975 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="37d2b39f-2326-42ca-86fc-b3abe4027a69" containerName="container-00"
Nov 22 11:06:49 crc kubenswrapper[4743]: I1122 11:06:49.962224 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/crc-debug-7x42t"
Nov 22 11:06:50 crc kubenswrapper[4743]: I1122 11:06:50.065007 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhcf7\" (UniqueName: \"kubernetes.io/projected/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-kube-api-access-mhcf7\") pod \"crc-debug-7x42t\" (UID: \"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7\") " pod="openshift-must-gather-psx7f/crc-debug-7x42t"
Nov 22 11:06:50 crc kubenswrapper[4743]: I1122 11:06:50.065999 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-host\") pod \"crc-debug-7x42t\" (UID: \"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7\") " pod="openshift-must-gather-psx7f/crc-debug-7x42t"
Nov 22 11:06:50 crc kubenswrapper[4743]: I1122 11:06:50.168143 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-host\") pod \"crc-debug-7x42t\" (UID: \"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7\") " pod="openshift-must-gather-psx7f/crc-debug-7x42t"
Nov 22 11:06:50 crc kubenswrapper[4743]: I1122 11:06:50.168222 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhcf7\" (UniqueName: \"kubernetes.io/projected/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-kube-api-access-mhcf7\") pod \"crc-debug-7x42t\" (UID: \"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7\") " pod="openshift-must-gather-psx7f/crc-debug-7x42t"
Nov 22 11:06:50 crc kubenswrapper[4743]: I1122 11:06:50.168317 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-host\") pod \"crc-debug-7x42t\" (UID: \"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7\") " pod="openshift-must-gather-psx7f/crc-debug-7x42t"
Nov 22 11:06:50 crc kubenswrapper[4743]: I1122 11:06:50.193107 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhcf7\" (UniqueName: \"kubernetes.io/projected/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-kube-api-access-mhcf7\") pod \"crc-debug-7x42t\" (UID: \"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7\") " pod="openshift-must-gather-psx7f/crc-debug-7x42t"
Nov 22 11:06:50 crc kubenswrapper[4743]: I1122 11:06:50.285887 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/crc-debug-7x42t"
Nov 22 11:06:50 crc kubenswrapper[4743]: I1122 11:06:50.488892 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-psx7f/crc-debug-7x42t" event={"ID":"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7","Type":"ContainerStarted","Data":"f1e99ece0b833ffe403e16b445965383f752ccc914bbe6211dbeed0344cd4132"}
Nov 22 11:06:51 crc kubenswrapper[4743]: I1122 11:06:51.511384 4743 generic.go:334] "Generic (PLEG): container finished" podID="970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7" containerID="2a8c594488fae4602b1d9801415e2d51257f4a23e0bd756180e6415d701a8cf1" exitCode=1
Nov 22 11:06:51 crc kubenswrapper[4743]: I1122 11:06:51.511677 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-psx7f/crc-debug-7x42t" event={"ID":"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7","Type":"ContainerDied","Data":"2a8c594488fae4602b1d9801415e2d51257f4a23e0bd756180e6415d701a8cf1"}
Nov 22 11:06:51 crc kubenswrapper[4743]: I1122 11:06:51.559430 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-psx7f/crc-debug-7x42t"]
Nov 22 11:06:51 crc kubenswrapper[4743]: I1122 11:06:51.570456 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-psx7f/crc-debug-7x42t"]
Nov 22 11:06:52 crc kubenswrapper[4743]: I1122 11:06:52.668799 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/crc-debug-7x42t"
Nov 22 11:06:52 crc kubenswrapper[4743]: I1122 11:06:52.774314 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-host\") pod \"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7\" (UID: \"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7\") "
Nov 22 11:06:52 crc kubenswrapper[4743]: I1122 11:06:52.774490 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-host" (OuterVolumeSpecName: "host") pod "970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7" (UID: "970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Nov 22 11:06:52 crc kubenswrapper[4743]: I1122 11:06:52.775066 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhcf7\" (UniqueName: \"kubernetes.io/projected/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-kube-api-access-mhcf7\") pod \"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7\" (UID: \"970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7\") "
Nov 22 11:06:52 crc kubenswrapper[4743]: I1122 11:06:52.776659 4743 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-host\") on node \"crc\" DevicePath \"\""
Nov 22 11:06:52 crc kubenswrapper[4743]: I1122 11:06:52.784197 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-kube-api-access-mhcf7" (OuterVolumeSpecName: "kube-api-access-mhcf7") pod "970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7" (UID: "970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7"). InnerVolumeSpecName "kube-api-access-mhcf7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 11:06:52 crc kubenswrapper[4743]: I1122 11:06:52.879132 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhcf7\" (UniqueName: \"kubernetes.io/projected/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7-kube-api-access-mhcf7\") on node \"crc\" DevicePath \"\""
Nov 22 11:06:53 crc kubenswrapper[4743]: I1122 11:06:53.169507 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7" path="/var/lib/kubelet/pods/970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7/volumes"
Nov 22 11:06:53 crc kubenswrapper[4743]: I1122 11:06:53.536006 4743 scope.go:117] "RemoveContainer" containerID="2a8c594488fae4602b1d9801415e2d51257f4a23e0bd756180e6415d701a8cf1"
Nov 22 11:06:53 crc kubenswrapper[4743]: I1122 11:06:53.536171 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/crc-debug-7x42t"
Nov 22 11:07:01 crc kubenswrapper[4743]: I1122 11:07:01.241680 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 11:07:01 crc kubenswrapper[4743]: I1122 11:07:01.242921 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 11:07:31 crc kubenswrapper[4743]: I1122 11:07:31.241741 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 11:07:31 crc kubenswrapper[4743]: I1122 11:07:31.242967 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 11:07:31 crc kubenswrapper[4743]: I1122 11:07:31.243064 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p"
Nov 22 11:07:31 crc kubenswrapper[4743]: I1122 11:07:31.244737 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 11:07:31 crc kubenswrapper[4743]: I1122 11:07:31.244850 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" gracePeriod=600
Nov 22 11:07:31 crc kubenswrapper[4743]: I1122 11:07:31.987001 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" exitCode=0
Nov 22 11:07:31 crc kubenswrapper[4743]: I1122 11:07:31.987325 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1"}
Nov 22 11:07:31 crc kubenswrapper[4743]: I1122 11:07:31.987467 4743 scope.go:117] "RemoveContainer" containerID="f7db4407799e7db0c973d757e817035cf90ac74b5ed6a44d7d7ae6de2f172b71"
Nov 22 11:07:32 crc kubenswrapper[4743]: E1122 11:07:32.625004 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 11:07:33 crc kubenswrapper[4743]: I1122 11:07:33.001406 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1"
Nov 22 11:07:33 crc kubenswrapper[4743]: E1122 11:07:33.002137 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 11:07:38 crc kubenswrapper[4743]: I1122 11:07:38.981189 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-84h5p"]
Nov 22 11:07:38 crc kubenswrapper[4743]: E1122 11:07:38.982596 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7" containerName="container-00"
Nov 22 11:07:38 crc kubenswrapper[4743]: I1122 11:07:38.982618 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7" containerName="container-00"
Nov 22 11:07:38 crc kubenswrapper[4743]: I1122 11:07:38.982920 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="970ee3ed-95a9-4673-b5a4-70b1fdfb9fe7" containerName="container-00"
Nov 22 11:07:38 crc kubenswrapper[4743]: I1122 11:07:38.984890 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84h5p"
Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.007973 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84h5p"]
Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.009189 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-utilities\") pod \"community-operators-84h5p\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " pod="openshift-marketplace/community-operators-84h5p"
Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.009363 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jrwd\" (UniqueName: \"kubernetes.io/projected/523f648f-43d2-4e73-81ff-e87f41f07ab1-kube-api-access-9jrwd\") pod \"community-operators-84h5p\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " pod="openshift-marketplace/community-operators-84h5p"
Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.009512 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-catalog-content\") pod \"community-operators-84h5p\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " pod="openshift-marketplace/community-operators-84h5p"
Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.110564 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jrwd\" (UniqueName: \"kubernetes.io/projected/523f648f-43d2-4e73-81ff-e87f41f07ab1-kube-api-access-9jrwd\") pod \"community-operators-84h5p\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " pod="openshift-marketplace/community-operators-84h5p"
Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.110905 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-catalog-content\") pod \"community-operators-84h5p\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " pod="openshift-marketplace/community-operators-84h5p"
Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.111134 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-utilities\") pod \"community-operators-84h5p\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " pod="openshift-marketplace/community-operators-84h5p"
Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.111390 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-catalog-content\") pod \"community-operators-84h5p\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " pod="openshift-marketplace/community-operators-84h5p"
Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.111441 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-utilities\") pod \"community-operators-84h5p\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " pod="openshift-marketplace/community-operators-84h5p"
Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.241633 4743 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"kube-api-access-9jrwd\" (UniqueName: \"kubernetes.io/projected/523f648f-43d2-4e73-81ff-e87f41f07ab1-kube-api-access-9jrwd\") pod \"community-operators-84h5p\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " pod="openshift-marketplace/community-operators-84h5p" Nov 22 11:07:39 crc kubenswrapper[4743]: I1122 11:07:39.307649 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84h5p" Nov 22 11:07:40 crc kubenswrapper[4743]: I1122 11:07:40.279697 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84h5p"] Nov 22 11:07:41 crc kubenswrapper[4743]: I1122 11:07:41.087351 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84h5p" event={"ID":"523f648f-43d2-4e73-81ff-e87f41f07ab1","Type":"ContainerStarted","Data":"ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97"} Nov 22 11:07:41 crc kubenswrapper[4743]: I1122 11:07:41.088053 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84h5p" event={"ID":"523f648f-43d2-4e73-81ff-e87f41f07ab1","Type":"ContainerStarted","Data":"5d953a33a708272bba189c85f4274caf01b1e24ac2492b769a099d39aacaba42"} Nov 22 11:07:42 crc kubenswrapper[4743]: I1122 11:07:42.098862 4743 generic.go:334] "Generic (PLEG): container finished" podID="523f648f-43d2-4e73-81ff-e87f41f07ab1" containerID="ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97" exitCode=0 Nov 22 11:07:42 crc kubenswrapper[4743]: I1122 11:07:42.099291 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84h5p" event={"ID":"523f648f-43d2-4e73-81ff-e87f41f07ab1","Type":"ContainerDied","Data":"ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97"} Nov 22 11:07:43 crc kubenswrapper[4743]: I1122 11:07:43.112647 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84h5p" event={"ID":"523f648f-43d2-4e73-81ff-e87f41f07ab1","Type":"ContainerStarted","Data":"ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521"} Nov 22 11:07:45 crc kubenswrapper[4743]: I1122 11:07:45.134520 4743 generic.go:334] "Generic (PLEG): container finished" podID="523f648f-43d2-4e73-81ff-e87f41f07ab1" containerID="ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521" exitCode=0 Nov 22 11:07:45 crc kubenswrapper[4743]: I1122 11:07:45.134644 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84h5p" event={"ID":"523f648f-43d2-4e73-81ff-e87f41f07ab1","Type":"ContainerDied","Data":"ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521"} Nov 22 11:07:45 crc kubenswrapper[4743]: I1122 11:07:45.152873 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:07:45 crc kubenswrapper[4743]: E1122 11:07:45.153864 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:07:46 crc kubenswrapper[4743]: I1122 11:07:46.160953 4743 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84h5p" event={"ID":"523f648f-43d2-4e73-81ff-e87f41f07ab1","Type":"ContainerStarted","Data":"b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9"} Nov 22 11:07:46 crc kubenswrapper[4743]: I1122 11:07:46.187805 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-84h5p" podStartSLOduration=4.594916149 podStartE2EDuration="8.187783921s" podCreationTimestamp="2025-11-22 11:07:38 +0000 UTC" firstStartedPulling="2025-11-22 11:07:42.10384085 +0000 UTC m=+9935.810201902" lastFinishedPulling="2025-11-22 11:07:45.696708622 +0000 UTC m=+9939.403069674" observedRunningTime="2025-11-22 11:07:46.185362662 +0000 UTC m=+9939.891723724" watchObservedRunningTime="2025-11-22 11:07:46.187783921 +0000 UTC m=+9939.894144973" Nov 22 11:07:49 crc kubenswrapper[4743]: I1122 11:07:49.308387 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-84h5p" Nov 22 11:07:49 crc kubenswrapper[4743]: I1122 11:07:49.309884 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-84h5p" Nov 22 11:07:49 crc kubenswrapper[4743]: I1122 11:07:49.371514 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-84h5p" Nov 22 11:07:50 crc kubenswrapper[4743]: I1122 11:07:50.288446 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-84h5p" Nov 22 11:07:50 crc kubenswrapper[4743]: I1122 11:07:50.381736 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84h5p"] Nov 22 11:07:52 crc kubenswrapper[4743]: I1122 11:07:52.237115 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-84h5p" podUID="523f648f-43d2-4e73-81ff-e87f41f07ab1" containerName="registry-server" containerID="cri-o://b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9" gracePeriod=2 Nov 22 11:07:52 crc kubenswrapper[4743]: I1122 11:07:52.799402 4743 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84h5p" Nov 22 11:07:52 crc kubenswrapper[4743]: I1122 11:07:52.900856 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jrwd\" (UniqueName: \"kubernetes.io/projected/523f648f-43d2-4e73-81ff-e87f41f07ab1-kube-api-access-9jrwd\") pod \"523f648f-43d2-4e73-81ff-e87f41f07ab1\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " Nov 22 11:07:52 crc kubenswrapper[4743]: I1122 11:07:52.901143 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-catalog-content\") pod \"523f648f-43d2-4e73-81ff-e87f41f07ab1\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " Nov 22 11:07:52 crc kubenswrapper[4743]: I1122 11:07:52.901282 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-utilities\") pod \"523f648f-43d2-4e73-81ff-e87f41f07ab1\" (UID: \"523f648f-43d2-4e73-81ff-e87f41f07ab1\") " Nov 22 11:07:52 crc kubenswrapper[4743]: I1122 11:07:52.902480 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-utilities" (OuterVolumeSpecName: "utilities") pod "523f648f-43d2-4e73-81ff-e87f41f07ab1" (UID: "523f648f-43d2-4e73-81ff-e87f41f07ab1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 11:07:52 crc kubenswrapper[4743]: I1122 11:07:52.909819 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/523f648f-43d2-4e73-81ff-e87f41f07ab1-kube-api-access-9jrwd" (OuterVolumeSpecName: "kube-api-access-9jrwd") pod "523f648f-43d2-4e73-81ff-e87f41f07ab1" (UID: "523f648f-43d2-4e73-81ff-e87f41f07ab1"). InnerVolumeSpecName "kube-api-access-9jrwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Nov 22 11:07:52 crc kubenswrapper[4743]: I1122 11:07:52.956439 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "523f648f-43d2-4e73-81ff-e87f41f07ab1" (UID: "523f648f-43d2-4e73-81ff-e87f41f07ab1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.005511 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-catalog-content\") on node \"crc\" DevicePath \"\"" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.005775 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/523f648f-43d2-4e73-81ff-e87f41f07ab1-utilities\") on node \"crc\" DevicePath \"\"" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.005787 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jrwd\" (UniqueName: \"kubernetes.io/projected/523f648f-43d2-4e73-81ff-e87f41f07ab1-kube-api-access-9jrwd\") on node \"crc\" DevicePath \"\"" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.255488 4743 generic.go:334] "Generic (PLEG): container finished" podID="523f648f-43d2-4e73-81ff-e87f41f07ab1" containerID="b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9" exitCode=0 Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.255548 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84h5p" event={"ID":"523f648f-43d2-4e73-81ff-e87f41f07ab1","Type":"ContainerDied","Data":"b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9"} Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.255568 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84h5p" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.255617 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84h5p" event={"ID":"523f648f-43d2-4e73-81ff-e87f41f07ab1","Type":"ContainerDied","Data":"5d953a33a708272bba189c85f4274caf01b1e24ac2492b769a099d39aacaba42"} Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.255684 4743 scope.go:117] "RemoveContainer" containerID="b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.285903 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84h5p"] Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.294366 4743 scope.go:117] "RemoveContainer" containerID="ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.301390 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-84h5p"] Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.646461 4743 scope.go:117] "RemoveContainer" containerID="ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.721895 4743 scope.go:117] "RemoveContainer" containerID="b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9" Nov 22 11:07:53 crc kubenswrapper[4743]: E1122 11:07:53.722713 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9\": container with ID starting with b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9 not found: ID does not exist" containerID="b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.722769 
4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9"} err="failed to get container status \"b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9\": rpc error: code = NotFound desc = could not find container \"b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9\": container with ID starting with b9e8951cbae1769b5c59379461ace99770efbba39b26350439f49223879a4ed9 not found: ID does not exist" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.722813 4743 scope.go:117] "RemoveContainer" containerID="ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521" Nov 22 11:07:53 crc kubenswrapper[4743]: E1122 11:07:53.725364 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521\": container with ID starting with ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521 not found: ID does not exist" containerID="ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.725395 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521"} err="failed to get container status \"ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521\": rpc error: code = NotFound desc = could not find container \"ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521\": container with ID starting with ff8c184141df0342788dc595ff6b25a9a3a7d5d638527b2aa714dc22d5899521 not found: ID does not exist" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.725417 4743 scope.go:117] "RemoveContainer" containerID="ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97" Nov 22 11:07:53 crc kubenswrapper[4743]: E1122 11:07:53.730969 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97\": container with ID starting with ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97 not found: ID does not exist" containerID="ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97" Nov 22 11:07:53 crc kubenswrapper[4743]: I1122 11:07:53.731021 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97"} err="failed to get container status \"ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97\": rpc error: code = NotFound desc = could not find container \"ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97\": container with ID starting with ad51d1cdd4a7ad01d9061d4e8272aa0e9dc2d4eefa4582820a7eb5a02166fb97 not found: ID does not exist" Nov 22 11:07:55 crc kubenswrapper[4743]: I1122 11:07:55.181430 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="523f648f-43d2-4e73-81ff-e87f41f07ab1" path="/var/lib/kubelet/pods/523f648f-43d2-4e73-81ff-e87f41f07ab1/volumes" Nov 22 11:07:59 crc kubenswrapper[4743]: I1122 11:07:59.152323 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:07:59 crc kubenswrapper[4743]: E1122 11:07:59.153555 4743 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:08:12 crc kubenswrapper[4743]: I1122 11:08:12.153531 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:08:12 crc kubenswrapper[4743]: E1122 11:08:12.154779 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:08:25 crc kubenswrapper[4743]: I1122 11:08:25.151990 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:08:25 crc kubenswrapper[4743]: E1122 11:08:25.153310 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:08:38 crc kubenswrapper[4743]: I1122 11:08:38.152619 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:08:38 crc kubenswrapper[4743]: E1122 11:08:38.153987 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:08:51 crc kubenswrapper[4743]: I1122 11:08:51.151736 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:08:51 crc kubenswrapper[4743]: E1122 11:08:51.152454 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:09:06 crc kubenswrapper[4743]: I1122 11:09:06.152387 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:09:06 crc kubenswrapper[4743]: E1122 11:09:06.153247 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:09:21 crc kubenswrapper[4743]: I1122 11:09:21.152069 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:09:21 crc kubenswrapper[4743]: E1122 11:09:21.153241 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:09:35 crc kubenswrapper[4743]: I1122 11:09:35.152745 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:09:35 crc kubenswrapper[4743]: E1122 11:09:35.153790 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:09:49 crc kubenswrapper[4743]: I1122 11:09:49.152536 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:09:49 crc kubenswrapper[4743]: E1122 11:09:49.153708 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:09:59 crc kubenswrapper[4743]: I1122 11:09:59.117436 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_8fb2090e-f38d-4934-99d3-8756dc9552f2/init-config-reloader/0.log" Nov 22 11:09:59 crc kubenswrapper[4743]: I1122 11:09:59.451235 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_8fb2090e-f38d-4934-99d3-8756dc9552f2/init-config-reloader/0.log" Nov 22 11:09:59 crc kubenswrapper[4743]: I1122 11:09:59.472694 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_8fb2090e-f38d-4934-99d3-8756dc9552f2/alertmanager/0.log" Nov 22 11:09:59 crc kubenswrapper[4743]: I1122 11:09:59.642954 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_8fb2090e-f38d-4934-99d3-8756dc9552f2/config-reloader/0.log" Nov 22 11:09:59 crc kubenswrapper[4743]: I1122 11:09:59.713173 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b402e4ee-d154-408c-9b02-b5966ebda7f1/aodh-api/0.log" Nov 22 11:09:59 crc kubenswrapper[4743]: I1122 11:09:59.854270 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b402e4ee-d154-408c-9b02-b5966ebda7f1/aodh-evaluator/0.log" 
Nov 22 11:09:59 crc kubenswrapper[4743]: I1122 11:09:59.917378 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b402e4ee-d154-408c-9b02-b5966ebda7f1/aodh-listener/0.log"
Nov 22 11:09:59 crc kubenswrapper[4743]: I1122 11:09:59.980430 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_b402e4ee-d154-408c-9b02-b5966ebda7f1/aodh-notifier/0.log"
Nov 22 11:10:00 crc kubenswrapper[4743]: I1122 11:10:00.177406 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6bcdbc7bc8-2tcrc_6e25372c-5d60-43bb-94e2-bb2dbe50da35/barbican-api/0.log"
Nov 22 11:10:00 crc kubenswrapper[4743]: I1122 11:10:00.222063 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-6bcdbc7bc8-2tcrc_6e25372c-5d60-43bb-94e2-bb2dbe50da35/barbican-api-log/0.log"
Nov 22 11:10:00 crc kubenswrapper[4743]: I1122 11:10:00.399350 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75df9d877b-gr58l_e75ef71d-a2f3-4bf0-9b91-9116d4ebedce/barbican-keystone-listener/0.log"
Nov 22 11:10:00 crc kubenswrapper[4743]: I1122 11:10:00.444633 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-75df9d877b-gr58l_e75ef71d-a2f3-4bf0-9b91-9116d4ebedce/barbican-keystone-listener-log/0.log"
Nov 22 11:10:01 crc kubenswrapper[4743]: I1122 11:10:01.151539 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56b7f9968f-tlnnl_a3b90d81-ea60-48b8-911b-ba9cfefd71e8/barbican-worker/0.log"
Nov 22 11:10:01 crc kubenswrapper[4743]: I1122 11:10:01.195894 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56b7f9968f-tlnnl_a3b90d81-ea60-48b8-911b-ba9cfefd71e8/barbican-worker-log/0.log"
Nov 22 11:10:01 crc kubenswrapper[4743]: I1122 11:10:01.429481 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-zmtf8_78397a11-5fa6-4b3d-9c5b-09f32678adca/bootstrap-openstack-openstack-cell1/0.log"
Nov 22 11:10:01 crc kubenswrapper[4743]: I1122 11:10:01.533533 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_206dab94-6e44-48a4-8ed8-888e77d0ccd8/ceilometer-central-agent/0.log"
Nov 22 11:10:01 crc kubenswrapper[4743]: I1122 11:10:01.549833 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_206dab94-6e44-48a4-8ed8-888e77d0ccd8/ceilometer-notification-agent/0.log"
Nov 22 11:10:01 crc kubenswrapper[4743]: I1122 11:10:01.692604 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_206dab94-6e44-48a4-8ed8-888e77d0ccd8/proxy-httpd/0.log"
Nov 22 11:10:01 crc kubenswrapper[4743]: I1122 11:10:01.768970 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_206dab94-6e44-48a4-8ed8-888e77d0ccd8/sg-core/0.log"
Nov 22 11:10:01 crc kubenswrapper[4743]: I1122 11:10:01.809280 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-vz8zn_f07941f2-7f6c-497e-ad3d-6719b60f0111/ceph-client-openstack-openstack-cell1/0.log"
Nov 22 11:10:02 crc kubenswrapper[4743]: I1122 11:10:02.105159 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d12c6f64-316e-4bd4-bdd3-5644106566a0/cinder-api-log/0.log"
Nov 22 11:10:02 crc kubenswrapper[4743]: I1122 11:10:02.170359 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d12c6f64-316e-4bd4-bdd3-5644106566a0/cinder-api/0.log"
Nov 22 11:10:02 crc kubenswrapper[4743]: I1122 11:10:02.454194 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_49819854-1fda-4d24-b2fc-43443fb9c1ef/cinder-backup/0.log"
Nov 22 11:10:02 crc kubenswrapper[4743]: I1122 11:10:02.455179 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_49819854-1fda-4d24-b2fc-43443fb9c1ef/probe/0.log"
Nov 22 11:10:02 crc kubenswrapper[4743]: I1122 11:10:02.487836 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b1d2762c-ee40-42c3-84a9-5057136d2208/cinder-scheduler/0.log"
Nov 22 11:10:02 crc kubenswrapper[4743]: I1122 11:10:02.737871 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_b1d2762c-ee40-42c3-84a9-5057136d2208/probe/0.log"
Nov 22 11:10:02 crc kubenswrapper[4743]: I1122 11:10:02.874129 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d2fab6eb-6c8a-405d-9bb4-b393c1706e4b/cinder-volume/0.log"
Nov 22 11:10:02 crc kubenswrapper[4743]: I1122 11:10:02.902062 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_d2fab6eb-6c8a-405d-9bb4-b393c1706e4b/probe/0.log"
Nov 22 11:10:03 crc kubenswrapper[4743]: I1122 11:10:03.153591 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1"
Nov 22 11:10:03 crc kubenswrapper[4743]: E1122 11:10:03.153872 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 11:10:03 crc kubenswrapper[4743]: I1122 11:10:03.660054 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-vqx9f_bac0d1da-40df-4390-a976-bd5e354f7e4e/configure-network-openstack-openstack-cell1/0.log"
Nov 22 11:10:03 crc kubenswrapper[4743]: I1122 11:10:03.712999 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-dtcqr_94c102fa-b4f9-413a-92fb-533fccbe12c7/configure-os-openstack-openstack-cell1/0.log"
Nov 22 11:10:03 crc kubenswrapper[4743]: I1122 11:10:03.740930 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d6cd869d9-wrqfx_62f45629-8f43-4b4c-a775-b49b0ed27106/init/0.log"
Nov 22 11:10:04 crc kubenswrapper[4743]: I1122 11:10:04.034469 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d6cd869d9-wrqfx_62f45629-8f43-4b4c-a775-b49b0ed27106/init/0.log"
Nov 22 11:10:04 crc kubenswrapper[4743]: I1122 11:10:04.128913 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-fqxgm_8fc576b2-ff5d-47bd-bfae-9cbcc92c632a/download-cache-openstack-openstack-cell1/0.log"
Nov 22 11:10:04 crc kubenswrapper[4743]: I1122 11:10:04.131072 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-d6cd869d9-wrqfx_62f45629-8f43-4b4c-a775-b49b0ed27106/dnsmasq-dns/0.log"
Nov 22 11:10:04 crc kubenswrapper[4743]: I1122 11:10:04.265347 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7c5e8908-5e95-40ef-bb4d-940cc5c38e49/glance-httpd/0.log"
Nov 22 11:10:04 crc kubenswrapper[4743]: I1122 11:10:04.327094 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_7c5e8908-5e95-40ef-bb4d-940cc5c38e49/glance-log/0.log"
Nov 22 11:10:04 crc kubenswrapper[4743]: I1122 11:10:04.411610 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b576b318-2d3e-40b4-bdb2-3582ab998152/glance-httpd/0.log"
Nov 22 11:10:04 crc kubenswrapper[4743]: I1122 11:10:04.417138 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b576b318-2d3e-40b4-bdb2-3582ab998152/glance-log/0.log"
Nov 22 11:10:04 crc kubenswrapper[4743]: I1122 11:10:04.682704 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-cdb5f48d-n9r67_f07af23e-fa72-4754-b88e-59aa7423bd8e/heat-api/0.log"
Nov 22 11:10:04 crc kubenswrapper[4743]: I1122 11:10:04.814378 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-d4ff5bc67-qmrg8_5eb34585-5e6a-440e-bfbf-694c35c35cd4/heat-cfnapi/0.log"
Nov 22 11:10:04 crc kubenswrapper[4743]: I1122 11:10:04.897894 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-58667d47cd-44kmd_3ad7f997-13cd-4561-8de6-17685d0d649a/heat-engine/0.log"
Nov 22 11:10:05 crc kubenswrapper[4743]: I1122 11:10:05.086963 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-76456674f-grnzr_b30c1d40-0697-4337-ba40-9090dc6988a5/horizon/0.log"
Nov 22 11:10:05 crc kubenswrapper[4743]: I1122 11:10:05.147882 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-76456674f-grnzr_b30c1d40-0697-4337-ba40-9090dc6988a5/horizon-log/0.log"
Nov 22 11:10:05 crc kubenswrapper[4743]: I1122 11:10:05.197883 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-77xtc_381c96a8-6592-46e2-b5b7-2000e2577d5c/install-certs-openstack-openstack-cell1/0.log"
Nov 22 11:10:05 crc kubenswrapper[4743]: I1122 11:10:05.395081 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-6kt62_b46ca6e3-19cc-454d-85ce-c57f91d88e20/install-os-openstack-openstack-cell1/0.log"
Nov 22 11:10:05 crc kubenswrapper[4743]: I1122 11:10:05.563889 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29396761-5jwng_882fc21a-125e-4e4c-816d-d273f8bc6078/keystone-cron/0.log"
Nov 22 11:10:05 crc kubenswrapper[4743]: I1122 11:10:05.621398 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-b8795ddf-b2cbj_dcc88110-2290-4a35-99da-2ec2d74d262a/keystone-api/0.log"
Nov 22 11:10:05 crc kubenswrapper[4743]: I1122 11:10:05.789921 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29396821-kzkw8_c8e48404-7c25-4096-8ea8-7e1036cca403/keystone-cron/0.log"
Nov 22 11:10:05 crc kubenswrapper[4743]: I1122 11:10:05.860339 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_04f1df88-8ca3-4284-b010-0c09b8acde5f/kube-state-metrics/0.log"
Nov 22 11:10:06 crc kubenswrapper[4743]: I1122 11:10:06.001596 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-vt6f7_65ef4bef-62b0-4592-94a2-d93d8679ce08/libvirt-openstack-openstack-cell1/0.log"
Nov 22 11:10:06 crc kubenswrapper[4743]: I1122 11:10:06.118236 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_221a89ef-fff9-464c-a3db-61deeb85a20b/manila-api-log/0.log"
Nov 22 11:10:06 crc kubenswrapper[4743]: I1122 11:10:06.244892 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_221a89ef-fff9-464c-a3db-61deeb85a20b/manila-api/0.log"
Nov 22 11:10:06 crc kubenswrapper[4743]: I1122 11:10:06.358359 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_053bd588-f677-48d5-b22d-93b3a70e8c4c/manila-scheduler/0.log"
Nov 22 11:10:06 crc kubenswrapper[4743]: I1122 11:10:06.359849 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_053bd588-f677-48d5-b22d-93b3a70e8c4c/probe/0.log"
Nov 22 11:10:06 crc kubenswrapper[4743]: I1122 11:10:06.549854 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_1e8ba118-c440-473d-a783-ff6a8e2e8ee5/probe/0.log"
Nov 22 11:10:06 crc kubenswrapper[4743]: I1122 11:10:06.559497 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_1e8ba118-c440-473d-a783-ff6a8e2e8ee5/manila-share/0.log"
Nov 22 11:10:06 crc kubenswrapper[4743]: I1122 11:10:06.879413 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7647bcffd5-9jhp5_25112c84-a50c-424f-8f5b-6b815720eaa7/neutron-httpd/0.log"
Nov 22 11:10:07 crc kubenswrapper[4743]: I1122 11:10:06.999719 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7647bcffd5-9jhp5_25112c84-a50c-424f-8f5b-6b815720eaa7/neutron-api/0.log"
Nov 22 11:10:07 crc kubenswrapper[4743]: I1122 11:10:07.132290 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-nxh2g_8f88c0f4-ddcb-4924-ab2a-3179a3f1f616/neutron-dhcp-openstack-openstack-cell1/0.log"
Nov 22 11:10:07 crc kubenswrapper[4743]: I1122 11:10:07.485687 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-q9bgn_28484c70-513c-41a2-b0f7-5922002be895/neutron-metadata-openstack-openstack-cell1/0.log"
Nov 22 11:10:07 crc kubenswrapper[4743]: I1122 11:10:07.571257 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-tzsvk_4cac0424-ba03-4f34-8433-acbbdcbaeb73/neutron-sriov-openstack-openstack-cell1/0.log"
Nov 22 11:10:07 crc kubenswrapper[4743]: I1122 11:10:07.661425 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d74dbd18-0a15-48c8-98f8-c9f4c67e82bd/nova-api-api/0.log"
Nov 22 11:10:07 crc kubenswrapper[4743]: I1122 11:10:07.923819 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_d74dbd18-0a15-48c8-98f8-c9f4c67e82bd/nova-api-log/0.log"
Nov 22 11:10:07 crc kubenswrapper[4743]: I1122 11:10:07.995877 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_0900ac26-4cb6-4e32-bce4-b2cfce7a18a5/nova-cell0-conductor-conductor/0.log"
Nov 22 11:10:08 crc kubenswrapper[4743]: I1122 11:10:08.520058 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_6d4f3395-9c57-463a-8d49-66ffad381e6c/nova-cell1-conductor-conductor/0.log"
Nov 22 11:10:08 crc kubenswrapper[4743]: I1122 11:10:08.673401 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ca4774d0-09fd-4ea1-8445-3f3d7ecdb3e0/nova-cell1-novncproxy-novncproxy/0.log"
Nov 22 11:10:08 crc kubenswrapper[4743]: I1122 11:10:08.867275 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellfxzms_aece6c0f-51a9-4480-8b13-0da51fca1fc8/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log"
Nov 22 11:10:09 crc kubenswrapper[4743]: I1122 11:10:09.148229 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-94fht_c7eed5f0-702c-4714-ab82-9d23577c2a5f/nova-cell1-openstack-openstack-cell1/0.log"
Nov 22 11:10:09 crc kubenswrapper[4743]: I1122 11:10:09.229127 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e4f6482d-edd6-49ad-8ef5-625281832a7b/nova-metadata-metadata/0.log"
Nov 22 11:10:09 crc kubenswrapper[4743]: I1122 11:10:09.255510 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_e4f6482d-edd6-49ad-8ef5-625281832a7b/nova-metadata-log/0.log"
Nov 22 11:10:09 crc kubenswrapper[4743]: I1122 11:10:09.479666 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_248b4908-139e-45b1-a6cf-b398b9e23b90/nova-scheduler-scheduler/0.log"
Nov 22 11:10:09 crc kubenswrapper[4743]: I1122 11:10:09.567649 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-69f8b8c646-hc89r_ed3ee158-5b93-4fa3-b8ef-13f9e0f19747/init/0.log"
Nov 22 11:10:09 crc kubenswrapper[4743]: I1122 11:10:09.773881 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-69f8b8c646-hc89r_ed3ee158-5b93-4fa3-b8ef-13f9e0f19747/init/0.log"
Nov 22 11:10:09 crc kubenswrapper[4743]: I1122 11:10:09.845317 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-69f8b8c646-hc89r_ed3ee158-5b93-4fa3-b8ef-13f9e0f19747/octavia-api-provider-agent/0.log"
Nov 22 11:10:10 crc kubenswrapper[4743]: I1122 11:10:10.072019 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-hb4xf_48db469e-30ab-4c16-9720-4c7d33df686f/init/0.log"
Nov 22 11:10:10 crc kubenswrapper[4743]: I1122 11:10:10.102732 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-69f8b8c646-hc89r_ed3ee158-5b93-4fa3-b8ef-13f9e0f19747/octavia-api/0.log"
Nov 22 11:10:10 crc kubenswrapper[4743]: I1122 11:10:10.335490 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-hb4xf_48db469e-30ab-4c16-9720-4c7d33df686f/init/0.log"
Nov 22 11:10:10 crc kubenswrapper[4743]: I1122 11:10:10.397855 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-r45h9_dea004b0-e9a6-4823-8692-af0a4c143d7d/init/0.log"
Nov 22 11:10:10 crc kubenswrapper[4743]: I1122 11:10:10.450763 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-hb4xf_48db469e-30ab-4c16-9720-4c7d33df686f/octavia-healthmanager/0.log"
Nov 22 11:10:10 crc kubenswrapper[4743]: I1122 11:10:10.774142 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-r45h9_dea004b0-e9a6-4823-8692-af0a4c143d7d/init/0.log"
Nov 22 11:10:10 crc kubenswrapper[4743]: I1122 11:10:10.782591 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-zwj9g_52777522-005a-4fa2-97dd-3b3c26efc6f9/init/0.log"
Nov 22 11:10:10 crc kubenswrapper[4743]: I1122 11:10:10.937697 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-r45h9_dea004b0-e9a6-4823-8692-af0a4c143d7d/octavia-housekeeping/0.log"
Nov 22 11:10:11 crc kubenswrapper[4743]: I1122 11:10:11.104980 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-zwj9g_52777522-005a-4fa2-97dd-3b3c26efc6f9/octavia-amphora-httpd/0.log"
Nov 22 11:10:11 crc kubenswrapper[4743]: I1122 11:10:11.192660 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-zwj9g_52777522-005a-4fa2-97dd-3b3c26efc6f9/init/0.log"
Nov 22 11:10:11 crc kubenswrapper[4743]: I1122 11:10:11.245627 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-446fq_a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748/init/0.log"
Nov 22 11:10:11 crc kubenswrapper[4743]: I1122 11:10:11.560277 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-446fq_a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748/octavia-rsyslog/0.log"
Nov 22 11:10:11 crc kubenswrapper[4743]: I1122 11:10:11.612870 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-5b2dc_9d8f4208-5b31-406d-a7cd-813b92c49e16/init/0.log"
Nov 22 11:10:11 crc kubenswrapper[4743]: I1122 11:10:11.674108 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-446fq_a3a49d8d-dd5f-4063-a6b4-99a5bcdf4748/init/0.log"
Nov 22 11:10:12 crc kubenswrapper[4743]: I1122 11:10:12.080239 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-5b2dc_9d8f4208-5b31-406d-a7cd-813b92c49e16/init/0.log"
Nov 22 11:10:12 crc kubenswrapper[4743]: I1122 11:10:12.143062 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e0e64bc7-078f-4609-add5-ac4679314d0a/mysql-bootstrap/0.log"
Nov 22 11:10:12 crc kubenswrapper[4743]: I1122 11:10:12.439334 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-5b2dc_9d8f4208-5b31-406d-a7cd-813b92c49e16/octavia-worker/0.log"
Nov 22 11:10:12 crc kubenswrapper[4743]: I1122 11:10:12.553619 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e0e64bc7-078f-4609-add5-ac4679314d0a/galera/0.log"
Nov 22 11:10:12 crc kubenswrapper[4743]: I1122 11:10:12.573034 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_e0e64bc7-078f-4609-add5-ac4679314d0a/mysql-bootstrap/0.log"
Nov 22 11:10:12 crc kubenswrapper[4743]: I1122 11:10:12.693293 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_087b5455-e53b-49da-a7d5-6d2317df7d4f/mysql-bootstrap/0.log"
Nov 22 11:10:12 crc kubenswrapper[4743]: I1122 11:10:12.868513 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_ab061051-d1a4-4a0d-bd76-00fdb28c7a13/openstackclient/0.log"
Nov 22 11:10:12 crc kubenswrapper[4743]: I1122 11:10:12.972353 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_087b5455-e53b-49da-a7d5-6d2317df7d4f/mysql-bootstrap/0.log"
Nov 22 11:10:12 crc kubenswrapper[4743]: I1122 11:10:12.980743 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_087b5455-e53b-49da-a7d5-6d2317df7d4f/galera/0.log"
Nov 22 11:10:13 crc kubenswrapper[4743]: I1122 11:10:13.212922 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-lzbkj_5bb3a009-b9ed-4054-ac5f-c7bd866f9634/ovn-controller/0.log"
Nov 22 11:10:13 crc kubenswrapper[4743]: I1122 11:10:13.351798 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l4tln_580be048-ba5a-4927-bd45-28d898c01ca1/openstack-network-exporter/0.log"
Nov 22 11:10:13 crc kubenswrapper[4743]: I1122 11:10:13.546868 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5jq6t_005d696c-bc18-45cc-bcd9-8d22455874e7/ovsdb-server-init/0.log"
Nov 22 11:10:13 crc kubenswrapper[4743]: I1122 11:10:13.742615 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5jq6t_005d696c-bc18-45cc-bcd9-8d22455874e7/ovsdb-server-init/0.log"
Nov 22 11:10:13 crc kubenswrapper[4743]: I1122 11:10:13.759906 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5jq6t_005d696c-bc18-45cc-bcd9-8d22455874e7/ovs-vswitchd/0.log"
Nov 22 11:10:13 crc kubenswrapper[4743]: I1122 11:10:13.865306 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5jq6t_005d696c-bc18-45cc-bcd9-8d22455874e7/ovsdb-server/0.log"
Nov 22 11:10:14 crc kubenswrapper[4743]: I1122 11:10:14.008186 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dd13203f-a0d7-40f3-8e55-62f38fdc76fe/openstack-network-exporter/0.log"
Nov 22 11:10:14 crc kubenswrapper[4743]: I1122 11:10:14.038396 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dd13203f-a0d7-40f3-8e55-62f38fdc76fe/ovn-northd/0.log"
Nov 22 11:10:14 crc kubenswrapper[4743]: I1122 11:10:14.199351 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-xwbpr_3e80ff58-1768-4d4a-b759-d9d882510ff8/ovn-openstack-openstack-cell1/0.log"
Nov 22 11:10:14 crc kubenswrapper[4743]: I1122 11:10:14.322033 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_207fcbef-06d2-4cd9-85d1-f6114591092f/openstack-network-exporter/0.log"
Nov 22 11:10:14 crc kubenswrapper[4743]: I1122 11:10:14.426843 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_207fcbef-06d2-4cd9-85d1-f6114591092f/ovsdbserver-nb/0.log"
Nov 22 11:10:14 crc kubenswrapper[4743]: I1122 11:10:14.571549 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_3d13534a-43da-4352-b61e-40779ab62237/openstack-network-exporter/0.log"
Nov 22 11:10:14 crc kubenswrapper[4743]: I1122 11:10:14.614020 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_3d13534a-43da-4352-b61e-40779ab62237/ovsdbserver-nb/0.log"
Nov 22 11:10:14 crc kubenswrapper[4743]: I1122 11:10:14.800915 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0c5e138b-6d40-45d7-b138-bf86c812bd0c/ovsdbserver-nb/0.log"
Nov 22 11:10:14 crc kubenswrapper[4743]: I1122 11:10:14.855477 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0c5e138b-6d40-45d7-b138-bf86c812bd0c/openstack-network-exporter/0.log"
Nov 22 11:10:15 crc kubenswrapper[4743]: I1122 11:10:15.014177 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_81ac2f2b-4109-4ed3-868d-ea3572055751/ovsdbserver-sb/0.log"
Nov 22 11:10:15 crc kubenswrapper[4743]: I1122 11:10:15.036573 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_81ac2f2b-4109-4ed3-868d-ea3572055751/openstack-network-exporter/0.log"
Nov 22 11:10:15 crc kubenswrapper[4743]: I1122 11:10:15.139302 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_64b18e0c-c33c-4f05-93e6-3b7ffc82e811/openstack-network-exporter/0.log"
Nov 22 11:10:15 crc kubenswrapper[4743]: I1122 11:10:15.251218 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_64b18e0c-c33c-4f05-93e6-3b7ffc82e811/ovsdbserver-sb/0.log"
Nov 22 11:10:15 crc kubenswrapper[4743]: I1122 11:10:15.378914 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_2acf2bf5-0ed1-4513-ba48-a5e7a63a6002/openstack-network-exporter/0.log"
Nov 22 11:10:15 crc kubenswrapper[4743]: I1122 11:10:15.461019 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_2acf2bf5-0ed1-4513-ba48-a5e7a63a6002/ovsdbserver-sb/0.log"
Nov 22 11:10:15 crc kubenswrapper[4743]: I1122 11:10:15.666388 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5f9b766bfb-hp7ll_c4984786-114b-47c3-9dac-ed7029d060d5/placement-api/0.log"
Nov 22 11:10:15 crc kubenswrapper[4743]: I1122 11:10:15.794796 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5f9b766bfb-hp7ll_c4984786-114b-47c3-9dac-ed7029d060d5/placement-log/0.log"
Nov 22 11:10:15 crc kubenswrapper[4743]: I1122 11:10:15.807949 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cvcsf2_54840e46-1eea-45a3-8028-b05dc2bb08e0/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log"
Nov 22 11:10:16 crc kubenswrapper[4743]: I1122 11:10:16.028869 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d45bcf1f-df0a-4470-acd9-62a70715936e/init-config-reloader/0.log"
Nov 22 11:10:16 crc kubenswrapper[4743]: I1122 11:10:16.221598 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d45bcf1f-df0a-4470-acd9-62a70715936e/config-reloader/0.log"
Nov 22 11:10:16 crc kubenswrapper[4743]: I1122 11:10:16.275182 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d45bcf1f-df0a-4470-acd9-62a70715936e/prometheus/0.log"
Nov 22 11:10:16 crc kubenswrapper[4743]: I1122 11:10:16.295348 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d45bcf1f-df0a-4470-acd9-62a70715936e/thanos-sidecar/0.log"
Nov 22 11:10:16 crc kubenswrapper[4743]: I1122 11:10:16.321616 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d45bcf1f-df0a-4470-acd9-62a70715936e/init-config-reloader/0.log"
Nov 22 11:10:16 crc kubenswrapper[4743]: I1122 11:10:16.553417 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7/setup-container/0.log"
Nov 22 11:10:16 crc kubenswrapper[4743]: I1122 11:10:16.744291 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7/setup-container/0.log"
Nov 22 11:10:16 crc kubenswrapper[4743]: I1122 11:10:16.783215 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_fdf7c2ce-bdf7-4156-a7a6-5ba0b43e63a7/rabbitmq/0.log"
Nov 22 11:10:16 crc kubenswrapper[4743]: I1122 11:10:16.886792 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1c512ff1-fd60-4b1c-a421-fd277d259d35/setup-container/0.log"
Nov 22 11:10:17 crc kubenswrapper[4743]: I1122 11:10:17.106167 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1c512ff1-fd60-4b1c-a421-fd277d259d35/setup-container/0.log"
Nov 22 11:10:17 crc kubenswrapper[4743]: I1122 11:10:17.164613 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-w4vp7_d6efb775-64a4-414b-a8e3-8169706ba3de/reboot-os-openstack-openstack-cell1/0.log"
Nov 22 11:10:17 crc kubenswrapper[4743]: I1122 11:10:17.221718 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_1c512ff1-fd60-4b1c-a421-fd277d259d35/rabbitmq/0.log"
Nov 22 11:10:17 crc kubenswrapper[4743]: I1122 11:10:17.372942 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-94f4d_d2ad6d66-1cc7-4e28-aae8-14d855606aeb/run-os-openstack-openstack-cell1/0.log"
Nov 22 11:10:17 crc kubenswrapper[4743]: I1122 11:10:17.507634 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-2wp9n_4aa15537-2948-44af-b30f-ff55c4d4b86d/ssh-known-hosts-openstack/0.log"
Nov 22 11:10:17 crc kubenswrapper[4743]: I1122 11:10:17.796293 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-bcrp6_7e221c36-eb02-4ce0-8eda-c568c7adf15c/telemetry-openstack-openstack-cell1/0.log"
Nov 22 11:10:17 crc kubenswrapper[4743]: I1122 11:10:17.897558 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-ccs4q_60fc17e1-9296-450c-979c-bd863fb3dce6/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log"
Nov 22 11:10:18 crc kubenswrapper[4743]: I1122 11:10:18.040048 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-xdzxc_29a07311-f1b0-47cd-bf93-0f13bcd05354/validate-network-openstack-openstack-cell1/0.log"
Nov 22 11:10:18 crc kubenswrapper[4743]: I1122 11:10:18.152768 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1"
Nov 22 11:10:18 crc kubenswrapper[4743]: E1122 11:10:18.153033 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 11:10:19 crc kubenswrapper[4743]: I1122 11:10:19.579364 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3362e3d8-3a46-4ef9-abb5-0c75ea0d28ce/memcached/0.log"
Nov 22 11:10:32 crc kubenswrapper[4743]: I1122 11:10:32.152195 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1"
Nov 22 11:10:32 crc kubenswrapper[4743]: E1122 11:10:32.153412 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 11:10:42 crc kubenswrapper[4743]: I1122 11:10:42.891245 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn_6c17beb4-33b2-4e6d-9ae5-61f396f3c37f/util/0.log"
Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.058269 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn_6c17beb4-33b2-4e6d-9ae5-61f396f3c37f/util/0.log"
Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.096982 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn_6c17beb4-33b2-4e6d-9ae5-61f396f3c37f/pull/0.log"
Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.106467 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn_6c17beb4-33b2-4e6d-9ae5-61f396f3c37f/pull/0.log"
Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.152392 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1"
Nov 22 11:10:43 crc kubenswrapper[4743]: E1122 11:10:43.152838 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86"
Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.386273 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn_6c17beb4-33b2-4e6d-9ae5-61f396f3c37f/util/0.log"
Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.419817 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn_6c17beb4-33b2-4e6d-9ae5-61f396f3c37f/extract/0.log"
Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.432744 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_424849c5a998ab4c2e7db59c17f5d2f5cded7f20cb8988f3e9535d5064qjpzn_6c17beb4-33b2-4e6d-9ae5-61f396f3c37f/pull/0.log"
Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.630790 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-xts6s_9f1b446c-1023-4682-889f-97abca903826/kube-rbac-proxy/0.log"
Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.750403 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75fb479bcc-xts6s_9f1b446c-1023-4682-889f-97abca903826/manager/0.log"
Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.751466 4743 log.go:25] "Finished parsing log file"
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-8vd9m_ac624665-bb51-4c61-b213-cb07bd43eafe/kube-rbac-proxy/0.log" Nov 22 11:10:43 crc kubenswrapper[4743]: I1122 11:10:43.980558 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-6498cbf48f-8vd9m_ac624665-bb51-4c61-b213-cb07bd43eafe/manager/0.log" Nov 22 11:10:44 crc kubenswrapper[4743]: I1122 11:10:44.000695 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-kcrpp_657d8d61-7be7-42a6-8472-2d70e55a8428/kube-rbac-proxy/0.log" Nov 22 11:10:44 crc kubenswrapper[4743]: I1122 11:10:44.091703 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-767ccfd65f-kcrpp_657d8d61-7be7-42a6-8472-2d70e55a8428/manager/0.log" Nov 22 11:10:44 crc kubenswrapper[4743]: I1122 11:10:44.284797 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-k5rhz_a729ba89-b0fe-4363-b4b4-ffe21f0c627c/kube-rbac-proxy/0.log" Nov 22 11:10:44 crc kubenswrapper[4743]: I1122 11:10:44.453095 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7969689c84-k5rhz_a729ba89-b0fe-4363-b4b4-ffe21f0c627c/manager/0.log" Nov 22 11:10:44 crc kubenswrapper[4743]: I1122 11:10:44.541900 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-7bcj5_81cdc04a-86d9-488d-b854-d941f3f5632e/kube-rbac-proxy/0.log" Nov 22 11:10:44 crc kubenswrapper[4743]: I1122 11:10:44.624749 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-56f54d6746-7bcj5_81cdc04a-86d9-488d-b854-d941f3f5632e/manager/0.log" Nov 22 11:10:44 crc kubenswrapper[4743]: I1122 11:10:44.713165 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-6wnss_f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f/kube-rbac-proxy/0.log" Nov 22 11:10:44 crc kubenswrapper[4743]: I1122 11:10:44.806867 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-598f69df5d-6wnss_f39b340f-3ca1-48cc-a7ca-d2f1cdba1d1f/manager/0.log" Nov 22 11:10:44 crc kubenswrapper[4743]: I1122 11:10:44.994991 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6dd8864d7c-gn9b8_25fd4b24-83f2-4a02-b086-ca0f03cb42a3/kube-rbac-proxy/0.log" Nov 22 11:10:45 crc kubenswrapper[4743]: I1122 11:10:45.255907 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-99b499f4-xbmln_89e94ee4-4365-4e56-a5a2-3d61bbbd8876/kube-rbac-proxy/0.log" Nov 22 11:10:45 crc kubenswrapper[4743]: I1122 11:10:45.260128 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-6dd8864d7c-gn9b8_25fd4b24-83f2-4a02-b086-ca0f03cb42a3/manager/0.log" Nov 22 11:10:45 crc kubenswrapper[4743]: I1122 11:10:45.335969 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-99b499f4-xbmln_89e94ee4-4365-4e56-a5a2-3d61bbbd8876/manager/0.log" Nov 22 11:10:45 crc kubenswrapper[4743]: I1122 11:10:45.469992 4743 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-ktmd4_e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb/kube-rbac-proxy/0.log" Nov 22 11:10:45 crc kubenswrapper[4743]: I1122 11:10:45.584083 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7454b96578-ktmd4_e2c45ecf-0fbe-4f18-a9f3-609bf6db83cb/manager/0.log" Nov 22 11:10:45 crc kubenswrapper[4743]: I1122 11:10:45.740273 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-579sf_50b5cda8-859c-49f0-92aa-601c16eb9a2a/kube-rbac-proxy/0.log" Nov 22 11:10:45 crc kubenswrapper[4743]: I1122 11:10:45.859416 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-58f887965d-579sf_50b5cda8-859c-49f0-92aa-601c16eb9a2a/manager/0.log" Nov 22 11:10:45 crc kubenswrapper[4743]: I1122 11:10:45.913532 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-vt47z_9d349409-980f-4605-bd87-d09fe812dd65/kube-rbac-proxy/0.log" Nov 22 11:10:46 crc kubenswrapper[4743]: I1122 11:10:46.023377 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-54b5986bb8-vt47z_9d349409-980f-4605-bd87-d09fe812dd65/manager/0.log" Nov 22 11:10:46 crc kubenswrapper[4743]: I1122 11:10:46.140933 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-klfvn_d8c50ae0-c8c9-4e87-9130-4c04d5b468ac/kube-rbac-proxy/0.log" Nov 22 11:10:46 crc kubenswrapper[4743]: I1122 11:10:46.255954 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-78bd47f458-klfvn_d8c50ae0-c8c9-4e87-9130-4c04d5b468ac/manager/0.log" Nov 22 11:10:46 crc kubenswrapper[4743]: I1122 11:10:46.355980 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-j682k_1b06dd20-2bb7-4ff2-aa77-997042af333e/kube-rbac-proxy/0.log" Nov 22 11:10:46 crc kubenswrapper[4743]: I1122 11:10:46.618387 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-cfbb9c588-j682k_1b06dd20-2bb7-4ff2-aa77-997042af333e/manager/0.log" Nov 22 11:10:46 crc kubenswrapper[4743]: I1122 11:10:46.631367 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-g6mzk_0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970/kube-rbac-proxy/0.log" Nov 22 11:10:46 crc kubenswrapper[4743]: I1122 11:10:46.697285 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-54cfbf4c7d-g6mzk_0c151bc9-2e8a-4422-a8ed-ecf3e3fa2970/manager/0.log" Nov 22 11:10:46 crc kubenswrapper[4743]: I1122 11:10:46.879114 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4_f5f27cf7-eaa5-4b71-84a6-94fac3920d39/kube-rbac-proxy/0.log" Nov 22 11:10:46 crc kubenswrapper[4743]: I1122 11:10:46.886305 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-8c7444f48-cdfd4_f5f27cf7-eaa5-4b71-84a6-94fac3920d39/manager/0.log" Nov 22 11:10:46 crc kubenswrapper[4743]: 
I1122 11:10:46.982752 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-654fc8b94c-cp8qz_b247b139-5fdf-426f-8ca6-6bcb58585963/kube-rbac-proxy/0.log" Nov 22 11:10:47 crc kubenswrapper[4743]: I1122 11:10:47.185556 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-58d789d48c-sv4c5_874fc1ac-e9dc-4948-ac24-da9140316fd8/kube-rbac-proxy/0.log" Nov 22 11:10:47 crc kubenswrapper[4743]: I1122 11:10:47.596260 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-58d789d48c-sv4c5_874fc1ac-e9dc-4948-ac24-da9140316fd8/operator/0.log" Nov 22 11:10:47 crc kubenswrapper[4743]: I1122 11:10:47.799536 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-f78gn_995bde5d-4b1f-4ee1-ab0e-eacb5f81c4ea/registry-server/0.log" Nov 22 11:10:47 crc kubenswrapper[4743]: I1122 11:10:47.881707 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-cv2tj_4c6b99d5-9791-40db-91fd-d74c80b2e3a7/kube-rbac-proxy/0.log" Nov 22 11:10:48 crc kubenswrapper[4743]: I1122 11:10:48.052087 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-54fc5f65b7-cv2tj_4c6b99d5-9791-40db-91fd-d74c80b2e3a7/manager/0.log" Nov 22 11:10:48 crc kubenswrapper[4743]: I1122 11:10:48.153614 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-z4flw_3420c6da-358d-4b5c-a383-e25fbc58a2ee/kube-rbac-proxy/0.log" Nov 22 11:10:48 crc kubenswrapper[4743]: I1122 11:10:48.318594 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5b797b8dff-z4flw_3420c6da-358d-4b5c-a383-e25fbc58a2ee/manager/0.log" Nov 22 11:10:48 crc kubenswrapper[4743]: I1122 11:10:48.359135 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-ht4k5_36e85576-c481-4424-aa1c-21a18036d239/operator/0.log" Nov 22 11:10:48 crc kubenswrapper[4743]: I1122 11:10:48.538637 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-jczhh_9f7a1db7-e801-4da6-b64b-f3babcfcd9c6/kube-rbac-proxy/0.log" Nov 22 11:10:48 crc kubenswrapper[4743]: I1122 11:10:48.697716 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-d656998f4-jczhh_9f7a1db7-e801-4da6-b64b-f3babcfcd9c6/manager/0.log" Nov 22 11:10:48 crc kubenswrapper[4743]: I1122 11:10:48.732525 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-rssfb_904fdb49-cc2c-443c-af9e-950b648018e9/kube-rbac-proxy/0.log" Nov 22 11:10:48 crc kubenswrapper[4743]: I1122 11:10:48.972226 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-55bp6_75667626-7d6a-46d0-b0b2-f627257967f4/kube-rbac-proxy/0.log" Nov 22 11:10:48 crc kubenswrapper[4743]: I1122 11:10:48.999176 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-b4c496f69-55bp6_75667626-7d6a-46d0-b0b2-f627257967f4/manager/0.log" Nov 22 11:10:49 crc 
kubenswrapper[4743]: I1122 11:10:49.077075 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6d4bf84b58-rssfb_904fdb49-cc2c-443c-af9e-950b648018e9/manager/0.log" Nov 22 11:10:49 crc kubenswrapper[4743]: I1122 11:10:49.292136 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-j6vgx_5c593905-2ee5-4990-9f9c-85ca81f38319/manager/0.log" Nov 22 11:10:49 crc kubenswrapper[4743]: I1122 11:10:49.340677 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-8c6448b9f-j6vgx_5c593905-2ee5-4990-9f9c-85ca81f38319/kube-rbac-proxy/0.log" Nov 22 11:10:49 crc kubenswrapper[4743]: I1122 11:10:49.680444 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-654fc8b94c-cp8qz_b247b139-5fdf-426f-8ca6-6bcb58585963/manager/0.log" Nov 22 11:10:57 crc kubenswrapper[4743]: I1122 11:10:57.175346 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:10:57 crc kubenswrapper[4743]: E1122 11:10:57.176306 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:11:08 crc kubenswrapper[4743]: I1122 11:11:08.152607 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:11:08 crc kubenswrapper[4743]: E1122 11:11:08.153723 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:11:09 crc kubenswrapper[4743]: I1122 11:11:09.029194 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-pg7xr_6ecdcc2c-1d03-46ec-96e6-da1e04437140/control-plane-machine-set-operator/0.log" Nov 22 11:11:09 crc kubenswrapper[4743]: I1122 11:11:09.253457 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m8pkf_506d451b-5cf3-44fe-be73-9d43abbbf9a8/kube-rbac-proxy/0.log" Nov 22 11:11:09 crc kubenswrapper[4743]: I1122 11:11:09.335502 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-m8pkf_506d451b-5cf3-44fe-be73-9d43abbbf9a8/machine-api-operator/0.log" Nov 22 11:11:23 crc kubenswrapper[4743]: I1122 11:11:23.153554 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:11:23 crc kubenswrapper[4743]: E1122 11:11:23.155089 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:11:23 crc kubenswrapper[4743]: I1122 11:11:23.768141 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-lx25w_c0e9aa74-1d53-4cc4-b5d9-04cb76cb7520/cert-manager-controller/0.log" Nov 22 11:11:23 crc kubenswrapper[4743]: I1122 11:11:23.995998 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-m7nsn_dd2cf53a-ea0f-4ebd-a8d8-4a39fe9d73d0/cert-manager-cainjector/0.log" Nov 22 11:11:24 crc kubenswrapper[4743]: I1122 11:11:24.028450 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-drxhj_3da5b450-d97d-45e1-9b46-91733e107f14/cert-manager-webhook/0.log" Nov 22 11:11:36 crc kubenswrapper[4743]: I1122 11:11:36.151959 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:11:36 crc kubenswrapper[4743]: E1122 11:11:36.152754 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:11:38 crc kubenswrapper[4743]: I1122 11:11:38.210146 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5874bd7bc5-ggvvt_79b6d2ec-a4d8-4c91-8f86-aed66745f48b/nmstate-console-plugin/0.log" Nov 22 11:11:38 crc kubenswrapper[4743]: I1122 11:11:38.423978 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-ntztm_b5589372-866d-4842-ad30-fdb503b25d3a/kube-rbac-proxy/0.log" Nov 22 11:11:38 crc kubenswrapper[4743]: I1122 11:11:38.431118 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-nhd9z_9919df2d-511a-481f-9506-039359ecbfb1/nmstate-handler/0.log" Nov 22 11:11:38 crc kubenswrapper[4743]: I1122 11:11:38.450918 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-5dcf9c57c5-ntztm_b5589372-866d-4842-ad30-fdb503b25d3a/nmstate-metrics/0.log" Nov 22 11:11:38 crc kubenswrapper[4743]: I1122 11:11:38.610885 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-557fdffb88-rzpwt_f8fe518d-0109-44b8-84a8-7f8d285abb8d/nmstate-operator/0.log" Nov 22 11:11:38 crc kubenswrapper[4743]: I1122 11:11:38.666404 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6b89b748d8-bj7bk_48065f06-9619-4a08-a9a5-c50269da8fbe/nmstate-webhook/0.log" Nov 22 11:11:49 crc kubenswrapper[4743]: I1122 11:11:49.152086 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:11:49 crc kubenswrapper[4743]: E1122 11:11:49.152743 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.013961 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-m9fwk_ec8608f7-e718-49d1-bdba-00dcdb9805b2/kube-rbac-proxy/0.log" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.332532 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-frr-files/0.log" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.448218 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6c7b4b5f48-m9fwk_ec8608f7-e718-49d1-bdba-00dcdb9805b2/controller/0.log" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.485287 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-frr-files/0.log" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.505791 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-reloader/0.log" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.526392 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-metrics/0.log" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.657785 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-reloader/0.log" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.827537 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-frr-files/0.log" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.843288 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-metrics/0.log" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.860374 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-reloader/0.log" Nov 22 11:11:54 crc kubenswrapper[4743]: I1122 11:11:54.885357 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-metrics/0.log" Nov 22 11:11:55 crc kubenswrapper[4743]: I1122 11:11:55.081691 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-reloader/0.log" Nov 22 11:11:55 crc kubenswrapper[4743]: I1122 11:11:55.089395 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-frr-files/0.log" Nov 22 11:11:55 crc kubenswrapper[4743]: I1122 11:11:55.110149 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/controller/0.log" Nov 22 11:11:55 crc kubenswrapper[4743]: I1122 11:11:55.131411 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/cp-metrics/0.log" Nov 22 11:11:55 crc kubenswrapper[4743]: I1122 11:11:55.325214 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/kube-rbac-proxy/0.log" Nov 22 11:11:55 crc kubenswrapper[4743]: I1122 11:11:55.355983 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/frr-metrics/0.log" Nov 22 11:11:55 crc kubenswrapper[4743]: I1122 11:11:55.412105 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/kube-rbac-proxy-frr/0.log" Nov 22 11:11:55 crc kubenswrapper[4743]: I1122 11:11:55.661143 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-6998585d5-5l6gn_8b6ebac3-81ab-499b-bfcf-89e3416072c2/frr-k8s-webhook-server/0.log" Nov 22 11:11:55 crc kubenswrapper[4743]: I1122 11:11:55.665021 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/reloader/0.log" Nov 22 11:11:55 crc kubenswrapper[4743]: I1122 11:11:55.950262 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-66fbf9c95c-xnt8j_8406cff2-721d-4c2b-90e4-343769c8ae38/manager/0.log" Nov 22 11:11:56 crc kubenswrapper[4743]: I1122 11:11:56.260135 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7ff5869b9c-b7qc4_3a874228-69b2-4dd8-b768-3b21fa2c45d8/webhook-server/0.log" Nov 22 11:11:56 crc kubenswrapper[4743]: I1122 11:11:56.379505 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hkg92_683dee48-c9d8-42c2-a0d4-8776fcf48a01/kube-rbac-proxy/0.log" Nov 22 11:11:57 crc kubenswrapper[4743]: I1122 11:11:57.563317 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-hkg92_683dee48-c9d8-42c2-a0d4-8776fcf48a01/speaker/0.log" Nov 22 11:11:59 crc kubenswrapper[4743]: I1122 11:11:59.097015 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g4q9l_c019d9ca-5ddf-4c98-b5f1-c425686a58d4/frr/0.log" Nov 22 11:12:00 crc kubenswrapper[4743]: I1122 11:12:00.151261 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:12:00 crc kubenswrapper[4743]: E1122 11:12:00.151961 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:12:11 crc kubenswrapper[4743]: I1122 11:12:11.153936 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:12:11 crc kubenswrapper[4743]: E1122 11:12:11.155387 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:12:12 crc kubenswrapper[4743]: I1122 11:12:12.001926 4743 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4_b67f1857-71d8-47f0-bee7-d03162f14ef0/util/0.log" Nov 22 11:12:12 crc kubenswrapper[4743]: I1122 11:12:12.788658 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4_b67f1857-71d8-47f0-bee7-d03162f14ef0/util/0.log" Nov 22 11:12:12 crc kubenswrapper[4743]: I1122 11:12:12.826191 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4_b67f1857-71d8-47f0-bee7-d03162f14ef0/pull/0.log" Nov 22 11:12:12 crc kubenswrapper[4743]: I1122 11:12:12.878536 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4_b67f1857-71d8-47f0-bee7-d03162f14ef0/pull/0.log" Nov 22 11:12:12 crc kubenswrapper[4743]: I1122 11:12:12.993472 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4_b67f1857-71d8-47f0-bee7-d03162f14ef0/util/0.log" Nov 22 11:12:13 crc kubenswrapper[4743]: I1122 11:12:13.015342 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4_b67f1857-71d8-47f0-bee7-d03162f14ef0/pull/0.log" Nov 22 11:12:13 crc kubenswrapper[4743]: I1122 11:12:13.037243 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931a665q4_b67f1857-71d8-47f0-bee7-d03162f14ef0/extract/0.log" Nov 22 11:12:13 crc kubenswrapper[4743]: I1122 11:12:13.189803 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z_1b12b44d-49bb-4965-bec8-c49868b581c8/util/0.log" Nov 22 11:12:13 crc kubenswrapper[4743]: I1122 11:12:13.413337 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z_1b12b44d-49bb-4965-bec8-c49868b581c8/util/0.log" Nov 22 11:12:13 crc kubenswrapper[4743]: I1122 11:12:13.414460 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z_1b12b44d-49bb-4965-bec8-c49868b581c8/pull/0.log" Nov 22 11:12:13 crc kubenswrapper[4743]: I1122 11:12:13.438682 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z_1b12b44d-49bb-4965-bec8-c49868b581c8/pull/0.log" Nov 22 11:12:13 crc kubenswrapper[4743]: I1122 11:12:13.585167 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z_1b12b44d-49bb-4965-bec8-c49868b581c8/util/0.log" Nov 22 11:12:13 crc kubenswrapper[4743]: I1122 11:12:13.658110 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z_1b12b44d-49bb-4965-bec8-c49868b581c8/extract/0.log" Nov 22 11:12:13 crc kubenswrapper[4743]: I1122 11:12:13.668214 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5c796334424b8139919e908729ac8fe5c1f6e7b6bc33540f00b4f8772e4sd7z_1b12b44d-49bb-4965-bec8-c49868b581c8/pull/0.log" Nov 22 11:12:13 crc kubenswrapper[4743]: I1122 11:12:13.824016 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8_a0444a53-315b-4e96-852f-5a5db7824935/util/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.018433 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8_a0444a53-315b-4e96-852f-5a5db7824935/pull/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.018598 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8_a0444a53-315b-4e96-852f-5a5db7824935/pull/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.018839 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8_a0444a53-315b-4e96-852f-5a5db7824935/util/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.228796 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8_a0444a53-315b-4e96-852f-5a5db7824935/util/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.230193 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8_a0444a53-315b-4e96-852f-5a5db7824935/extract/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.319429 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210zt8c8_a0444a53-315b-4e96-852f-5a5db7824935/pull/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.427838 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jbq5n_0fc02866-fa76-46c8-9213-6c879aad1284/extract-utilities/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.607973 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jbq5n_0fc02866-fa76-46c8-9213-6c879aad1284/extract-content/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.682759 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jbq5n_0fc02866-fa76-46c8-9213-6c879aad1284/extract-utilities/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.685273 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jbq5n_0fc02866-fa76-46c8-9213-6c879aad1284/extract-content/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.824948 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jbq5n_0fc02866-fa76-46c8-9213-6c879aad1284/extract-content/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.861952 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-jbq5n_0fc02866-fa76-46c8-9213-6c879aad1284/extract-utilities/0.log" Nov 22 11:12:14 crc kubenswrapper[4743]: I1122 11:12:14.929735 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-m5ksl_74807c56-d30f-4fbd-b0ac-44c792f32b99/extract-utilities/0.log" Nov 22 11:12:15 crc kubenswrapper[4743]: I1122 11:12:15.155963 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5ksl_74807c56-d30f-4fbd-b0ac-44c792f32b99/extract-content/0.log" Nov 22 11:12:15 crc kubenswrapper[4743]: I1122 11:12:15.201867 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5ksl_74807c56-d30f-4fbd-b0ac-44c792f32b99/extract-utilities/0.log" Nov 22 11:12:15 crc kubenswrapper[4743]: I1122 11:12:15.258483 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5ksl_74807c56-d30f-4fbd-b0ac-44c792f32b99/extract-content/0.log" Nov 22 11:12:15 crc kubenswrapper[4743]: I1122 11:12:15.468780 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5ksl_74807c56-d30f-4fbd-b0ac-44c792f32b99/extract-utilities/0.log" Nov 22 11:12:15 crc kubenswrapper[4743]: I1122 11:12:15.560515 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5ksl_74807c56-d30f-4fbd-b0ac-44c792f32b99/extract-content/0.log" Nov 22 11:12:15 crc kubenswrapper[4743]: I1122 11:12:15.789287 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5_866ceb06-9d22-46bb-aa63-73c7f2f2e3fb/util/0.log" Nov 22 11:12:15 crc kubenswrapper[4743]: I1122 11:12:15.971141 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5_866ceb06-9d22-46bb-aa63-73c7f2f2e3fb/pull/0.log" Nov 22 11:12:16 crc kubenswrapper[4743]: I1122 11:12:16.047160 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5_866ceb06-9d22-46bb-aa63-73c7f2f2e3fb/util/0.log" Nov 22 11:12:16 crc kubenswrapper[4743]: I1122 11:12:16.065712 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5_866ceb06-9d22-46bb-aa63-73c7f2f2e3fb/pull/0.log" Nov 22 11:12:16 crc kubenswrapper[4743]: I1122 11:12:16.261391 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5_866ceb06-9d22-46bb-aa63-73c7f2f2e3fb/util/0.log" Nov 22 11:12:16 crc kubenswrapper[4743]: I1122 11:12:16.342052 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5_866ceb06-9d22-46bb-aa63-73c7f2f2e3fb/pull/0.log" Nov 22 11:12:16 crc kubenswrapper[4743]: I1122 11:12:16.351251 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e8527aae5664f20f24bf3bbb3fd2981ba838928a8a47ce599ee258e4c64rzz5_866ceb06-9d22-46bb-aa63-73c7f2f2e3fb/extract/0.log" Nov 22 11:12:16 crc kubenswrapper[4743]: I1122 11:12:16.640165 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-5zpxd_87ba7bcd-5643-4c3a-a351-554d57e3c8a0/marketplace-operator/0.log" Nov 22 11:12:16 crc kubenswrapper[4743]: I1122 11:12:16.681502 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-jbq5n_0fc02866-fa76-46c8-9213-6c879aad1284/registry-server/0.log" Nov 22 11:12:16 crc kubenswrapper[4743]: I1122 11:12:16.778312 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mpsmp_00fed59e-401b-4b13-b307-44e90ae76dce/extract-utilities/0.log" Nov 22 11:12:16 crc kubenswrapper[4743]: I1122 11:12:16.991955 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mpsmp_00fed59e-401b-4b13-b307-44e90ae76dce/extract-utilities/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.006117 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mpsmp_00fed59e-401b-4b13-b307-44e90ae76dce/extract-content/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.065688 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mpsmp_00fed59e-401b-4b13-b307-44e90ae76dce/extract-content/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.182849 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m5ksl_74807c56-d30f-4fbd-b0ac-44c792f32b99/registry-server/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.216155 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mpsmp_00fed59e-401b-4b13-b307-44e90ae76dce/extract-utilities/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.226227 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mpsmp_00fed59e-401b-4b13-b307-44e90ae76dce/extract-content/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.406954 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hbk4w_69e2b63b-9379-47e8-92c8-991b9599c53c/extract-utilities/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.653128 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-mpsmp_00fed59e-401b-4b13-b307-44e90ae76dce/registry-server/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.657184 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hbk4w_69e2b63b-9379-47e8-92c8-991b9599c53c/extract-utilities/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.687985 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hbk4w_69e2b63b-9379-47e8-92c8-991b9599c53c/extract-content/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.694264 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hbk4w_69e2b63b-9379-47e8-92c8-991b9599c53c/extract-content/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.813285 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hbk4w_69e2b63b-9379-47e8-92c8-991b9599c53c/extract-utilities/0.log" Nov 22 11:12:17 crc kubenswrapper[4743]: I1122 11:12:17.846995 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-hbk4w_69e2b63b-9379-47e8-92c8-991b9599c53c/extract-content/0.log" Nov 22 11:12:19 crc kubenswrapper[4743]: I1122 11:12:19.018802 4743 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-hbk4w_69e2b63b-9379-47e8-92c8-991b9599c53c/registry-server/0.log" Nov 22 11:12:22 crc kubenswrapper[4743]: I1122 11:12:22.153066 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:12:22 crc kubenswrapper[4743]: E1122 11:12:22.154195 4743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xk98p_openshift-machine-config-operator(bae39197-d188-40a8-880d-0d2e6e528f86)\"" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" Nov 22 11:12:32 crc kubenswrapper[4743]: I1122 11:12:32.925235 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-7htfs_518037c9-5978-4eae-bca7-bf63f9dc5b50/prometheus-operator/0.log" Nov 22 11:12:33 crc kubenswrapper[4743]: I1122 11:12:33.131864 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6b4554bdb9-vnzdf_9edf5ca3-00e3-4c62-9ec7-2f67630e4bdf/prometheus-operator-admission-webhook/0.log" Nov 22 11:12:33 crc kubenswrapper[4743]: I1122 11:12:33.189763 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-6b4554bdb9-wk7ht_4a5a5d7d-7a96-472e-b6e1-a3652a73a8b1/prometheus-operator-admission-webhook/0.log" Nov 22 11:12:33 crc kubenswrapper[4743]: I1122 11:12:33.386364 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-mm5nj_86470a8e-5cc0-4f60-9cf7-f9675599a769/operator/0.log" Nov 22 11:12:33 crc kubenswrapper[4743]: I1122 11:12:33.402503 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-gdwmk_1a0a8704-e5ab-4bed-bdd6-bb291e7823e5/perses-operator/0.log" Nov 22 11:12:34 crc kubenswrapper[4743]: I1122 11:12:34.152668 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1" Nov 22 11:12:34 crc kubenswrapper[4743]: I1122 11:12:34.609136 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"27748cf79ea2c8b0ea7f969b56f59641cbd478902f21a0b8e0f088daf78630c6"} Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.461134 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2tp52"] Nov 22 11:13:03 crc kubenswrapper[4743]: E1122 11:13:03.462309 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523f648f-43d2-4e73-81ff-e87f41f07ab1" containerName="registry-server" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.462325 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="523f648f-43d2-4e73-81ff-e87f41f07ab1" containerName="registry-server" Nov 22 11:13:03 crc kubenswrapper[4743]: E1122 11:13:03.462350 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523f648f-43d2-4e73-81ff-e87f41f07ab1" containerName="extract-content" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.462356 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="523f648f-43d2-4e73-81ff-e87f41f07ab1" 
containerName="extract-content" Nov 22 11:13:03 crc kubenswrapper[4743]: E1122 11:13:03.462379 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523f648f-43d2-4e73-81ff-e87f41f07ab1" containerName="extract-utilities" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.462385 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="523f648f-43d2-4e73-81ff-e87f41f07ab1" containerName="extract-utilities" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.462707 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="523f648f-43d2-4e73-81ff-e87f41f07ab1" containerName="registry-server" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.464853 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.475288 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tp52"] Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.558147 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-utilities\") pod \"redhat-marketplace-2tp52\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") " pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.558274 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-catalog-content\") pod \"redhat-marketplace-2tp52\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") " pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.558927 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fws44\" (UniqueName: \"kubernetes.io/projected/761fab6a-fb58-4deb-9de3-8942ac14212b-kube-api-access-fws44\") pod \"redhat-marketplace-2tp52\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") " pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.661472 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-catalog-content\") pod \"redhat-marketplace-2tp52\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") " pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.661671 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fws44\" (UniqueName: \"kubernetes.io/projected/761fab6a-fb58-4deb-9de3-8942ac14212b-kube-api-access-fws44\") pod \"redhat-marketplace-2tp52\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") " pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.661738 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-utilities\") pod \"redhat-marketplace-2tp52\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") " pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.662354 4743 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-catalog-content\") pod \"redhat-marketplace-2tp52\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") " pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.662393 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-utilities\") pod \"redhat-marketplace-2tp52\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") " pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.682399 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fws44\" (UniqueName: \"kubernetes.io/projected/761fab6a-fb58-4deb-9de3-8942ac14212b-kube-api-access-fws44\") pod \"redhat-marketplace-2tp52\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") " pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:03 crc kubenswrapper[4743]: I1122 11:13:03.788676 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:04 crc kubenswrapper[4743]: I1122 11:13:04.503428 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tp52"] Nov 22 11:13:05 crc kubenswrapper[4743]: I1122 11:13:05.064820 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tp52" event={"ID":"761fab6a-fb58-4deb-9de3-8942ac14212b","Type":"ContainerStarted","Data":"8efe4f50edbf471aa1e16fa7483641baddfce7c0574373e761e43d2396fa1082"} Nov 22 11:13:06 crc kubenswrapper[4743]: I1122 11:13:06.077188 4743 generic.go:334] "Generic (PLEG): container finished" podID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerID="d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82" exitCode=0 Nov 22 11:13:06 crc kubenswrapper[4743]: I1122 11:13:06.077394 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tp52" event={"ID":"761fab6a-fb58-4deb-9de3-8942ac14212b","Type":"ContainerDied","Data":"d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82"} Nov 22 11:13:06 crc kubenswrapper[4743]: I1122 11:13:06.080031 4743 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Nov 22 11:13:07 crc kubenswrapper[4743]: I1122 11:13:07.094412 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tp52" event={"ID":"761fab6a-fb58-4deb-9de3-8942ac14212b","Type":"ContainerStarted","Data":"0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f"} Nov 22 11:13:08 crc kubenswrapper[4743]: I1122 11:13:08.108507 4743 generic.go:334] "Generic (PLEG): container finished" podID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerID="0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f" exitCode=0 Nov 22 11:13:08 crc kubenswrapper[4743]: I1122 11:13:08.108608 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tp52" event={"ID":"761fab6a-fb58-4deb-9de3-8942ac14212b","Type":"ContainerDied","Data":"0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f"} Nov 22 11:13:09 crc kubenswrapper[4743]: I1122 11:13:09.120895 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tp52" 
event={"ID":"761fab6a-fb58-4deb-9de3-8942ac14212b","Type":"ContainerStarted","Data":"112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f"} Nov 22 11:13:09 crc kubenswrapper[4743]: I1122 11:13:09.146015 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2tp52" podStartSLOduration=3.703666113 podStartE2EDuration="6.145997174s" podCreationTimestamp="2025-11-22 11:13:03 +0000 UTC" firstStartedPulling="2025-11-22 11:13:06.079772568 +0000 UTC m=+10259.786133620" lastFinishedPulling="2025-11-22 11:13:08.522103629 +0000 UTC m=+10262.228464681" observedRunningTime="2025-11-22 11:13:09.144472741 +0000 UTC m=+10262.850833793" watchObservedRunningTime="2025-11-22 11:13:09.145997174 +0000 UTC m=+10262.852358226" Nov 22 11:13:13 crc kubenswrapper[4743]: I1122 11:13:13.789260 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:13 crc kubenswrapper[4743]: I1122 11:13:13.789776 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:13 crc kubenswrapper[4743]: I1122 11:13:13.857228 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:14 crc kubenswrapper[4743]: I1122 11:13:14.228107 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2tp52" Nov 22 11:13:15 crc kubenswrapper[4743]: I1122 11:13:15.443445 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tp52"] Nov 22 11:13:16 crc kubenswrapper[4743]: I1122 11:13:16.203627 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2tp52" podUID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerName="registry-server" containerID="cri-o://112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f" gracePeriod=2 Nov 22 11:13:16 crc kubenswrapper[4743]: I1122 11:13:16.942320 4743 util.go:48] "No ready sandbox for pod can be found. 
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.036063 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-catalog-content\") pod \"761fab6a-fb58-4deb-9de3-8942ac14212b\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") "
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.036302 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-utilities\") pod \"761fab6a-fb58-4deb-9de3-8942ac14212b\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") "
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.036381 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fws44\" (UniqueName: \"kubernetes.io/projected/761fab6a-fb58-4deb-9de3-8942ac14212b-kube-api-access-fws44\") pod \"761fab6a-fb58-4deb-9de3-8942ac14212b\" (UID: \"761fab6a-fb58-4deb-9de3-8942ac14212b\") "
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.038068 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-utilities" (OuterVolumeSpecName: "utilities") pod "761fab6a-fb58-4deb-9de3-8942ac14212b" (UID: "761fab6a-fb58-4deb-9de3-8942ac14212b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.049797 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761fab6a-fb58-4deb-9de3-8942ac14212b-kube-api-access-fws44" (OuterVolumeSpecName: "kube-api-access-fws44") pod "761fab6a-fb58-4deb-9de3-8942ac14212b" (UID: "761fab6a-fb58-4deb-9de3-8942ac14212b"). InnerVolumeSpecName "kube-api-access-fws44". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.064792 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "761fab6a-fb58-4deb-9de3-8942ac14212b" (UID: "761fab6a-fb58-4deb-9de3-8942ac14212b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.139942 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.140010 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/761fab6a-fb58-4deb-9de3-8942ac14212b-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.140025 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fws44\" (UniqueName: \"kubernetes.io/projected/761fab6a-fb58-4deb-9de3-8942ac14212b-kube-api-access-fws44\") on node \"crc\" DevicePath \"\""
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.221436 4743 generic.go:334] "Generic (PLEG): container finished" podID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerID="112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f" exitCode=0
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.221498 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2tp52"
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.221507 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tp52" event={"ID":"761fab6a-fb58-4deb-9de3-8942ac14212b","Type":"ContainerDied","Data":"112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f"}
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.221557 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2tp52" event={"ID":"761fab6a-fb58-4deb-9de3-8942ac14212b","Type":"ContainerDied","Data":"8efe4f50edbf471aa1e16fa7483641baddfce7c0574373e761e43d2396fa1082"}
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.221591 4743 scope.go:117] "RemoveContainer" containerID="112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f"
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.249519 4743 scope.go:117] "RemoveContainer" containerID="0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f"
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.256761 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tp52"]
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.265709 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2tp52"]
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.270876 4743 scope.go:117] "RemoveContainer" containerID="d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82"
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.322212 4743 scope.go:117] "RemoveContainer" containerID="112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f"
Nov 22 11:13:17 crc kubenswrapper[4743]: E1122 11:13:17.322618 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f\": container with ID starting with 112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f not found: ID does not exist" containerID="112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f"
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.322673 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f"} err="failed to get container status \"112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f\": rpc error: code = NotFound desc = could not find container \"112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f\": container with ID starting with 112628d2096cad2c933fbbf93a93f1fa0691642022c6c3ceefb46e6c7153029f not found: ID does not exist"
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.322710 4743 scope.go:117] "RemoveContainer" containerID="0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f"
Nov 22 11:13:17 crc kubenswrapper[4743]: E1122 11:13:17.323000 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f\": container with ID starting with 0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f not found: ID does not exist" containerID="0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f"
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.323024 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f"} err="failed to get container status \"0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f\": rpc error: code = NotFound desc = could not find container \"0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f\": container with ID starting with 0af2e9d0ac5ddbe37294780551e0a0c057d0249157ad6e7a05b430b956f4ae3f not found: ID does not exist"
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.323041 4743 scope.go:117] "RemoveContainer" containerID="d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82"
Nov 22 11:13:17 crc kubenswrapper[4743]: E1122 11:13:17.323265 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82\": container with ID starting with d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82 not found: ID does not exist" containerID="d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82"
Nov 22 11:13:17 crc kubenswrapper[4743]: I1122 11:13:17.323288 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82"} err="failed to get container status \"d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82\": rpc error: code = NotFound desc = could not find container \"d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82\": container with ID starting with d788e522d927eaf0660fb260ec507be03f01679986045cfd91f88b821a952a82 not found: ID does not exist"
Nov 22 11:13:19 crc kubenswrapper[4743]: I1122 11:13:19.163756 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761fab6a-fb58-4deb-9de3-8942ac14212b" path="/var/lib/kubelet/pods/761fab6a-fb58-4deb-9de3-8942ac14212b/volumes"
Nov 22 11:14:52 crc kubenswrapper[4743]: I1122 11:14:52.377610 4743 generic.go:334] "Generic (PLEG): container finished" podID="dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc" containerID="252ef331d1b8566e4586e1fb29bba1780dbe652f9faa375289fc3c49d86ace3a" exitCode=0
Nov 22 11:14:52 crc kubenswrapper[4743]: I1122 11:14:52.377674 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-psx7f/must-gather-bwg79" event={"ID":"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc","Type":"ContainerDied","Data":"252ef331d1b8566e4586e1fb29bba1780dbe652f9faa375289fc3c49d86ace3a"}
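
Note: the E1122 "ContainerStatus from runtime service failed" / "DeleteContainer returned error" pairs above are benign. By the time scope.go retries RemoveContainer, CRI-O has already removed the container, so ContainerStatus returns gRPC NotFound and the deletor merely logs it. The usual way to keep such deletes idempotent is to treat NotFound as success; a minimal sketch of that pattern (removeIfPresent and the stub remover are illustrative, not kubelet code):

    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // removeIfPresent keeps container deletion idempotent: NotFound from the
    // runtime means the container is already gone, which is the success case.
    func removeIfPresent(remove func(id string) error, id string) error {
    	err := remove(id)
    	if status.Code(err) == codes.NotFound {
    		return nil // already deleted, as in the log entries above
    	}
    	return err
    }

    func main() {
    	stub := func(string) error {
    		return status.Error(codes.NotFound, "could not find container")
    	}
    	fmt.Println(removeIfPresent(stub, "112628d2...")) // <nil>
    }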
Nov 22 11:14:52 crc kubenswrapper[4743]: I1122 11:14:52.379011 4743 scope.go:117] "RemoveContainer" containerID="252ef331d1b8566e4586e1fb29bba1780dbe652f9faa375289fc3c49d86ace3a"
Nov 22 11:14:53 crc kubenswrapper[4743]: I1122 11:14:53.318128 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-psx7f_must-gather-bwg79_dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc/gather/0.log"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.528524 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-88xm9"]
Nov 22 11:14:57 crc kubenswrapper[4743]: E1122 11:14:57.530758 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerName="extract-content"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.530789 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerName="extract-content"
Nov 22 11:14:57 crc kubenswrapper[4743]: E1122 11:14:57.530819 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerName="extract-utilities"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.530831 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerName="extract-utilities"
Nov 22 11:14:57 crc kubenswrapper[4743]: E1122 11:14:57.530855 4743 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerName="registry-server"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.530867 4743 state_mem.go:107] "Deleted CPUSet assignment" podUID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerName="registry-server"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.531336 4743 memory_manager.go:354] "RemoveStaleState removing state" podUID="761fab6a-fb58-4deb-9de3-8942ac14212b" containerName="registry-server"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.534964 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-88xm9"
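
Note: the cpu_manager and memory_manager RemoveStaleState entries fire on the next pod admission (certified-operators-88xm9) and drop resource-manager state still keyed to the deleted redhat-marketplace pod's containers; they are logged at error level but are routine housekeeping. A toy version of that sweep, assuming a simple map of per-container assignments (types and names are illustrative, not the real cpu_manager state):

    package main

    import "fmt"

    // pruneStale drops per-container assignments whose pod is no longer
    // active, in the spirit of the RemoveStaleState entries above.
    func pruneStale(assignments map[string]map[string]string, activePods map[string]bool) {
    	for podUID, containers := range assignments {
    		if activePods[podUID] {
    			continue
    		}
    		for name := range containers {
    			fmt.Printf("RemoveStaleState: removing container podUID=%s containerName=%s\n", podUID, name)
    		}
    		delete(assignments, podUID)
    	}
    }

    func main() {
    	state := map[string]map[string]string{
    		// stale entries left behind by the deleted redhat-marketplace pod
    		"761fab6a-fb58-4deb-9de3-8942ac14212b": {
    			"extract-content": "", "extract-utilities": "", "registry-server": "",
    		},
    	}
    	pruneStale(state, map[string]bool{"ad78b1b6-053e-4756-81a0-7a7b636c020b": true})
    }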
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.547377 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-88xm9"]
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.653028 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5knb\" (UniqueName: \"kubernetes.io/projected/ad78b1b6-053e-4756-81a0-7a7b636c020b-kube-api-access-b5knb\") pod \"certified-operators-88xm9\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") " pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.653214 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-catalog-content\") pod \"certified-operators-88xm9\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") " pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.654419 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-utilities\") pod \"certified-operators-88xm9\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") " pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.756915 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-catalog-content\") pod \"certified-operators-88xm9\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") " pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.757049 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-utilities\") pod \"certified-operators-88xm9\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") " pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.757115 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5knb\" (UniqueName: \"kubernetes.io/projected/ad78b1b6-053e-4756-81a0-7a7b636c020b-kube-api-access-b5knb\") pod \"certified-operators-88xm9\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") " pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.757599 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-catalog-content\") pod \"certified-operators-88xm9\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") " pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.757682 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-utilities\") pod \"certified-operators-88xm9\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") " pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.779167 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5knb\" (UniqueName: \"kubernetes.io/projected/ad78b1b6-053e-4756-81a0-7a7b636c020b-kube-api-access-b5knb\") pod \"certified-operators-88xm9\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") " pod="openshift-marketplace/certified-operators-88xm9"
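
Note: volume setup above is the kubelet's two-phase reconcile: reconciler_common first confirms each volume is attached (VerifyControllerAttachedVolume), then issues MountVolume for anything in the desired state that is not yet in the actual state, and operation_generator reports SetUp succeeded. A toy desired-versus-actual pass in that spirit (a sketch only; the real logic lives in the kubelet volume manager):

    package main

    import "fmt"

    // reconcile mounts anything desired that is not yet mounted: a verify
    // step first, then the mount operation.
    func reconcile(desired []string, mounted map[string]bool) {
    	for _, vol := range desired {
    		fmt.Println("VerifyControllerAttachedVolume:", vol)
    		if !mounted[vol] {
    			fmt.Println("MountVolume started:", vol)
    			mounted[vol] = true // stands in for MountVolume.SetUp succeeded
    		}
    	}
    }

    func main() {
    	reconcile(
    		[]string{"catalog-content", "utilities", "kube-api-access-b5knb"},
    		map[string]bool{},
    	)
    }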
"MountVolume.SetUp succeeded for volume \"kube-api-access-b5knb\" (UniqueName: \"kubernetes.io/projected/ad78b1b6-053e-4756-81a0-7a7b636c020b-kube-api-access-b5knb\") pod \"certified-operators-88xm9\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") " pod="openshift-marketplace/certified-operators-88xm9" Nov 22 11:14:57 crc kubenswrapper[4743]: I1122 11:14:57.869268 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-88xm9" Nov 22 11:14:58 crc kubenswrapper[4743]: I1122 11:14:58.564919 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-88xm9"] Nov 22 11:14:59 crc kubenswrapper[4743]: I1122 11:14:59.459452 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88xm9" event={"ID":"ad78b1b6-053e-4756-81a0-7a7b636c020b","Type":"ContainerDied","Data":"a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2"} Nov 22 11:14:59 crc kubenswrapper[4743]: I1122 11:14:59.459274 4743 generic.go:334] "Generic (PLEG): container finished" podID="ad78b1b6-053e-4756-81a0-7a7b636c020b" containerID="a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2" exitCode=0 Nov 22 11:14:59 crc kubenswrapper[4743]: I1122 11:14:59.462956 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88xm9" event={"ID":"ad78b1b6-053e-4756-81a0-7a7b636c020b","Type":"ContainerStarted","Data":"ce416b6684ccf3d096b9f6239adb19a39de7206851ecef2a4716c53de2e91550"} Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.167616 4743 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg"] Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.169977 4743 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.172075 4743 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.173244 4743 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.181072 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg"] Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.332312 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-config-volume\") pod \"collect-profiles-29396835-85tvg\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.332554 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62cff\" (UniqueName: \"kubernetes.io/projected/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-kube-api-access-62cff\") pod \"collect-profiles-29396835-85tvg\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.333175 4743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-secret-volume\") pod \"collect-profiles-29396835-85tvg\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.435065 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-secret-volume\") pod \"collect-profiles-29396835-85tvg\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.435122 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-config-volume\") pod \"collect-profiles-29396835-85tvg\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.435291 4743 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62cff\" (UniqueName: \"kubernetes.io/projected/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-kube-api-access-62cff\") pod \"collect-profiles-29396835-85tvg\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.436713 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-config-volume\") pod 
\"collect-profiles-29396835-85tvg\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.443432 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-secret-volume\") pod \"collect-profiles-29396835-85tvg\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.453817 4743 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62cff\" (UniqueName: \"kubernetes.io/projected/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-kube-api-access-62cff\") pod \"collect-profiles-29396835-85tvg\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.498837 4743 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" Nov 22 11:15:00 crc kubenswrapper[4743]: I1122 11:15:00.984925 4743 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg"] Nov 22 11:15:01 crc kubenswrapper[4743]: I1122 11:15:01.241959 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Nov 22 11:15:01 crc kubenswrapper[4743]: I1122 11:15:01.242069 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Nov 22 11:15:01 crc kubenswrapper[4743]: I1122 11:15:01.489187 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88xm9" event={"ID":"ad78b1b6-053e-4756-81a0-7a7b636c020b","Type":"ContainerStarted","Data":"a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e"} Nov 22 11:15:01 crc kubenswrapper[4743]: I1122 11:15:01.494399 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" event={"ID":"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40","Type":"ContainerStarted","Data":"8c07375d0aa3418ed69b474220dbacfa11010193d374f8973dcae26625afa6aa"} Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.089588 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-psx7f/must-gather-bwg79"] Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.089882 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-psx7f/must-gather-bwg79" podUID="dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc" containerName="copy" containerID="cri-o://092289b25c3b4d2a54a25663c85a380924a3967c784e224d2a11a02f7f135a29" gracePeriod=2 Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.102723 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-psx7f/must-gather-bwg79"] Nov 22 11:15:02 crc 
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.518714 4743 generic.go:334] "Generic (PLEG): container finished" podID="dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc" containerID="092289b25c3b4d2a54a25663c85a380924a3967c784e224d2a11a02f7f135a29" exitCode=143
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.521329 4743 generic.go:334] "Generic (PLEG): container finished" podID="ad78b1b6-053e-4756-81a0-7a7b636c020b" containerID="a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e" exitCode=0
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.521368 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88xm9" event={"ID":"ad78b1b6-053e-4756-81a0-7a7b636c020b","Type":"ContainerDied","Data":"a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e"}
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.524249 4743 generic.go:334] "Generic (PLEG): container finished" podID="28e709bc-5dc7-40a1-9468-8a6e7dcfaa40" containerID="aad31d87734f503fe4971d45e7ef2577a23081266a8357c14752617b04694da1" exitCode=0
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.524290 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" event={"ID":"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40","Type":"ContainerDied","Data":"aad31d87734f503fe4971d45e7ef2577a23081266a8357c14752617b04694da1"}
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.681315 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-psx7f_must-gather-bwg79_dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc/copy/0.log"
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.682016 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/must-gather-bwg79"
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.797649 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-must-gather-output\") pod \"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc\" (UID: \"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc\") "
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.798366 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fj889\" (UniqueName: \"kubernetes.io/projected/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-kube-api-access-fj889\") pod \"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc\" (UID: \"dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc\") "
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.807485 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-kube-api-access-fj889" (OuterVolumeSpecName: "kube-api-access-fj889") pod "dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc" (UID: "dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc"). InnerVolumeSpecName "kube-api-access-fj889". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 11:15:02 crc kubenswrapper[4743]: I1122 11:15:02.901193 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fj889\" (UniqueName: \"kubernetes.io/projected/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-kube-api-access-fj889\") on node \"crc\" DevicePath \"\""
Nov 22 11:15:03 crc kubenswrapper[4743]: I1122 11:15:03.009062 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc" (UID: "dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 11:15:03 crc kubenswrapper[4743]: I1122 11:15:03.106329 4743 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc-must-gather-output\") on node \"crc\" DevicePath \"\""
Nov 22 11:15:03 crc kubenswrapper[4743]: I1122 11:15:03.166800 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc" path="/var/lib/kubelet/pods/dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc/volumes"
Nov 22 11:15:03 crc kubenswrapper[4743]: I1122 11:15:03.537160 4743 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-psx7f_must-gather-bwg79_dd2d139f-82c2-4a63-97cf-ef3d8c5dacbc/copy/0.log"
Nov 22 11:15:03 crc kubenswrapper[4743]: I1122 11:15:03.539640 4743 scope.go:117] "RemoveContainer" containerID="092289b25c3b4d2a54a25663c85a380924a3967c784e224d2a11a02f7f135a29"
Nov 22 11:15:03 crc kubenswrapper[4743]: I1122 11:15:03.539664 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-psx7f/must-gather-bwg79"
Nov 22 11:15:03 crc kubenswrapper[4743]: I1122 11:15:03.578515 4743 scope.go:117] "RemoveContainer" containerID="252ef331d1b8566e4586e1fb29bba1780dbe652f9faa375289fc3c49d86ace3a"
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.177621 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg"
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.347950 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-config-volume\") pod \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") "
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.348726 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-secret-volume\") pod \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") "
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.348826 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62cff\" (UniqueName: \"kubernetes.io/projected/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-kube-api-access-62cff\") pod \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\" (UID: \"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40\") "
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.349836 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-config-volume" (OuterVolumeSpecName: "config-volume") pod "28e709bc-5dc7-40a1-9468-8a6e7dcfaa40" (UID: "28e709bc-5dc7-40a1-9468-8a6e7dcfaa40"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.350661 4743 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-config-volume\") on node \"crc\" DevicePath \"\""
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.375791 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-kube-api-access-62cff" (OuterVolumeSpecName: "kube-api-access-62cff") pod "28e709bc-5dc7-40a1-9468-8a6e7dcfaa40" (UID: "28e709bc-5dc7-40a1-9468-8a6e7dcfaa40"). InnerVolumeSpecName "kube-api-access-62cff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.378289 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "28e709bc-5dc7-40a1-9468-8a6e7dcfaa40" (UID: "28e709bc-5dc7-40a1-9468-8a6e7dcfaa40"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.453003 4743 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-secret-volume\") on node \"crc\" DevicePath \"\""
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.453061 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62cff\" (UniqueName: \"kubernetes.io/projected/28e709bc-5dc7-40a1-9468-8a6e7dcfaa40-kube-api-access-62cff\") on node \"crc\" DevicePath \"\""
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.553689 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88xm9" event={"ID":"ad78b1b6-053e-4756-81a0-7a7b636c020b","Type":"ContainerStarted","Data":"fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967"}
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.556316 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg" event={"ID":"28e709bc-5dc7-40a1-9468-8a6e7dcfaa40","Type":"ContainerDied","Data":"8c07375d0aa3418ed69b474220dbacfa11010193d374f8973dcae26625afa6aa"}
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.556358 4743 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c07375d0aa3418ed69b474220dbacfa11010193d374f8973dcae26625afa6aa"
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.556418 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29396835-85tvg"
Nov 22 11:15:04 crc kubenswrapper[4743]: I1122 11:15:04.585297 4743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-88xm9" podStartSLOduration=3.511442281 podStartE2EDuration="7.585264662s" podCreationTimestamp="2025-11-22 11:14:57 +0000 UTC" firstStartedPulling="2025-11-22 11:14:59.4637284 +0000 UTC m=+10373.170089452" lastFinishedPulling="2025-11-22 11:15:03.537550781 +0000 UTC m=+10377.243911833" observedRunningTime="2025-11-22 11:15:04.575272957 +0000 UTC m=+10378.281634009" watchObservedRunningTime="2025-11-22 11:15:04.585264662 +0000 UTC m=+10378.291625714"
Nov 22 11:15:05 crc kubenswrapper[4743]: I1122 11:15:05.257735 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8"]
Nov 22 11:15:05 crc kubenswrapper[4743]: I1122 11:15:05.270258 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29396790-cbtl8"]
Nov 22 11:15:07 crc kubenswrapper[4743]: I1122 11:15:07.169229 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f170712f-3df4-4ee8-99dd-308c78dce5f5" path="/var/lib/kubelet/pods/f170712f-3df4-4ee8-99dd-308c78dce5f5/volumes"
Nov 22 11:15:07 crc kubenswrapper[4743]: I1122 11:15:07.870164 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:15:07 crc kubenswrapper[4743]: I1122 11:15:07.870251 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:15:07 crc kubenswrapper[4743]: I1122 11:15:07.919966 4743 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:15:17 crc kubenswrapper[4743]: I1122 11:15:17.920896 4743 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:15:17 crc kubenswrapper[4743]: I1122 11:15:17.974530 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-88xm9"]
Nov 22 11:15:18 crc kubenswrapper[4743]: I1122 11:15:18.710853 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-88xm9" podUID="ad78b1b6-053e-4756-81a0-7a7b636c020b" containerName="registry-server" containerID="cri-o://fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967" gracePeriod=2
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.240506 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.304691 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5knb\" (UniqueName: \"kubernetes.io/projected/ad78b1b6-053e-4756-81a0-7a7b636c020b-kube-api-access-b5knb\") pod \"ad78b1b6-053e-4756-81a0-7a7b636c020b\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") "
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.304772 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-catalog-content\") pod \"ad78b1b6-053e-4756-81a0-7a7b636c020b\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") "
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.304918 4743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-utilities\") pod \"ad78b1b6-053e-4756-81a0-7a7b636c020b\" (UID: \"ad78b1b6-053e-4756-81a0-7a7b636c020b\") "
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.306394 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-utilities" (OuterVolumeSpecName: "utilities") pod "ad78b1b6-053e-4756-81a0-7a7b636c020b" (UID: "ad78b1b6-053e-4756-81a0-7a7b636c020b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.311721 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad78b1b6-053e-4756-81a0-7a7b636c020b-kube-api-access-b5knb" (OuterVolumeSpecName: "kube-api-access-b5knb") pod "ad78b1b6-053e-4756-81a0-7a7b636c020b" (UID: "ad78b1b6-053e-4756-81a0-7a7b636c020b"). InnerVolumeSpecName "kube-api-access-b5knb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.362938 4743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad78b1b6-053e-4756-81a0-7a7b636c020b" (UID: "ad78b1b6-053e-4756-81a0-7a7b636c020b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.407270 4743 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-utilities\") on node \"crc\" DevicePath \"\""
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.407334 4743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5knb\" (UniqueName: \"kubernetes.io/projected/ad78b1b6-053e-4756-81a0-7a7b636c020b-kube-api-access-b5knb\") on node \"crc\" DevicePath \"\""
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.407347 4743 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad78b1b6-053e-4756-81a0-7a7b636c020b-catalog-content\") on node \"crc\" DevicePath \"\""
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.721860 4743 generic.go:334] "Generic (PLEG): container finished" podID="ad78b1b6-053e-4756-81a0-7a7b636c020b" containerID="fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967" exitCode=0
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.721919 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88xm9" event={"ID":"ad78b1b6-053e-4756-81a0-7a7b636c020b","Type":"ContainerDied","Data":"fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967"}
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.721923 4743 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-88xm9"
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.721970 4743 scope.go:117] "RemoveContainer" containerID="fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967"
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.721956 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-88xm9" event={"ID":"ad78b1b6-053e-4756-81a0-7a7b636c020b","Type":"ContainerDied","Data":"ce416b6684ccf3d096b9f6239adb19a39de7206851ecef2a4716c53de2e91550"}
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.759413 4743 scope.go:117] "RemoveContainer" containerID="a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e"
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.780826 4743 scope.go:117] "RemoveContainer" containerID="a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2"
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.789400 4743 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-88xm9"]
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.809780 4743 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-88xm9"]
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.835129 4743 scope.go:117] "RemoveContainer" containerID="fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967"
Nov 22 11:15:19 crc kubenswrapper[4743]: E1122 11:15:19.835559 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967\": container with ID starting with fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967 not found: ID does not exist" containerID="fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967"
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.835600 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967"} err="failed to get container status \"fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967\": rpc error: code = NotFound desc = could not find container \"fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967\": container with ID starting with fa961775d5ef84c56bca38d670e43fd5969c9d247527e1ecc5caa91b0f067967 not found: ID does not exist"
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.835622 4743 scope.go:117] "RemoveContainer" containerID="a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e"
Nov 22 11:15:19 crc kubenswrapper[4743]: E1122 11:15:19.836149 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e\": container with ID starting with a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e not found: ID does not exist" containerID="a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e"
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.836206 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e"} err="failed to get container status \"a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e\": rpc error: code = NotFound desc = could not find container \"a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e\": container with ID starting with a105eb70d055c19875e8a330a844c3cba06c2b3d10fbda263d1abe69540cbe8e not found: ID does not exist"
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.836269 4743 scope.go:117] "RemoveContainer" containerID="a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2"
Nov 22 11:15:19 crc kubenswrapper[4743]: E1122 11:15:19.836640 4743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2\": container with ID starting with a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2 not found: ID does not exist" containerID="a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2"
Nov 22 11:15:19 crc kubenswrapper[4743]: I1122 11:15:19.836664 4743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2"} err="failed to get container status \"a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2\": rpc error: code = NotFound desc = could not find container \"a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2\": container with ID starting with a7549a9cacbae42e4cb8f4999edc4d5b5506d325d4c9a1365d15be02ce5085d2 not found: ID does not exist"
Nov 22 11:15:21 crc kubenswrapper[4743]: I1122 11:15:21.167085 4743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad78b1b6-053e-4756-81a0-7a7b636c020b" path="/var/lib/kubelet/pods/ad78b1b6-053e-4756-81a0-7a7b636c020b/volumes"
Nov 22 11:15:31 crc kubenswrapper[4743]: I1122 11:15:31.242507 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 11:15:31 crc kubenswrapper[4743]: I1122 11:15:31.243479 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 11:15:40 crc kubenswrapper[4743]: I1122 11:15:40.014550 4743 scope.go:117] "RemoveContainer" containerID="2caaa734dbe8b17f8ca0a964de030ed4f89482693a31c43feb9d362afb59a86e"
Nov 22 11:16:01 crc kubenswrapper[4743]: I1122 11:16:01.242028 4743 patch_prober.go:28] interesting pod/machine-config-daemon-xk98p container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Nov 22 11:16:01 crc kubenswrapper[4743]: I1122 11:16:01.242819 4743 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Nov 22 11:16:01 crc kubenswrapper[4743]: I1122 11:16:01.242885 4743 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xk98p"
Nov 22 11:16:01 crc kubenswrapper[4743]: I1122 11:16:01.244052 4743 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27748cf79ea2c8b0ea7f969b56f59641cbd478902f21a0b8e0f088daf78630c6"} pod="openshift-machine-config-operator/machine-config-daemon-xk98p" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Nov 22 11:16:01 crc kubenswrapper[4743]: I1122 11:16:01.244122 4743 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" podUID="bae39197-d188-40a8-880d-0d2e6e528f86" containerName="machine-config-daemon" containerID="cri-o://27748cf79ea2c8b0ea7f969b56f59641cbd478902f21a0b8e0f088daf78630c6" gracePeriod=600
Nov 22 11:16:02 crc kubenswrapper[4743]: I1122 11:16:02.248928 4743 generic.go:334] "Generic (PLEG): container finished" podID="bae39197-d188-40a8-880d-0d2e6e528f86" containerID="27748cf79ea2c8b0ea7f969b56f59641cbd478902f21a0b8e0f088daf78630c6" exitCode=0
Nov 22 11:16:02 crc kubenswrapper[4743]: I1122 11:16:02.248997 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerDied","Data":"27748cf79ea2c8b0ea7f969b56f59641cbd478902f21a0b8e0f088daf78630c6"}
Nov 22 11:16:02 crc kubenswrapper[4743]: I1122 11:16:02.249562 4743 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xk98p" event={"ID":"bae39197-d188-40a8-880d-0d2e6e528f86","Type":"ContainerStarted","Data":"5a0d486876579a5a6715502d2295b4a673c245ff7548cd35a75f60ae62ba20ca"}
Nov 22 11:16:02 crc kubenswrapper[4743]: I1122 11:16:02.249604 4743 scope.go:117] "RemoveContainer" containerID="d89cfa34fc7fe6d3dc2f6c72471b9e7c43bbbf1955e2c5417f5f1926010eefe1"
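
Note: the section ends with the liveness probe finally forcing a restart. Failures are logged at 11:15:01, 11:15:31 and 11:16:01 (a 30s period, consistent with a failure threshold of 3), after which kuberuntime_manager records that machine-config-daemon "will be restarted" and the kill is issued with gracePeriod=600; PLEG then reports the old container dead (exitCode=0) and a replacement started. The grace-period mechanics reduce to signal, wait up to the deadline, then escalate; a sketch of that shape, where a channel stands in for the container's exit and the names are illustrative:

    package main

    import (
    	"fmt"
    	"time"
    )

    // terminate waits up to gracePeriod for the container to exit after the
    // initial TERM, and only escalates to KILL if the deadline expires.
    func terminate(exited <-chan struct{}, gracePeriod time.Duration) {
    	fmt.Println("Killing container with a grace period:", gracePeriod)
    	select {
    	case <-exited:
    		fmt.Println("container exited within the grace period")
    	case <-time.After(gracePeriod):
    		fmt.Println("grace period expired; sending KILL")
    	}
    }

    func main() {
    	exited := make(chan struct{})
    	go func() { time.Sleep(10 * time.Millisecond); close(exited) }()
    	terminate(exited, 600*time.Second)
    }

Here the container exited within one second of the TERM signal, which matches the exitCode=0 seen above rather than a forced kill.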